
  • I don’t keep up with SSD benchmarks, but the mechanism behind this phenomenon isn’t anything mysterious. Most consumer SSDs ship with TLC or QLC NAND, which stores 3 or 4 bits per cell, respectively. However, writing the full 3 or 4 bits is slower than writing just one bit per cell, so drives use available empty NAND as an SLC write cache while the drive has free space. When you write to a relatively empty drive, your data goes into the DRAM cache on the controller (if any is being used as a write buffer) and then gets written to available NAND in SLC mode. Later on, the drive consolidates the data down into TLC/QLC properly, but you get the benefit of a fast write as long as there is enough empty NAND for SLC caching.

    Obviously this falls apart once the drive gets close to full and there is no empty NAND left to use as SLC cache. This is also why the write performance of budget drives tends to drop off more sharply than that of higher-end drives: the nicer drives have faster NAND and usually have DRAM on the SSD controller to prop up performance in the worst case. Enterprise drives often sidestep the issue entirely by using SLC or MLC NAND directly, or by shipping with extra overprovisioning (spare NAND beyond the advertised capacity).
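    A toy simulation makes the drop-off concrete. This is a minimal sketch, not how any real controller works (all names, speeds, and capacities below are invented for illustration; actual firmware manages dynamic SLC regions far more cleverly): writes land in pseudo-SLC while enough free NAND remains, then fall back to slow direct-to-TLC writes once the cache is exhausted.

    ```python
    # Toy model of pseudo-SLC write caching on a TLC drive.
    # Illustrative only: speeds and capacities are made-up assumptions.

    SLC_SPEED_MBPS = 3000   # fast: 1 bit written per cell
    TLC_SPEED_MBPS = 500    # slow: 3 bits written per cell

    class ToyTlcDrive:
        def __init__(self, capacity_gb):
            self.capacity_gb = capacity_gb   # usable capacity in TLC mode
            self.tlc_used_gb = 0.0           # data already folded into TLC
            self.slc_cache_gb = 0.0          # data currently held in pseudo-SLC

        def free_tlc_gb(self):
            # Data parked in SLC mode costs 3x its size in TLC capacity,
            # because each cell holds 1 bit instead of 3.
            return self.capacity_gb - self.tlc_used_gb - 3 * self.slc_cache_gb

        def write(self, size_gb):
            """Absorb a write burst and return the effective speed."""
            if self.free_tlc_gb() >= 3 * size_gb:
                self.slc_cache_gb += size_gb   # burst absorbed by the SLC cache
                return SLC_SPEED_MBPS
            # Cache exhausted: fold cached data into TLC, then write directly.
            self.idle_fold()
            self.tlc_used_gb += size_gb
            return TLC_SPEED_MBPS

        def idle_fold(self):
            # Background consolidation: rewrite SLC-cached data as TLC,
            # freeing cells for the next burst.
            self.tlc_used_gb += self.slc_cache_gb
            self.slc_cache_gb = 0.0

    drive = ToyTlcDrive(capacity_gb=1000)
    for burst in range(9):                   # nine 100 GB sustained bursts
        speed = drive.write(100)
        print(f"burst {burst}: {speed} MB/s, free TLC {drive.free_tlc_gb():.0f} GB")
    ```

    Running it, the early bursts all hit the fast SLC path, and the slow direct TLC writes show up more and more often as free space shrinks, which is the same drop-off the benchmarks capture.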



  • It’s actually less complicated than you’re making it. The reason OEMs don’t build those systems is that AMD and Intel don’t make those chips. The reason AMD and Intel don’t make large monolithic CPU/GPU designs for laptops is that, up until now, the market just wasn’t there for such a product. What segments are you targeting with a strong CPU/GPU combination?

    1. Premium productivity products (Apple MBP segment)
    2. Gaming laptops
    3. Workstation or desktop replacement
    4. Steam Deck-style portable gaming devices

    What are the challenges with capturing any of these segments?

    For segment 1, you’re competing with Apple, an entrenched player that is already very popular with software devs and creative professionals. It’s worth mentioning that Apple drives profitability on its devices via upsells on storage/memory and software services, not just margins on CPUs sold, the way AMD and Intel do. It’s also worth pointing out that AMD and Intel have been competing in this segment with varying degrees of success for as long as it has existed. Meteor Lake in particular is very clearly aimed at bringing Intel designs up to parity with Apple silicon in idle and low-load scenarios.

    For segment 2, the biggest problem is that Nvidia is the entrenched player in the gaming GPU market. It’s an uphill battle to convince buyers to pay a premium for an Intel/AMD-only gaming laptop on the basis of improved battery life alone, especially when an Nvidia-equipped dGPU design will probably offer higher peak performance and most users game plugged in anyway.

    For segment 3, your users are already sacrificing battery life and portability for max performance. If they can get a faster product using separate CPU/GPU chips, they will take that option.

    Segment 4 is the obvious one where such a design is already the best choice. I expect to see new products in this space consistently over the next couple of years, and for those chips to make their way into traditional laptops at some point.

    I generally think these large monolithic designs will see increased adoption across segments over time, but it’s going to be contingent on Intel/AMD delivering a product good enough to compete with and beat Apple’s or Nvidia’s offerings. I just don’t think that’s the case yet outside of niche markets.



  • This reinforces an observation I made when we saw the leaks for the 40-series Super cards: Nvidia is basically shoring up its raster performance against AMD in every segment except the low end.

    • +5-10% would put the 4070 Super just ahead of the 7800 XT
    • +10% and 16GB of VRAM let the 4070 Ti Super compete on more even footing with the 7900 XT
    • A 5-10% improvement puts the 4080 Super decidedly ahead of the 7900 XTX in performance and helps justify the premium, plus makes it look better against the 4090

    It really looks to me like Nvidia is serious about moving more gaming GPUs next year than this year. They had to know the crypto oversupply hangover was going to keep GPU sales slow through 2023, and the 40-series launch was very obviously kept mediocre to avoid forcing fire sales on Ampere parts and to protect margins. Now that Ampere stock has just about dried up, I expect the GPU market to return closer to normal, with better value on the Super cards and probably at least decent value when the 50 series launches.