• sugar_in_your_tea · 1 day ago

    I’m interested in benchmarks to compare to my current RX 6650 XT, which is pretty similar to the 4060.

    It has 12GB VRAM, which might be enough to mess around with smaller LLMs, but I really wish they’d make a high-VRAM variant for enthusiasts (say, 24GB?).

    That said, with Gelsinger retiring, I’ll probably wait until the next CEO is picked to hear whether they’ll continue developing their GPUs; I’d really rather not buy into a dead-end product, even if it has FOSS drivers.
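As a rough sanity check on what "enough VRAM for smaller LLMs" means (my own back-of-envelope math, not from the thread, and the 20% overhead factor is an assumption): a model's footprint is roughly parameter count × bytes per weight, plus overhead for the KV cache and activations.

```python
def est_vram_gb(params_billions, bits_per_weight, overhead=1.2):
    """Rough VRAM estimate in GB: weights plus ~20% for KV cache/activations."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 7B model quantized to 4 bits fits easily in 8GB:
print(round(est_vram_gb(7, 4), 1))   # ≈ 4.2
# A 13B model at 4 bits is tight on 8GB but fine on 12GB:
print(round(est_vram_gb(13, 4), 1))  # ≈ 7.8
```

By this estimate, 12GB mostly buys headroom for longer contexts or slightly larger quantized models rather than a whole new model class, while 24GB would open up the 30B range.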

    • DarkThoughts@fedia.io · 54 minutes ago

      Got the same card, and you can definitely run smaller models on 8GB. There’s no need to pay 200-300 bucks for a 4GB VRAM upgrade though. Might be a nice card for people on the lower end, but not in our case. But yeah, I’d really like more VRAM too, especially with how expensive the higher-end cards get - which AMD won’t even bother with anymore anyway. Really hoping for something with 16+ GB for a decent price.

      • sugar_in_your_tea · 43 minutes ago

        Yeah, I really don’t need anything higher than 6700/7700 XT performance, and my 6650 XT is still more than sufficient for the games I play. All I really need is more VRAM.

        If Intel sold that, I’d probably upgrade. But yeah, 12GB isn’t quite enough to really make it make sense; the things I can run on 12GB aren’t meaningfully different from the things I can run on 8GB.

    • circuitfarmer@lemmy.sdf.org · 1 day ago

      12GB VRAM in 2024 just seems like a misstep. Intel isn’t alone in that, but it’s really annoying they didn’t just drop at least another 4GB in there, considering how much more attractive it would have made this card.

        • circuitfarmer@lemmy.sdf.org · 24 hours ago

          The industry as a whole has really dragged ass on VRAM. Obviously it keeps their margins higher, but for a card targeting anything over 1080p, 16GB should be mandatory.

          Hell, with 8GB you can run out of VRAM even at 1080p, depending on what you play (e.g. flight sims).

      • sugar_in_your_tea · 1 day ago

        I doubt it would cost them a ton either, and it would be a great marketing tactic. In fact, they could pair it w/ a release of their own LLM that’s tuned to run on those cards. It wouldn’t get their foot in the door of the commercial AI space, but it could get your average gamer interested in playing with it.

        • Da Bald Eagul@feddit.nl · 9 hours ago

          It wouldn’t cost much, but this way they can release a “pro” card with double the VRAM for 5x the price.

          • sugar_in_your_tea · 6 hours ago

            I doubt they will. Intel has proven to be incompetent at taking advantage of opportunities. They missed:

            • mobile revolution - waited to see if the iPhone would pan out
            • GPU - completely missed the crypto mining boom and COVID supply crunch
            • AI - nothing on the market

            They need a compelling GPU, since the market is moving away from CPUs as the high-margin product in a PC and the datacenter. If they produced an AI-capable chip at reasonable prices, they could get real-world testing before they launch something for datacenters. But no, it seems like they’re content missing this boat too, even when the price of admission is only a higher-memory SKU…

        • Chewy@discuss.tchncs.de · 19 minutes ago

          It likely depends on how much they pay for power and how many users they serve.

          E.g. I’d really like AV1 support on my server (it helps with slow upload), but the power cost of a dedicated GPU is unacceptable in my country. The few transcoding streams I’d theoretically need in a worst-case scenario are more than met with an iGPU.

    • DarkThoughts@fedia.io · 2 hours ago

      If that was some OEM design for a retail PC, fine. But fuck off with shit like glued back plates on dedicated GPUs you buy.

  • commander@lemmy.world · 1 day ago

    B770 to hypothetical B9XX is what I’m looking for. Phoronix benchmarks, since not many outlets do Linux benchmarks. 8700-8800 XT or B700-B9XX for me next year.