• Present_Bill5971@alien.topB
    link
    fedilink
    English
    arrow-up
    1
    ·
    1 year ago

    I wish they’d get into a detachable laptop form factor. It’s why I’m excited for the upcoming Minisforum tablet/laptop. The touch experience on GNOME is pretty good, and GNOME and KDE both have touch-centric environments in development that I haven’t tried yet.

  • ResponsibleJudge3172@alien.topB

    Hurray and congrats to AMD I guess? /s

    Ryzen is one of the world’s strongest tech brands now: equal to RTX in some ways, superior in others.

    • SkillYourself@alien.topB

      The Ryzen division’s issues have to do with the gigantic inventory pileup that started in mid-2022 and forced them to undership the market for almost a year. That’s bound to make the shipment data reported by these research firms look funky.

  • lutel@alien.topB

    I can’t imagine anyone in their right mind buying Intel now, or maybe people buy them because they’ve always been on Intel and are afraid of change. I was loyal to Intel for ~20 years (with one AMD purchase, a 386), but their efficiency and progress are now so poor that the brand is over for me.

  • ea_man@alien.topB

    In the GPU market AMD is both lazy and stupid: they just try to undercut NVIDIA a bit. They should understand that they don’t have a product as good as RTX and focus on providing decent low-price GPUs with a lot of RAM in the $200-400 range.

    If someone has $500 to spend they might as well spend $650 and buy NVIDIA; those on a budget, on the other hand, would buy AMD if AMD offered a good 12-16GB rasterization option on the cheap.

    • auradragon1@alien.topB

      they should understand that they don’t have a product as good as RTX and focus on providing decent low-price GPUs with a lot of RAM in the $200-400 range.

      It’s easy to explain from a business point of view. The reason AMD doesn’t want to compete in the $200-$400 range is that there is barely any profit there. GPUs are huge dies with a lot of memory, significantly more expensive than CPUs to manufacture per unit. Therefore, AMD would rather spend its TSMC wafers on Epyc chips than on $200-$400 GPUs.

      Take Navi 32 (the 7800 XT), for example: 28 billion transistors, selling for $500. That $500 also has to cover expensive GDDR RAM, a board, capacitors, and a heatsink.

      Conversely, a 64-core Zen 2 Epyc has 40 billion transistors and sold for $5000+. No GDDR RAM needed. No heatsink fan. No board. No capacitors. Just the chip. $5000.

      So you tell me what AMD should prioritize making.

      Lastly, if AMD starts a price war in the $200-$400 range, Nvidia will respond with something at $250-$450 but slightly faster. Nvidia isn’t just going to let AMD take that market without any resistance.
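The wafer-allocation argument above can be sketched in a few lines (a minimal sketch using only the list prices quoted in the comment; these are per-unit revenues, not AMD's actual margins or costs):

```python
# Back-of-the-envelope version of the comment's wafer math,
# using the figures quoted in the thread (list prices, not costs).
def revenue_per_billion_transistors(price_usd, transistors_billions):
    """Rough revenue booked per billion transistors of TSMC capacity."""
    return price_usd / transistors_billions

navi32 = revenue_per_billion_transistors(500, 28)    # 7800 XT: whole-card price
epyc64 = revenue_per_billion_transistors(5000, 40)   # 64-core Zen 2 Epyc: chip only

print(f"Navi 32: ~${navi32:.0f} per billion transistors")
print(f"Epyc:    ~${epyc64:.0f} per billion transistors")
print(f"Epyc books ~{epyc64 / navi32:.0f}x more per transistor")
```

On those quoted numbers, each Epyc transistor brings in roughly seven times the revenue of a Navi 32 transistor, before even subtracting the GPU's board, memory, and cooler costs.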

      • ea_man@alien.topB

        It’s easy to explain from a business point of view.

        Yeah, but isn’t the customer part of the market model?

        Because there’s ZERO chance in hell I can spend $800 on a GPU, even if I get that it makes more sense for the manufacturer.

        So you tell me what AMD should prioritize making.

        The GPU that I and 90% of people can actually buy: a ~$250-300 GPU, pretty please with 12-16GB of RAM, since RAM is cheap.

    • XenonJFt@alien.topB

      Yeah, why not pour the same investment into a market where a company with 10x the market cap, focused solely on GPUs and AI chips, can simply lower prices to stay competitive? It’s not like AMD also has console, handheld, consumer, and server chips competing with Radeon for its R&D budget.

      Also, people overpaying for Nvidia is exactly what lets Nvidia jack prices up to insanity, which plays into AMD’s undercutting. It’s also why GPU prices will keep climbing if that behavior continues. Look at the 7800 XT: it forced Nvidia to adjust its mid-range pricing both before and after launch. Buying from the undercutter pushes both companies to compete on price, which is what happened in that case.

  • From-UoM@alien.topB

    There are some really good sales on the 7000 series in the US.

    The 7000 series had a rocky start, but the non-X chips and X3D chips are pretty good.

    The upcoming Arm chips, and Intel finally using EUV for Intel 4, will be fascinating to watch.

      • Put_It_All_On_Blck@alien.topB

        CPU value was bad too. AMD had to cut prices immediately after the 13th-gen launch, since Intel was offering better performance for the same price or less, or equal performance for less.

  • XenonJFt@alien.topB

    I want to know why AMD is hesitant to go after the laptop/prebuilt game. Prebuilts, maybe, because brand recognition leans toward Nvidia (some sellers admit it very openly), but for APUs and iGPUs AMD should pay more laptop brands to host its chips. Their 6700M is very price-competitive where I live, like 2060 pricing but much better, yet you just can’t find anything in stock or in catalogs. I don’t know how much Nvidia bankrolls partners to stay loyal and pump out laptops, but AMD should at least try to outbid them. You gain SO much market share from laptop sales, it’s insane.

    • Asgard033@alien.topB

      They probably can’t secure enough fab capacity to do that. AMD has had problems before delivering the volume some of its clients demand.

    • SRFoxtrot341_V2@alien.topB

      Not surprised at all, NVIDIA is still dominant in gaming laptops.

      As someone who owned a laptop with an RX 6500M, I wish I’d gotten an RTX 3050 one instead. But it doesn’t matter; I’m more of a desktop guy anyway.

        • PlsDntPMme@alien.topB

          They already have great APUs with awesome efficiency relative to Intel. They just don’t seem to get nearly enough attention in the higher-end laptop space. I’d love to go Ryzen for my next work laptop, but there’s virtually nothing.

            • PlsDntPMme@alien.topB

              Well, first and foremost, because the Ryzen models are backordered until Q2 of next year. Beyond that, I’m more interested in something slimmer for portability. I’m picky, and it’s stupid, I know.

        • 65726973616769747461@alien.topB

          They still have supply issues: when Intel/Nvidia announce a laptop product, I can get it within 2 weeks; when AMD announces a laptop product, I’m not going to see it for at least 6 months.

          I’m not from the US, btw.

    • Valoneria@alien.topB

      To be fair, it’s not a particularly large market, and I’d wager they’d have to eat a higher-than-worthwhile cost to take market share from Nvidia. They’re content with their APUs for the time being.

  • Astigi@alien.topB

    AMD doesn’t have enough production capacity to make a dent in laptops, and they won’t sacrifice anything else to improve it.

    • T1beriu@alien.topB

      We don’t see AMD in laptops because AMD isn’t successful in the GPU department and doesn’t have the financial power to support OEMs building with AMD hardware.

  • heatlesssun@alien.topB

    AMD is doing well with CPUs these days. GPUs on the other hand… they just aren’t making the same progress there.

    • ManicChad@alien.topB

      Do they need to compete with the top 0.1% of GPUs, or just compete with the 4080?

      I think AMD is playing the long game on RT, and they’re already kings of rasterization. A 4090-class card may not even be needed if their research pans out; they have some proofs of concept that would make the 4090 unnecessary.

      I say this as a 4090 user. All the 4090 does is brute-force performance, when software refinements are what’s really needed at this point. Then build hardware accelerators once the software side is sorted.

      Hell, even Nvidia is waffling on another 90-series card next gen.

    • TheRustyBird@alien.topB

      Huh? They still have the best $/performance cards; it’s only at the absolute high end that they lose to Nvidia, and that’s only for people who won’t bat an eye at dropping $1.5k on a GPU. That’s why MS put them in the new Xboxes: best bang for the buck.

      • Banana_Joe85@alien.topB

        Radeon needs massive investment in drivers and technology.

        They’ve had the raw compute power for a while but struggled to get that power ‘onto the street.’

        Something that still baffles me, considering they seem to do well in consoles.

        • RogueIsCrap@alien.topB

          Radeon needs massive investment in drivers and technology.

          They’ve had the raw compute power for a while but struggled to get that power ‘onto the street.’

          Something that still baffles me, considering they seem to do well in consoles.

          It’s not just about drivers, though. AMD-driven consoles are also weak at good upscaling and ray tracing. It’s becoming a real problem on consoles because resolution has to be dropped significantly when high-quality ray tracing is used, and those consoles then have to upscale back to 4K with FSR 2, which is still pretty bad.

      • BleaaelBa@alien.topB

        Never going to happen, because Nvidia always brings something subjective into the mix, and it always changes.

        • theoutsider95@alien.topB

          That’s what a market leader does: they invent a new market or new features to increase sales and demand, or to differentiate themselves. Just like Apple’s decision not to include a charger with the phone, or the AirPods.

      • dudemanguy301@alien.topB

        “It’s not enough that I should succeed; all others must fail”

        THAT is the true essence of the Ryzen moment: it wasn’t just AMD making a strong move with Ryzen, it was also Intel falling flat on its face. The 10nm turmoil halted both process and architecture progress for several years.

        • Zarmazarma@alien.topB

          I’m pretty sure the intention of that quote is: “I will not be satisfied by merely succeeding; I must see others fail as well.”

      • theoutsider95@alien.topB

        It wasn’t a Ryzen “moment.” It was Ryzen momentum. They kept improving gen after gen until they changed consumers’ perception of the brand.

        Radeon, meanwhile, releases one good gen, then fumbles the next two, and then blames Nvidia’s mindshare.

      • sudo-rm-r@alien.topB

        I thought the 6000 series was their Ryzen moment: better power efficiency than RTX 3000, and the 6900 XT matched the 3090 in raster. I thought the 7000 series would be their Zen 2 moment, when they finally beat Nvidia at something important. Eh…

      • DevAnalyzeOperate@alien.topB

        Nvidia is all hands on deck, pedal to the metal, trying to stop AMD from gaining any market share. They’re worried about becoming a victim of their own success, with the AI boom letting AMD gain a foothold in AI through its open-source strategy. They’re also worried about Intel and Google for similar reasons.

        Nvidia is quite a formidable foe, especially compared to Intel, and it has a massive head start, with a LOT of advantages beyond merely having the best hardware on the market. But I’m still a bit bullish on AMD’s chances here.

    • jtmackay@alien.topB

      Yet for the price, even as an RTX owner, my next GPU will definitely be an AMD. I think AMD is in a good spot on price-to-performance right now. It’s really hard to lead in both CPUs and GPUs at the same time, and nobody else is even really trying.

    • blueredscreen@alien.topB

      AMD is doing well with CPUs these days. GPUs on the other hand… they just aren’t making the same progress there.

      Not enough money. That’s just what it is. It isn’t necessarily a technology problem.

    • Deckz@alien.topB

      I don’t really understand this sentiment. I have an RDNA 3 card, and aside from the high idle power it’s rock solid. Yes, they don’t have Nvidia’s DLSS or RT performance, but in terms of raw performance it’s excellent. Just because something is second best doesn’t mean it’s shit. People who complain about problems with the cards are a slim minority of buyers. I think that’s true of most products (I’ve heard nothing but horror stories about recent Logitech mice, but I have one that’s 5 years old now with no double-click issues).

    • DevAnalyzeOperate@alien.topB

      Their software is behind in the consumer market and WAY behind in the enterprise market.

      Nvidia hitting production bottlenecks should help AMD move a few GPUs in enterprise, though, so long as they don’t throw away their opportunity here.

    • itsjust_khris@alien.topB

      I don’t think there’s the same level of focus there. Even on AI, AMD is pushing hard with ROCm, and progress there has accelerated rapidly of late. They’re landing deals for their upcoming data center products. In comparison, how much revenue can gaming be expected to bring in? Especially since making a gaming GPU means sacrificing a more profitable CPU or HPC/AI accelerator. Furthermore, when you look at the size difference between AMD and Nvidia (Nvidia’s profits exceed AMD’s entire revenue), you see why the current situation has occurred. There’s no way AMD can keep pace with Nvidia given their smaller, divided resources. Not an excuse, just the reality.

      • DevAnalyzeOperate@alien.topB

        Pushing hard with ROCm?

        There are millions of devs who develop for CUDA. Nvidia, I believe, has north of a thousand (can’t remember if it’s like 1 or 2 thousand) people working on CUDA. CUDA is 17 years old. There is SO MUCH work already done in CUDA; Nvidia is legit SO far ahead, and I think people really underestimate this.

        If AMD hired like 2000 engineers to work on ROCm, it would still take them maybe 5 years to get to where Nvidia is now, leaving them still 5 years behind. Let’s not even get into the magnitudes more CUDA GPUs floating around out there compared to ROCm GPUs, since CUDA GPUs started shipping earlier at higher volumes, and even really old ones are still usable for learning or a home lab. As far as I know, AMD is hiring a lot less; they just open-sourced ROCm and are hoping to convince enough other companies to write for it.

        I don’t mean to diminish AMD’s efforts here. Nvidia is certainly scared of ROCm, and I expect ROCm to make strides in the consumer market in particular as hobbyists get their cheaper AMD chips working with diffusion models and whatnot. But on the enterprise side, CUDA is very, very far ahead, the lead is WIDENING, and the only real threat to that status quo is that there literally aren’t enough Nvidia GPUs to go around.

        • itsjust_khris@alien.topB

          CUDA’s moat is being undone by things like OpenAI’s Triton. Soon most ML code will be written against interfaces that let any supported hardware vendor run it. AMD doesn’t have to replicate all of Nvidia’s work, especially when multiple industry giants are all working on undoing Nvidia’s software moat.

          Nvidia’s dominance won’t last forever; they have the advantage, but one day all this AI hardware/software will be commoditized.

      • XYHopGuy@alien.topB

        Because they’re measured on FP64 performance, which is not exactly a useful operation for deep learning.

    • lutel@alien.topB

      AMD, together with Nvidia and Microsoft, is planning new Arm CPUs. Meanwhile, Intel executives sit surrounded by their burning CPUs: “it’s all fine…”

    • DktheDarkKnight@alien.topB

      That’s because you have to add in the consoles, lol. That’s like 95% of their GPU revenue and probably the only reason their GPU business has survived to this day.

    • deefop@alien.topB

      The crazy thing is that’s not even really true. RDNA 2 competed extremely well with Ampere: in raster it was generally slightly better than its direct competition, and excluding the insanity of Covid and the supply shortages, it was always cheaper as well.

      I paid $360 for my 6700 XT in Dec 2022. Now 6700 XTs are routinely around $300 on sale. The equivalent from Nvidia is… what? The 4060 Ti is slightly faster but only has 8GB of VRAM, and it launched at $400 fucking dollars. The 4060 Ti 16GB is the same performance with more VRAM and launched at $500 fucking dollars.

      At this point what’s propping up Nvidia is unfathomable levels of mindshare. The product stack doesn’t actually deserve to sell anywhere near as well as it currently does.

      It’s certainly not unheard of in the tech world. Look at Apple releasing a fucking $1600 MBP with 8GB of RAM and then trying to gaslight their customers into believing it’s better than it is, like they aren’t being ripped off. It’s absurd, but the average consumer is a pre-programmed midwit that relies almost entirely on brand loyalty for purchasing decisions.

      And while I wasn’t super impressed with RDNA 3 at launch, it gets significantly better as the prices slowly creep down. I figure the 7800 XT will creep toward $450 as a normal price, and the 7700 XT will probably creep under $400.

      The 7600, when priced below $250, and especially in the low $200s, like $220-230, is fantastic value as well.

      Even at the high end, the 7900 XT at $700-750 is a solid buy, and the 7900 XTX at $900 or less is very good as well.

      Lovelace is fantastic from a technical perspective, but Nvidia outdid its own previous records for greed with this product stack and its pricing.

      • XYHopGuy@alien.topB

        80% market share says it absolutely is true. The pricing power is evident from all the things you listed. Radeon hasn’t been able to keep pace with the advancements in computer graphics coming out of Nvidia.

    • riklaunim@alien.topB

      Like, is it that bad? Even when they have the better product at a given price point, plenty of normal consumers will still go for the Nvidia sticker, because that’s the brand perception. AMD tried being the cheaper option in the past and almost went bankrupt.

      And it’s not like they aren’t already pushing as much MI300 as they possibly can. Server/AI > consumer.

  • imaginary_num6er@alien.topOPB

    AMD has been gradually increasing its Server CPU market share since 2017, but 2022 and 2023 turned out to be breakthrough years for the company as its share gains accelerated rapidly in recent quarters. AMD commanded a 23.3% unit share in Q3 2023, up from 18.6% quarter-over-quarter and 17.5% year-over-year. The revenue share has increased significantly by 4.7% QoQ and 5.8% YoY. Such major increases may be attributed to the high popularity of AMD’s latest 4th Generation EPYC processors, which were the most popular data center products from AMD in Q3 as major cloud providers adopted them for internal workloads and public instances.
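The share gains in the quote are simple percentage-point differences, which is worth making explicit (a minimal sketch using only the unit-share figures quoted above):

```python
# AMD server CPU unit share, from the figures quoted in the article excerpt.
q3_2023, q2_2023, q3_2022 = 23.3, 18.6, 17.5

qoq = round(q3_2023 - q2_2023, 1)  # quarter-over-quarter gain, percentage points
yoy = round(q3_2023 - q3_2022, 1)  # year-over-year gain, percentage points

print(f"QoQ: +{qoq} pp, YoY: +{yoy} pp")
```

Note these are percentage points of share, not percent growth: the same 4.7 pp QoQ gain corresponds to roughly 25% relative growth in AMD's unit share in a single quarter.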

  • BurgerBurnerCooker@alien.topB

    Going from almost nonexistent on the server side to close to 25% in about 5 years is actually very impressive, given that this particular market tends to be rather conservative and has a lot of inertia to steer in your favor.

    The server market is the fattest segment; Intel needs to get its $hit together. This is very alarming.

    • perflosopher@alien.topB

      And considering AMD has roughly double the cores per CPU that Intel does, AMD probably has a much higher share of CPU cores, and of revenue, than the roughly 20% share in sockets sold.

    • casiwo1945@alien.topB

      When you look at the best offerings from the two companies, AMD is selling its 96-core 3D V-Cache Epyc for almost a third less than Intel’s 56-core Sapphire Rapids, and everything starts to make sense.

      • jaaval@alien.topB

        Then you realize few customers actually buy either of those. Most sales are in the ~30-core products, and it starts to make sense why AMD doesn’t just capture the entire market but is still sitting at 25%, even though the 96-core chip is so great.

      • soggybiscuit93@alien.topB

        The problem with this line of reasoning is that in the datacenter we don’t buy CPUs; we buy full servers from third-party suppliers. Look at a company such as CDW.

        Most server customers aren’t massive hyperscalers that need to maximize compute per rack-U. Xeon servers are plentiful, can be found for less than their CPU MSRPs would suggest, and if we’re looking for a 16-24-core model to spin up a branch office, a lot of the time we don’t even care whether it’s SPR or Epyc. There are other factors, like “is my LOB app certified by the vendor to run on Epyc?”, etc.

        A lot of the time when I need to order, the Epyc servers may be on backorder, or priced comparably, or maybe I specifically need Xeon because I already have a Xeon Hyper-V server and want the new one to work in the event of a failover (best to keep your VMs on the same platform).

        Hell, in Windows Server, Epyc only got support for nested virtualization with Server 2022, so it wasn’t even a consideration when we did a big refresh a few years back.

    • jaaval@alien.topB

      Fat in what way?

      This year Intel has sold around $6-8B in client chips and $4B in server chips per quarter, give or take some hundreds of millions. In their best years the two were about equal, back when Intel could charge whatever it wanted for server chips. On AMD’s side, server chips now outsell client chips if you don’t count the gaming segment ($1.6B vs. $1.5B respectively last quarter), but that’s mostly because of very anemic client sales.

  • Bvllish@alien.topB

    How do mobile, desktop, and server share all go up while overall x86 goes down? The only other major x86 segment I can think of is consoles.