AMD is doing well with CPUs these days. GPUs on the other hand… they just aren’t making the same progress there.
Their software is behind in the consumer market and WAY behind in the enterprise market.
Nvidia hitting production bottlenecks should help AMD move a few GPUs in the enterprise space though, so long as they don’t throw away their opportunity here.
They’re doing well in the server GPU space though. Look up the top 3 supercomputers; you might be surprised.
Because they’re measured by FP64 performance, which is not exactly a useful metric for deep learning.
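For context, the TOP500 ranks machines on HPL, a dense FP64 Linpack run, while deep learning throughput lives in FP16/BF16. A quick sketch of how lopsided that is, using peak vendor spec-sheet figures quoted from memory (treat them as approximate and verify before relying on them):

```python
# Peak throughput in TFLOPS from public vendor spec sheets (quoted from
# memory -- approximate; verify against the actual datasheets).
specs = {
    "AMD MI250X":  {"fp64": 47.9, "fp16": 383.0},  # FP64 vector / FP16 matrix
    "Nvidia A100": {"fp64": 9.7,  "fp16": 312.0},  # FP64 vector / FP16 tensor core
}

for gpu, s in specs.items():
    ratio = s["fp16"] / s["fp64"]
    print(f"{gpu}: FP64 {s['fp64']} TF, FP16 {s['fp16']} TF ({ratio:.0f}x)")
```

The MI250X’s monster FP64 is exactly why it tops Linpack-ranked lists, but it says very little about how the two stack up on transformer training.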
Like, is it really that bad? Even when they have a better product at a given price point, quite a lot of ordinary consumers will still go for the Nvidia sticker, because that’s the brand perception. AMD tried being the cheaper option in the past and they almost went bankrupt.
And it’s not like they aren’t already pushing as many MI300s as they possibly can. Server/AI > consumer.
I don’t think there’s the same level of focus there. Even when it comes to AI, AMD is pushing hard with ROCm, and progress there has accelerated rapidly of late. They’re landing deals for their upcoming data center products. In comparison, how much revenue can gaming realistically bring in? Especially since every gaming GPU they make costs them wafer capacity that could have gone to a more profitable CPU or HPC/AI accelerator. And when you look at the size difference between AMD and Nvidia (Nvidia’s profits are greater than AMD’s entire revenue), you can see why the current situation has come about. There’s no way AMD can keep pace with Nvidia given their smaller, divided resources. Not an excuse, just the reality.
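To put that opportunity cost in concrete terms, here’s a toy wafer-allocation calc. Every number in it is a made-up placeholder, purely to show the shape of the trade-off:

```python
# Opportunity-cost sketch for one leading-edge wafer. Every figure here
# is a hypothetical placeholder, not real pricing, yield, or margin data.
WAFER_COST = 15_000  # USD per wafer (assumed)

products = {
    "gaming GPU":     {"good_dies": 80, "asp": 400,    "margin": 0.35},  # assumed
    "AI accelerator": {"good_dies": 60, "asp": 15_000, "margin": 0.70},  # assumed
}

for name, p in products.items():
    gross = p["good_dies"] * p["asp"] * p["margin"] - WAFER_COST
    print(f"{name}: ~${gross:,.0f} gross profit per wafer (toy numbers)")
```

Even with generous assumptions for the gaming part, the accelerator wins by an order of magnitude or two, which is the whole argument in one print statement.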
Pushing hard with ROCm?
There are millions of devs who develop for CUDA. Nvidia, I believe, has north of a thousand (can’t remember if it’s one or two thousand) people working on CUDA. CUDA is 17 years old. There is SO MUCH work already done in CUDA; Nvidia is legitimately SO far ahead, and I think people really underestimate this.
If AMD hired, say, 2,000 engineers to work on ROCm, it would still take them maybe five years to get to where Nvidia is now, leaving them still five years behind. Let’s not even get into the orders of magnitude more CUDA GPUs floating around out there compared to ROCm GPUs, since CUDA GPUs started shipping earlier at higher volumes, and even really old ones are still usable for learning or a home lab. As far as I know, AMD is hiring far fewer than that; they just open-sourced ROCm and are hoping they can convince enough other companies to write stuff for it.
I don’t mean to diminish AMD’s efforts here. Nvidia is certainly scared of ROCm, and I expect ROCm to make strides in the consumer market in particular, as hobbyists try to get their cheaper AMD chips working with diffusion models and whatever else. When it comes to more enterprise-facing stuff, though, CUDA is very, very far ahead, the lead is WIDENING, and the only real threat to that status quo is that there literally are not enough Nvidia GPUs to go around.
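FWIW, the hobbyist path is less painful than people assume. On ROCm builds of PyTorch, the AMD GPU is exposed through the usual torch.cuda API, so most CUDA-targeted scripts run unmodified. A minimal sketch, assuming a ROCm-enabled PyTorch install and a supported Radeon card:

```python
import torch

# On ROCm builds of PyTorch, HIP devices are surfaced through the
# familiar torch.cuda API, so "cuda" here maps to the AMD GPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
if device.type == "cuda":
    print(torch.cuda.get_device_name(0))  # prints the Radeon/Instinct name

x = torch.randn(1024, 1024, device=device)
y = x @ x  # on AMD hardware this matmul dispatches to rocBLAS
print(y.shape, y.device)
```

That’s also why Stable Diffusion UIs mostly work on Radeon cards on Linux; the pain points are the narrow list of officially supported GPUs and the Windows story.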
CUDA’s moat is being undone by things like OpenAI Triton. Soon most ML code will be written against interfaces that let any supported hardware vendor run it. AMD doesn’t have to replicate all of Nvidia’s work, especially when the industry has multiple giants all working on undoing Nvidia’s software moat.
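For anyone who hasn’t seen it, this is what that vendor-neutral layer looks like in practice: the classic Triton vector-add kernel. Nothing in the source references CUDA directly, and the same Python compiles for Nvidia or, through Triton’s AMD backend, for ROCm hardware. A sketch, not production code:

```python
import torch
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    # Each program instance handles one BLOCK_SIZE-wide slice of the vectors.
    pid = tl.program_id(axis=0)
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements  # guard the ragged tail of the last block
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)

def add(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    out = torch.empty_like(x)
    n = out.numel()
    # The backend (PTX for Nvidia, AMDGCN for AMD) is picked at compile time.
    grid = lambda meta: (triton.cdiv(n, meta["BLOCK_SIZE"]),)
    add_kernel[grid](x, y, out, n, BLOCK_SIZE=1024)
    return out
```

Frameworks like PyTorch 2.x already generate kernels like this under torch.compile, which is exactly the kind of thing that erodes a hand-written-CUDA moat.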
Nvidia’s dominance won’t last forever. They have the advantage now, but one day all this AI hardware and software will be commoditized.
Radeon really needs their Ryzen moment
not gonna happen until they fix their drivers
Never going to happen, because Nvidia always brings something subjective into the mix, and it always changes.
That’s what a market leader does: they invent a new market or new features to increase sales and demand, or to differentiate themselves. Just like Apple’s decision not to include a charger with the phone or the AirPods.
“It’s not enough that I should succeed; all others must fail.”
THAT is the true essence of the Ryzen moment: it wasn’t just AMD making a strong move with Ryzen, it was also Intel falling flat on their face. The 10nm turmoil halted both their process and architecture progress for several years.
I’m pretty sure the intention of that quote is, “I will not be satisfied by merely succeeding; I must see others fail as well.”
Radeon needs massive investments in drivers and technology.
They’ve had the raw compute power for a while but struggled to get that power “onto the street.”
Something that still baffles me, considering they seem to do well on consoles.
It’s not just about drivers, though. AMD-driven consoles are also weak at high-quality upscaling and ray tracing. It’s becoming a real problem on consoles, because the resolution has to be dropped significantly when high-quality ray tracing is used, and then the console has to upscale back to 4K with FSR2, which is still pretty bad.
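The arithmetic behind that resolution drop is brutal. Rendering internally at 1440p and reconstructing to 4K means shading well under half of the output pixels, which is a lot for any upscaler to paper over (the internal resolution below is just an illustrative “quality mode” assumption):

```python
# Fraction of a 4K frame that actually gets shaded when a console
# renders at a lower internal resolution and upscales with FSR2.
# 1440p is an illustrative "quality mode" input; real games often
# drop lower and scale dynamically.
output_pixels = 3840 * 2160
internal_pixels = 2560 * 1440

print(f"shaded fraction: {internal_pixels / output_pixels:.0%}")  # ~44%
```

Drop to 1080p internal for heavy RT and you’re shading 25% of the frame, and FSR2 has to invent the rest.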
It’s not a Ryzen “moment.” It was Ryzen momentum. They kept improving gen after gen until they changed consumer perception of their brand.
Radeon, meanwhile, releases one good gen, then fumbles the next two. And then blames Nvidia’s mindshare.
I thought the 6000 series was their Ryzen moment: better power efficiency than RTX 3000, and the 6900 XT matched the 3090 in raster. I thought the 7000 series would be their Zen 2 moment, when they finally beat Nvidia at something important. Eh…
Consumers won’t be cheering when they get their Zen 3 moment again. They’ve had it before.
Nvidia is all hands on deck, pedal to the metal, trying to stop AMD from gaining any market share right now. They’re worried about becoming a victim of their own success, with the AI boom allowing AMD to gain a foothold in AI through their open-source strategy. They’re also worried about Intel and Google for similar reasons.
Nvidia is quite the formidable foe, especially compared to Intel, and they have a massive head start: a LOT of advantages beyond merely having the best hardware on the market. But I’m still a bit bullish on AMD’s chances here.
AMD, together with Nvidia and Microsoft, is planning new ARM CPUs. Meanwhile, Intel executives sit surrounded by their burning CPUs: “it’s all fine…”
What are you basing that on?
The 7800 XT in particular seems to be doing very well, at least in the EU and US.
Yet for the price… even as an RTX owner… my next GPU will definitely be an AMD. I think AMD is in a good spot on price-to-performance right now. It’s really hard to lead in both CPUs and GPUs at the same time, and nobody else is really even trying.
Not enough money. That’s just what it is. It isn’t necessarily a technology problem.
I don’t really understand this sentiment. I have an RDNA 3 card, and aside from the high idle power it’s rock solid. Yes, they don’t have Nvidia’s DLSS or RT performance, but in terms of raw performance it’s excellent. Just because something is second best doesn’t mean it’s shit. People who complain about problems with their cards are a slim minority of buyers. I think that’s true of most products (I’ve heard nothing but horror stories about recent Logitech mice, but I have one that’s five years old now with no double-click issues).
Do they need to compete with the top 0.1% of GPUs, or just with the 4080?
I think AMD is playing the long game on RT, and they’re already kings at rasterization. A 4090-class card may not be needed anymore if their research pans out; they have some proofs of concept that would make the 4090 unnecessary.
I say this as a 4090 user. All the 4090 does is brute-force performance, when software refinements are what’s really needed at this point. Then build hardware accelerators once we have the software side sorted.
Hell, even Nvidia is waffling on another 90-series card next gen.
The crazy thing is that’s not even really true. RDNA2 competed extremely well with Ampere. In raster it was generally slightly better than its direct competition, and excluding the insanity of Covid and the supply shortages, it was always cheaper as well.
I paid $360 for my 6700 XT in Dec 2022. Now 6700 XTs are routinely around $300 on sale. The equivalent from Nvidia is… what? The 4060 Ti is slightly faster but only has 8GB of VRAM, and launched at $400 fucking dollars. The 4060 Ti 16GB is the same performance with more VRAM and launched at $500 fucking dollars.
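Put rough numbers on it and the value gap is obvious. The relative performance figures below are ballpark assumptions for illustration, not measured benchmarks:

```python
# Street price (USD) and relative raster performance, 6700 XT = 1.00.
# The perf numbers are rough assumptions for illustration only.
cards = {
    "RX 6700 XT (12GB)":  {"price": 300, "perf": 1.00},
    "RTX 4060 Ti (8GB)":  {"price": 400, "perf": 1.10},
    "RTX 4060 Ti (16GB)": {"price": 500, "perf": 1.10},
}

for name, c in cards.items():
    value = c["perf"] / c["price"] * 1000  # perf units per $1000
    print(f"{name}: {value:.2f} perf per $1000")
```

Even if you spot the 4060 Ti a bigger lead than that, the 6700 XT still comes out well ahead per dollar.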
At this point, what’s propping up Nvidia is unfathomable levels of mindshare. The product stack doesn’t actually deserve to sell anywhere near as well as it currently does.
It’s certainly not unheard of in the tech world for this to happen. Look at Apple releasing a fucking $1600 MBP with 8GB of RAM, and then trying to gaslight their customers into believing it’s better than it is, like they aren’t being ripped off. Shit’s absurd, but the average consumer is a pre-programmed midwit that relies almost entirely on brand loyalty to make their purchasing decisions.
And while I wasn’t super impressed with RDNA3 at launch, it gets significantly better as prices slowly creep down. I figure $450 will become a normal price for the 7800 XT, and the 7700 XT will probably creep under $400.
The 7600 is fantastic value as well when priced below $250, especially in the low $200s, like $220-230.
Even at the high end, the 7900 XT at $700-750 is a solid buy, and the 7900 XTX at $900 or less is very good as well.
Lovelace is fantastic from a technical perspective, but Nvidia outdid their own previous records for greed when it comes to the product stack and the pricing.
80% market share says it absolutely is true. The pricing power is evident from all the things you listed. Radeon has not been able to keep pace with the advancements in computer graphics coming out of Nvidia.
I think if RDNA3 had launched at these prices, it would have fared a lot better.
Huh? They still have the best $/performance cards; it’s only at the absolute high end that they lose out to Nvidia, and that’s only for people who won’t bat an eye dropping $1.5k on a GPU. That’s why MS put them in the new Xboxes: best bang for buck.
Not always true
That’s because you have to add in the consoles, lol. That’s like 95% of their GPU revenue, and probably the only reason their GPU business has survived to this day.