
Radeon Surrenders to GeForce 

AdoredTV
91K subscribers · 57K views

♥ Check out adoredtv.com for more tech!
♥ Subscribe To AdoredTV - bit.ly/1J7020P
► Support AdoredTV through Patreon / adoredtv ◄
Buy Games on the Humble Store! -
►www.humblebund... ◄
Bitcoin Address - 1HuL9vN6Sgk4LqAS1AS6GexJoKNgoXFLEX
Ethereum Address - 0xB3535135b69EeE166fEc5021De725502911D9fd2

Published: 28 Aug 2024

Comments: 1.2K
@Visualize01 · 1 year ago
Just wanna point out, like some others have, that the quote from Mr. Bergman at 8:20 is largely mistranslated. Gamers Nexus reached out to AMD to get the original quote, which is featured in their latest HW News video. The original quote was: "We're planning our next couple generations of products and we had to ask ourselves how high we can go. So obviously our competitor went to $1600 and 400 plus watts, so we could - graphics is scalable - we could build a $2500 solution and be the highest performance at 600W or some other... At some junction, you just have to say we're going to draw a line on what we're targeting. So for this generation, we said $999." EDIT: As some people have pointed out, yes, the point of the video still stands and the pricing target is nonsense since AMD seems to simply be following Nvidia, but the difference in phrasing here concerning the power target still matters.
@bobcoffee11 · 1 year ago
But the point of the video is that it is all nonsense. They just priced the card after Nvidia set theirs.
@halbouma6720 · 1 year ago
The other thing is, the people spending $1600 on the 4090 aren't doing it just for gaming. They're buying it for professional work too - video editing, AI, etc. Things RDNA3 still doesn't do well yet. It's much more difficult for AMD to come out with a 4090 competitor when it doesn't compete half as well yet in the non-gaming areas. That's another reason why AMD backed off doing an even more expensive, powerful card.
@superior96 · 1 year ago
The point of the video still stands.
@InvidiousIgnoramus · 1 year ago
@@ronald666mcdonald Not really.
@m_sedziwoj · 1 year ago
lol and you believe them?
@takaokita4954 · 1 year ago
Japanese here. I want to address the translation at 8:20. The correct translation would be: "Technically, we could develop a GPU that can compete with theirs. But when you introduce such a GPU, with 600W TDP and $1600, would the gaming market really welcome such a product? We gave it a hard thought and made a decision not to." The part "was accepted by general PC gaming fans" from that translation got it completely wrong. The article was more that AMD thought the market wouldn't accept such a product.
@ChrisM541 · 1 year ago
I heard AMD talking about this before and what you say is 100% correct. Unfortunately though, the market does seem to have accepted such a product - there are enough braindead buyers of the 4090 for Ngreedia to continue raising the price bar (and AMD following suit).
@RobBCactive · 1 year ago
Thanks, I thought that was fishy, it didn't sound right. The point is Nvidia can sell $1,500 & $2,000 GPUs and up because they're also bought by professionals to run software for their work, and with CUDA and support of all the market niches for a decade, when AMD were cash starved, it's NOT a place AMD can break through. So as was said before, the ceiling price for an AMD gaming card is around $1,000; only the server and HPC products can justify the costs of very high end hardware, and there power efficiency matters - using 600W would NOT be welcome.
@ChrisM541 · 1 year ago
@@RobBCactive Remember that Nvidia can sell $2000 cards to... (braindead) gamers also, enough to justify their shocking series pricing - and AMD are extremely happy to attach to that same extortionate price bar!!! Professionals are a different matter entirely. If they rely on software that makes use of CUDA then of course Nvidia cards will be purchased. Understand, though, that CUDA is only a very small segment of 'professional use'. AMD can - and does - compete in the much larger, non-CUDA segment, e.g. AI (check out some recent supercomputers!). To compete further, they merely need to produce cards at a 'competitive' price and power usage that their customers want. Just as AMD are taking significant server etc. CPU business away from Intel, they are doing the same to Nvidia.
@kingofstrike1234 · 1 year ago
Is the 600W right or wrong? Because MLiD asked an AMD source and they said the original was 400-450W.
@RobBCactive · 1 year ago
@@kingofstrike1234 The 600W is plausible in the context, as Nvidia were testing such variants; they dialed the power back down as it brought more trouble than it was worth. Nvidia are left with the massively oversized coolers, but can introduce a 4090 Ti if an AMD AIB card becomes available that challenges the 4090.
@johns.1898 · 1 year ago
Definitely no price fixing going on guys! Never investigate multinational corporations! Especially duopolies
@Mopantsu · 1 year ago
I do not believe it is price fixing or collusion. Just simple economics of charging whatever they feel they can get away with. Crypto showed that people were prepared to pay. If nobody buys then they will have to lower their prices eventually.
@Krazie-Ivan · 1 year ago
no need for collusion this time. AMD & Nvidia don't compete because the consumer decided there isn't a competition, by buying Nvidia when AMD's pricing brought an Nvidia product within reach. decades of this behavior later, AMD has let Nvidia set perf/price points, & simply released the easiest product they could at the highest price their perf would seem tolerable at. we did this ("we" being the avg consumer who falls for marketing trickery & fanboism).
@Namelus · 1 year ago
@@Krazie-Ivan It's still price fixing. The FTC defines price fixing as: "Price fixing is an agreement (written, verbal, or inferred from conduct) among competitors to raise, lower, maintain, or stabilize prices or price levels." Which in this case is inferred from conduct.
@Krazie-Ivan · 1 year ago
​@@Namelus ...i didn't say it wasn't price fixing. i said they don't need to *collude* (a deceitful agreement or secret cooperation between two or more parties to limit open competition by deceiving, misleading or defrauding others of their legal right), like they did prior through emails. i'm not arguing, i'm agreeing with both yourself & the OP. companies/wealthy/powerful are free to commit crimes right out in the open in today's late-stage capitalism, where they run everything, & IF there is a consequence, the penalty is so low it's viewed as the price of doin biz. in the US, the supreme court has even recently decided federal regulators have no teeth.
@justaguy6216 · 1 year ago
@@Krazie-Ivan Remember a fine is consent for the rich.
@coladict · 1 year ago
While I agree their GPU-only sales are not profitable enough for the company to care, they still need RTG to keep the graphics part of their APUs up to speed. That's what gets them Sony and Microsoft for the consoles. That's also who develops the graphics portion of their new MI300 datacenter juggernaut. The accounting might not work in favor of the Radeon group, but they are still vital to the company's future.
@AmurTiger · 1 year ago
The challenge there is that if RTG is around for the consoles/APUs, then AMD did nothing wrong this generation by ignoring the $1600 price point. Making a 600W monster has no application to consoles or APUs. Heck, much of what you'd spend money on making such a monster wouldn't have any application to even a midrange card. So we see an RTG that ignores the upper end of dGPUs and may have a hard time ever getting anywhere significant in market share - that's Jim's case at least.

I'm not sure I agree with it, really, as this generation is far, far more complicated than just "who has the crown", with the huge amount of past-generation stock needing to sell through, the collapse of crypto and overall dim economic prospects, mixed in with an Nvidia that may finally be getting punished a bit for their pricing. Maybe, hopefully. I think 6-12 months after the laptop GPU stuff comes out in numbers we'll have a clearer idea what AMD's focus/strategy is with all this and how they're reacting to events as they unfold.

Going for the crown is one approach, as is pushing price (which they are doing a little), but even just offering differentiated products lower down the stack provides some value over Nvidia. I opted for a 6700 XT since it was enough in raw speed and had the spare VRAM to handle the mods I dabble in. Whittling away with that approach isn't going to revolutionize the market, but were we expecting that? Wake up one day and AMD has returned us the prices of a decade ago and another 50% performance over last gen? I'm certainly not.
@earnistse4899 · 1 year ago
The consoles are a cash cow for AMD; they can NEVER afford to lose them.
@defectiveclone8450 · 1 year ago
@@earnistse4899 Seeing as they have the world's FASTEST gaming CPU, the 7800X3D, and it uses way less power vs a 13900K that's SLOWER!.. and with Intel's GPUs also being power hungry, I don't see AMD ever losing the consoles, simply because they are very power efficient and Sony + Microsoft can keep easy backwards compatibility as a bonus by sticking with them. AMD is also not as greedy as Intel and Nvidia, so they play nice with Sony and Microsoft.
@kamenriderblade2099 · 1 year ago
@@earnistse4899 That means they need to continually improve their GPU technologies if they plan on keeping Sony & MS as their future customers. They can't stop the R&D on improving the GPU and graphics technologies. Just because AMD didn't go mad and develop an OBSCENELY large GPU to take on the RTX 4090 isn't as big of a deal IMO. The modular GCD design allows future iterations to make larger GCDs if AMD so desires. But they need to get their drivers working first. There's no point in making a larger die if their drivers can't deliver on the performance they promised.
@justhitreset858 · 1 year ago
What got AMD the consoles was pricing.
@mcmalden · 1 year ago
The current state of affairs seems to boil down purely to a lack of wafer capacity at TSMC. As you mentioned, high-margin products are then preferred. It cannot be overstated that the 7000-series graphics cards may be competing against the rest of AMD's portfolio, and that AMD is not even interested in actually selling them. Once wafer capacity becomes available, hopefully AMD will want to sell graphics cards again, and we will see more interesting offerings. I would rather bet on this than on Intel shaking things up.
@Ultrajamz · 1 year ago
Good luck if China does blockade Taiwan… we need alternatives… TSMC will hopefully get new locations online ASAP.
@kazedcat · 1 year ago
No, AMD has a target margin of 40% and they won't go lower. The only exception to this is consoles, and that is because consoles give them a fat revenue stream that serves as a life raft keeping the company afloat no matter what happens, as long as they keep that console revenue.
@quajay187 · 1 year ago
Yeah, they are pushing the 6950 XT the hardest right now.
@OugaBoogaShockwave · 1 year ago
Only time will tell what Intel's Battlemage will bring to the market.
@leonfa259 · 1 year ago
AI has become a major part of the reason to buy a GPU, even aside from LLMs, and Nvidia with CUDA made the smart bet. DLSS has become key in gaming to enable more performance than otherwise possible. I dislike Nvidia as much as everyone else, but I will buy them with a 30% markup over AMD just to make sure AI performance is better. Demand is crazy for compute and there should be more than enough market for two players. Any other business would be happy about a 10% profit margin in a strategic sector and a billion in revenue, even with today's interest rates.

And somehow I don't expect their server side to go as well if they reduce their consumer side; one shouldn't see those as two different operations but as one with two different customers. Consumers drive B2B adoption, available workforce, and economies of scale. If an engineer can quickly prototype on his hobby PC at home, he is much more likely to request a similar system with hundreds of thousands of chips for the conglomerate he works at.
@powerpower-rg7bk · 1 year ago
One thing that is overlooked in the analysis of AMD's largest customers is that they onboarded the Xilinx clients. This also includes Microsoft, who has been pushing FPGA hardware in Azure. That does put who that 16% customer is into question. One other aspect is that this is done in raw percentages, only to point out later in the video that AMD has doubled from 2020 to 2022. Again, part of that is from their recent acquisition of Xilinx.

As for AMD going the route of a large die for a GPU, those times are effectively over. What we are likely to see is continuation and evolution of their chiplet strategy. Spin off PCIe + codec + display interfaces into their own chiplet, as you only need one of them per package. A midrange die size (300 to 400 mm^2) as the base chiplet for compute, then scale upward in quantity to get the equivalent of a 600 mm^2 to 800+ mm^2 die. The memory controllers + cache have already been spun off into their own chiplets. It is possible to scale up GPU compute dies, as demonstrated in Apple's M1 Ultra. The other side of chiplets is that you can develop fewer dies and put them together for a wider range of products in a wider range of markets.

I would be surprised if AMD continues separate development of RDNA and CDNA for much longer: they have already spun off much of the functional blocks used exclusively for graphics from the RDNA compute core. The consumer market is likely to get many of the chiplet dies first as a means to discover bugs/errata in them before the same chiplets are used in data center products. Worst case for AMD would be to delay the data center versions to release a new stepping.

Lastly, AMD's graphics group isn't just about the PC GPU market. AMD is now in the business of licensing their IP and developing custom silicon for their customers (consoles). The PC market is the test bed of those technologies to demonstrate their capabilities. The PC board market is indeed getting smaller for them as a percentage of their overall business, but it is an important stepping stone to retaining other businesses.
@Mopantsu · 1 year ago
How quickly people forget that the 1700X was SLOWER than Intel CPUs in a lot of cases (especially single core). All it had going for it was more cores and a better price per core.
@mark11145 · 1 year ago
@@Mopantsu Yep, and my 1700X is still rocking it out today. I figure I'll upgrade when Zen 5 comes out.
@959tolis626 · 1 year ago
@@Mopantsu This. While I do share Jim's overall disappointment in RTG and I think he makes some very valid points, I am still optimistic. Lisa Su's AMD has consistently been what I'd call aggressively conservative. They're pushing the envelope with new tech constantly, but when too much new tech accumulates they take a step back and go for the easy path at first. Zen is the primest of examples. If you look at it today, apart from its core count advantage, Zen kinda sucked compared to Intel: much lower clocks, finicky memory, lackluster single core performance, etc. The only difference between RTG's efforts and Ryzen was that people were actually excited for AMD to come back to CPUs, whereas they never really left the GPU market.

RDNA3 could very well still be RTG's Zen moment. If by next gen they go crazy with multi-GCD cards like they did with Zen 2, we'll be in for a treat. The problem right now is their software, IMO. RDNA3 is plagued by low level driver issues and the performance is hamstrung by them. A much faster and more expensive piece of hardware wouldn't have mattered if those issues existed. So, if RDNA3 is a test bed for chiplet GPUs, so be it. I too would've preferred if AMD had gone the 5700 XT route and competed heavily on price when they couldn't compete in performance, but when your competitor is Nvidia, not playing dirty puts you at a disadvantage.

My biggest gripe really is that they lied during the announcement and set us up with false expectations. They had built some trust with us and they fucked it up. Other than that, if I'm right and RDNA3 is just an initial step of a bigger push, I'm ready to forgive AMD's transgressions. Whatever the case, I still refuse to buy Nvidia.
@kamenriderblade2099 · 1 year ago
@@959tolis626 This was the first time they didn't have Robert Hallock reining in the marketing team's BS. Remember, with the launch of the 7900 XTX, Robert Hallock had just retired. So the new marketing team didn't have a veteran marketing leader to hold back the excess BS.
@959tolis626 · 1 year ago
@@kamenriderblade2099 True that. I did think it sucked when I saw that Hallock left AMD, and you're right, those were the consequences. AMD needs to improve their marketing ASAP.
@thelegendaryqwerty · 1 year ago
Imagine Radeon beating Nvidia's flagship. People would still buy Nvidia anyway, or am I mistaken?
@NielsC68 · 1 year ago
It has happened more than once.
@03urukhai76 · 1 year ago
Nope. The last time AMD won at the top tier was a decade ago. The R9 290X beat the Titan for about a couple of months and Nvidia came back with better products. So after the R9 290X, AMD didn't even try to beat Nvidia ever again. The Radeon division is not crucial for the company. As long as they get console contracts, mid-tier GPU tech is good enough for them. The CPU side is a different story. Most people thought Zen was gonna be Bulldozer 2.0. But AMD knew that Zen had to be better, so they made sure it was better than Intel's tech. The problem with Radeon is that it doesn't need to be better or top tier. As long as it ends up in a console, everything is fine. AMD owns three big console contracts. The money from MS, Sony and Valve made them lazy. So they don't give a damn about PC graphics. There won't be a Zen-like development in the Radeon division, because mediocre tech is good enough for consoles and console sales are good enough for AMD.
@badass6300 · 1 year ago
They did multiple times in the span of 2007-2015 and people still bought Nvidia... Why? Cuz they are green, for all I know.
@bornonthebattlefront4883 · 1 year ago
@@03urukhai76 AMD isn't "just happy with mediocre GPUs". You have to realize Nvidia is desperate compared to them. If Nvidia doesn't win, doesn't hold down the desktop and laptop GPU space, they won't be a company anymore; that's their bread and butter, what keeps the lights on. So even though AMD Radeon may want that top spot, they aren't as desperate as Nvidia is to get it.

RDNA 2 was so very close. AMD came within striking distance of the top spot, swapping blows with the 3090 Ti and making Nvidia sweat. And sweat they did, with a 450 watt GPU. And the 4090 Ti that eventually comes out will likely use 600 watts. Nvidia really pushed things hard to stay ahead of AMD. The only reason why they can continue to do this? Uninformed buyers. Poor marketing from AMD (like calling their top card the 7900 XTX instead of just the 7900). Insane marketing from Nvidia with no one to stop them from talking out of their rear: "3x faster" and all this bull, with nothing even close to that. Nvidia is a powerhouse in marketing.

The only way Radeon wins? They come out 20-25% faster at the same price as Nvidia's best GPU of that generation (it HAS to be the same price, so people believe it's a premium product), and they have to have a near perfect launch, which would mean releasing a month later than they would like, coming in with a ton of stock, and the card having more VRAM and better upscaling than team green. Basically an RX 9900 XTX with 64GB of VRAM at $2,000 and 350 watts, while Nvidia only has an RTX 6090 Ti with 48GB of VRAM at $2,000 and 500 watts. If not, they won't take the top tier market share. They just won't.
@bornonthebattlefront4883 · 1 year ago
@@badass6300 There was a time between Cayman and Fiji that AMD had over 50% market share. But they couldn't keep it up, as Nvidia had 50% profit margins and Radeon had 10%, while AMD's CPU side was losing 10% per chip. Nvidia sweated them out. AMD offered a much better value, and it burnt them. Which is why Lisa Su has said that she won't go lower than a 40% profit margin. Right now they are probably closer to 50%, but still a lot more generous to consumers vs Nvidia's 70%.
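For a sense of what those margin targets mean for sticker prices, here is a minimal sketch of the gross-margin arithmetic. The $500 unit cost is a made-up illustrative number, not AMD's actual bill of materials:

```python
def price_for_margin(unit_cost: float, gross_margin: float) -> float:
    """Selling price needed to hit a target gross margin,
    where margin = (price - cost) / price."""
    return unit_cost / (1.0 - gross_margin)

# Hypothetical $500 per-card cost, purely for illustration.
cost = 500.0
print(round(price_for_margin(cost, 0.40)))  # 833  -> a 40% margin floor
print(round(price_for_margin(cost, 0.70)))  # 1667 -> the 70% figure claimed above
```

The same cost roughly doubles in price when the margin target moves from 40% to 70%, which is the gap the comment is pointing at.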
@falxgod6848 · 1 year ago
Gamers Nexus reached out to AMD about that quote and it was a mistranslation. AMD provided Gamers Nexus with the original English "script" to that answer, which was then translated to Japanese and then translated back to English for this article.
@The_Noticer. · 1 year ago
Who knows what they actually said to that outlet, and then later revised the statement to have less terrible optics.
@NANOTECHYT · 1 year ago
Don't take offense, but you're really grasping at straws. The quote in and of itself is meaningless, even in the GN-fixed translation, as the comment above pointed out. The point Jim's making is that when they are designing an architecture 5 years prior, they're not looking at the price of sale. They might do some bill of materials analysis and such to keep things in the usual range, but they're not sitting down and going "We know in five years' time Nvidia's going to make a $1,600 GPU, so we will make ours $999!". No. They made a mid-range die product from the beginning and they simply priced it at $999.

The whole quote, in either translation, is just typical corporate speak to save face. The fact is they made a mid-range die, used chiplets to save money, and they didn't hit their performance targets, so when they saw Nvidia's pricing, they pushed out lies about performance during the announcement and put it out there at $999. They could have made a larger die; they simply didn't. They could have pushed power; they simply didn't. They could have made a larger cooler; they simply didn't. All those things were planned in advance.
@MagnificentUsernameGuy · 1 year ago
@@NANOTECHYT I agree with most of what you say here. But when it comes to accusations of lies, I think it's important to be precise: AMD did not lie to the public during their _announcement_ of the products. The GPUs _did_ have "up to" 70% uplift in performance. Their presentation was however very misleading, as that was pretty extreme cherry-picking, like _Nvidia_ level cherry-picking. We can speculate on whether or not AMD were lying to reviewers when they were told what performance level they should expect. But again, that might as well be "just" misleading statements as well. It might come across as nit-picking, but I think lying is such a serious accusation that we should make a distinction between misleading marketing and outright lying.
@Safetytrousers · 1 year ago
@@MagnificentUsernameGuy They showed a higher than reality increase in Cyberpunk. That wasn't an 'up to' figure.
@NANOTECHYT · 1 year ago
​@@MagnificentUsernameGuy Define "lie": "to create a false or misleading impression". Interesting that.
@Dangerman-zg3ui · 1 year ago
The 7900 XTX IMV was going to be $1,199, but then they had to set it at $999 when they failed to hit their targets.
@StuvenBarkus · 1 year ago
I had no idea there was such a major gulf in die size between 7900XTX and 4090! What a missed opportunity.
@dvl973 · 1 year ago
A smaller die means better yields and cheaper chip manufacturing, which means higher margins for AMD compared to Nvidia, which means the price could be lower. But it isn't, not really. A smaller die also means information doesn't have to travel as far, and long signal paths are among the most expensive performance losses in a chip: the smaller, the more powerful, the more integrated and efficient, with lower temps. A big die is worse to deal with than a small one. AMD should have been winning at this point, but their strategy is weird; instead of doing to Nvidia what they did to Intel (buttfuck them really hard), they keep measling out their GPUs.
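The yield point can be made concrete with a simple Poisson defect model. The defect density below is an assumed, illustrative figure, not published TSMC data, and the die areas are just round numbers in the same ballpark as the chips discussed in the video:

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Expected fraction of defect-free dies under a simple Poisson model:
    yield = exp(-area * defect_density)."""
    return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

D0 = 0.1  # assumed defect density in defects/cm^2 (illustrative only)
print(f"{poisson_yield(300, D0):.1%}")  # ~74% for a ~300 mm^2 mid-range die
print(f"{poisson_yield(600, D0):.1%}")  # ~55% for a ~600 mm^2 high-end die
```

Because yield falls off exponentially with area, doubling the die size costs far more than twice as much per good die, which is the economic argument for chiplets made elsewhere in this thread.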
@jimtekkit · 1 year ago
​@@dvl973 "Umm Lisa, maybe we should release the new RDNA3 GPU's now?" "No......that's just what they'll be expecting us to do"
@Sal3600 · 1 year ago
@@dvl973 but nobody buys it lol
@badass6300 · 1 year ago
No, the missed opportunity was for people to protest against GPUs over 230W, which used to be the standard for top end GPUs to top out at. 520W consumption on the RTX 4090 is just insane, and so is anything over 230W.
@dvl973 · 1 year ago
@@badass6300 Why? More performance requires more power, especially with a die size like Nvidia uses. The chips are huge because they have lots of separate cores: AI optimized, RT optimized, shader cores, texel, vector, whatever. Unless there is a huge jump in the node (which at this point is slowly but surely reaching physical limits) or in die size somehow, the power won't really go down. Nvidia sees this as a tradeoff: yes, the chips are huge, the cards even bigger, the wattage is crazy, but in a couple of days you'll be able to run a game like Cyberpunk fully path-traced at over 100 fps in 4K HDR, which is kinda insane, albeit with a lot of AI trickery. You can't deny people still love it and buy it for those features, and I'm honestly not surprised; the image quality is more than good enough for most people even with all the AI shenanigans, and the fluidity and visuals are incredible.
@stephanhart9941 · 1 year ago
I thought this was a foregone conclusion! It's like they just don't want market share. They keep ignoring opportunities, not just flubbing them.
@MarkoKraguljac · 1 year ago
It's a duopoly! It's a deal! (likely not formal) They are de facto working together to keep prices high, and everyone forgot they should be competing. It's a sham.
@elon6131 · 1 year ago
Nvidia also keeps capitalizing on every advantage they have, and they don't rest. G-Sync? RT? DLSS? FG? Nvidia isn't trying to maximise PPA, optimise costs at every step, or use not-quite-ready tech just to look good (RDNA3 in a nutshell). They keep pushing their advantage every generation while AMD lamely follows behind. The issue is people keep thinking AMD has magical opportunities to "win", but really all they have is opportunities for single generation relative success (RDNA 2: good raster perf, right? But RT was trash, software features were lacking, etc.) before Nvidia comes in with a massive GPU on a cutting edge node and *even more software features* to destroy it all in a snap. The only way to compete with Nvidia is to spend what Nvidia's been spending on their R&D for decades now. AMD simply can't do that.
@goblinphreak2132 · 1 year ago
Yeah, when I saw that news where AMD was quoted as not wanting to make a high power, high heat card to compete with the 4090 - that's bullshit. Whoever at AMD keeps saying no to these high horsepower cards needs to be fired. More power or GTFO.
@elon6131 · 1 year ago
@@goblinphreak2132 AMD's marketing has been a garbage fire for decades lol. As long as Frank Azor still works there, don't expect it to get any better.
@Fhyrne · 1 year ago
This is probably all premeditated between Intel, AMD and Nvidia. It's all for show, and they are all here to suck out as much money as possible.
@ShaneMcGrath. · 1 year ago
Great to see you posting more videos, always enjoy your perspective on things.
@102728 · 1 year ago
I find this one a bit weird, as if it's either Radeon wins or Radeon is dead. With both RDNA 1 and RDNA 2 we saw the Radeon cards offering better value for raw performance, with Nvidia offering the top performers and an overall better feature set. This allowed me to get about 20% higher raw performance with Radeon than with Nvidia at my budget - with an RX 6600 over an RTX 3050. And this gen so far is weird pricing-wise, but at least the XTX continues that trend, and the XT is slowly falling in line by inching towards $800 retail. And yes, that is compared to Nvidia's 40 series, which offers bad value compared to previous gen, and yes, that means Radeon's 7000 series follows that trend in Nvidia's footsteps.

But like you pointed out, they're playing second fiddle to Nvidia, content with just offering better raw performance with a worse feature set. And yes, they're not correcting Nvidia's pricing by pricing their own products more aggressively, but that doesn't mean that the relative value proposition isn't there. They're not the competition we hoped for as consumers, but at least it's there, offering an alternative. I don't even know what 'giving up on Radeon' is even supposed to mean. I'll just upgrade when my system warrants an upgrade, and I hope I'll have more than just one brand to choose from. And I'll choose the card that I fancy most, regardless of which brand it is - though as I don't expect to get suddenly rich in the coming years, it'll probably be the better value Radeon card from whatever generation is out by then.
@mark11145 · 1 year ago
Well said. Sums up my gaming position as well.
@onomatopoeia162003 · 1 year ago
I would agree with the 'giving up on Radeon' part.
@Fractal_32 · 1 year ago
I agree. Personally I see the RX 7000 series as not bad so far; pricing could be better, but we would have said that even if it was lower in the first place. I personally bought some Radeon cards because of their great Linux support, as opposed to Nvidia's lackluster/terrible Linux support. (It is funny how on Linux it's the opposite of Windows for support and drivers.)
@nothingam9983 · 1 year ago
At the premium price, people will choose the best even if it costs more. How many people will spend above 800 dollars just to play games?
@Pushing_Pixels · 1 year ago
The 7900 XTX is better than the 4080 hands down and costs $200 less. That IS competing on price. It would be nice if it were lower, but that would be AMD cutting its own throat. Previous experience has taught them that aggressive price competition isn't worth it, because consumer brand loyalty is hard to shift with prices alone. At the lower end they do compete on price more aggressively. The RX 6600 class was far superior to Nvidia's low end offerings last gen at a significantly lower price.

I've had both AMD and Nvidia cards, and while I currently have an RTX 3000 card, I'm pretty sure my next card will be AMD (I won't be upgrading this gen). My experience with both has led me to believe that they are equivalent on balance. The "superior features" of Nvidia are being overstated, in my opinion. RT is still a gimmick anywhere but the very high end, and while DLSS is nice, and apparently looks better, I'm sure I wouldn't be able to tell the difference between it and FSR 98% of the time while playing. AMD's software is vastly better than Nvidia's, from what I remember. It's modern, easy to use, all the features are easy to find and toggle, and you don't have to do stupid things like log in just to update drivers (that really annoys me). In the end it's price to performance that is the biggest factor IMO.

With the "giving up on Radeon" thing, I think Jim is a bit burnt by his previous enthusiasm for them, and defence of them, not being validated by how things played out. He's copped a lot of flak over the years, from fanboys and other online idiots and trolls, and I think he let it get to him. He basically got burned out arguing with idiots online, and that's why he doesn't post much these days. He is a smart analyst of the tech scene, but I think he's now overly negative on AMD due to these experiences.
@EthelbertCoyote
@EthelbertCoyote Год назад
I have sort of expected the same thing for a while. The reason I think AMD is in the PC gaming space is mostly to support consoles, hear me out. 1) The console developers are kept much healthier by sales of PC ports, with the high cost of console game development subsidized by a bit more secondary-market cushion. 2) Radeon knows they can always compete on price with Nvidia and likely win the low to mid end, with chiplets making more ROI. 3) AMD can use the PC platform as a huge console tech testing platform that still creates some revenue. 4) With a larger presence in the market than if they were just on consoles, they can shape the market to support and introduce the technologies and standards good for them and consoles.
@TheDiner50
@TheDiner50 Год назад
So RX 7900 is just a beta release of what AMD is going to do for consoles? Maybe it makes more sense than it should. The 7950X3D is also just a beta/tool for developers, but AMD just suckers in people with more money than sense to play beta tester. It still makes no sense that AMD launched an RX 7800 XT as an RX 7900 XT. And even if AMD launched RDNA3 just to be on schedule for investors, why upset people so deeply with the 7900 XT that they look back to Nvidia to buy a GPU? If there was a time to lose margins it was with RDNA3. Even if the name was not possible to change so close to launching the GPU, why not price it low enough to get credibility and praise, instead of damage-controlling RDNA3 performance just to make sales and profit? Better to just not sell the 7900 XT for months than have everyone so against AMD. The 7800X3D delay made people even more against AMD, and now both Ryzen and Radeon have no goodwill left in the public eye. The Ryzen 7000 and AM5 launch was so horrible, and the RDNA3 "8K gaming" marketing made it clear AMD has taken Intel's place. Just for AMD to AMD themselves twice in a row!
@jankratochvil9779
@jankratochvil9779 Год назад
Actually out of touch. The same way the PC helped with rendering back in the day, consoles gained rendering efficiency from PC experience, with properly adapted hardware functionality. AMD actually succeeded insanely well in the console market, and now they are trying to bring the PC closer to the console technically, because it's obviously the PC platform as a whole that suffers from technical rendering inefficiency. That's why you see chiplet or 3D V-Cache technologies. The PC currently just scales horribly in gaming given its brute power compared to consoles. Why would they do that? Because they want to overtake gaming market share itself in the future. Without AMD we wouldn't even have pipelines... Nvidia is historically more about software features, AMD is more about hardware efficiency and the piece of hardware itself. In the end, the power of competition is always welcome.
@EthelbertCoyote
@EthelbertCoyote Год назад
@@TheDiner50 I don't think any company intends to be "upsetting" or evil in any way. I think with the 7900 XT, AMD saw a market segment this year where they could set a "high" target with the XTX, but they really did not have a firm grasp of where the XT should fall price-wise, so they put it close and let it fall to where it needed to be so they could set a good target for lower cards. The 7950X3D is a bit more interesting to me than a beta tool, but the lowest end of the X3D line, the 7800X3D, is I think going to be very close to what next-gen consoles may get as a CPU. I could be very wrong on that however. In consoles and PC gaming, feeding the GPU is the function of the CPU; games are not spiking up hard processing-wise atm for either, but they are for graphics memory. That means more assets need to be loaded faster, but they are not a lot harder to draw; they just need faster access and larger buffer space (VRAM). V-Cache helps by avoiding calls to slower RAM to get these assets, but targeting the minimum CPU you need to feed the GPU is still hard.
@EthelbertCoyote
@EthelbertCoyote Год назад
@@jankratochvil9779 Standardizing the market is what AMD would love to do, sure, but it's still not a win for them if graphics and gaming are becoming less and less important to what a GPU does. One of the reasons framebuffer demands are going up even in gaming is compute, and to do GPU compute you have to move, as I understand it, everything about the task closer to the GPU, into VRAM. PC is not inefficient; I think it's a mistake to call it that. The consoles are just focused hardware. With your "inefficient" PCs you can do way more, even in graphics, than a console ever could, though granted, not as well in gaming. Bottom line, for either company gaming is less and less of a focus and more a way to pioneer new tech to solve the problems of other areas, a two-birds-with-one-stone approach.
@Mopantsu
@Mopantsu Год назад
PS5 Pro where nearly every game will run at 60 FPS minimum at 4K (with VRR, variable resolution and improved RT) is so going to change the market. APU's with locked 1080p60 on high settings could also open up a whole new market for the low end.
@bilalfouzi507
@bilalfouzi507 Год назад
AMD showing everyone that you either die a hero or live long enough to see yourself become the villain. But not in a cool Batman way. They just straight out said fuck it, I don't wanna be the hero anymore, I got kids to feed.
@goblinphreak2132
@goblinphreak2132 Год назад
The thing I hated the most was seeing that news article where it was stated AMD did not want to compete with the 4090; that they didn't want to make a 450-watt-or-more graphics card that ran hot. I find it absolutely appalling. Whoever at AMD keeps saying no to competition needs to be fired. They could drop a 7970 XTX right now that is a 600-watt monster and I would buy it. This idea that we don't want massive fast cards is bullshit. 4090s are selling like hotcakes and AMD is nowhere to be found. The consumers are screaming for it but whoever's saying no at AMD is ruining it. They seriously need to be fired. It's Raja Koduri all over again. Raja didn't want to make large GPUs; he wanted to keep the core count small. And now look at them at Intel: while the chip itself is large, the core count is low. He hates making fast GPUs and said so himself in an interview a long time ago. He wanted to push for max performance at low power. Gamers don't want that. Gamers want the fastest, no matter what. If the graphics card is designed for gaming and meant to be sold to gamers, then it needs to please the gamers. Playing this bullshit low-power, mediocre-performance game doesn't cut it. They selfishly choose what they think gamers want instead of giving gamers what they actually want. Granted, Nvidia is no better, because they make one GPU and sell it across their entire product stack; said GPU is sold for machine learning, AI, gaming, servers, etc. Honestly I'm disappointed with both brands. I miss the old Nvidia, GPUs made for gamers by gamers for gaming, not this generic piece of shit that they pawn off on gamers while they market and profit on AI and servers.... It's sad that you actually think AMD or Nvidia sets their prices based on the competition. That's not how business works. Sale pricing is pre-planned long before the consumer is ever told the product exists. It's not like AMD picks a price one hour before a press conference.
Pure insanity that people like you keep thinking that this is how business works. AMD 100% could have planned to make a GPU that was a thousand bucks. The funny thing is, if you go onto their Reddit, which isn't even officially run by AMD, you will see tons of people saying that they will pay no more than $1,000 for a GPU. So it's obvious why AMD pushed for a $1,000 GPU. It actually makes perfect logical sense, unlike your idea of it being $1,000 only because the 4080 was $1,200.... But I guess ignorance is bliss. You also ignorantly think that AMD lied about performance. The sad part is you were taking aftermarket cards, which are overclocked, and thinking that those are what AMD uses for its reference benchmarks. That isn't how business works. When AMD says 6950 XT versus 7900 XTX, they are comparing reference card versus reference card. The results are actually absolutely correct. The issue is that the overclocked 6950 XT cards you would typically buy from an AIB are already faster than the reference 6950 XT, so the performance gap to the new reference 7900 XTX is smaller. Which you pointed out, but you pointed it out ignorantly, not realizing your mistake. When AMD compares anything to anything, they aren't using aftermarket cards or overclocks. When AMD compares the 7800X3D to a 13900K, they are not comparing it to an insanely overclocked 13900K like a typical Intel enthusiast would run; they are comparing it to a bone-stock 13900K as if it was slotted into a motherboard and tested. You'll even hear people say "well, my 13600K gets the same performance as a 13900K because I overclocked it." Well no shit, you overclocked it. The benchmarks aren't for people who overclock; they're for people who buy a product and use it. Enthusiasts will always have to wait for overclocking information from third-party reviewers. To claim AMD is being deceptive due to your own ignorance is actually hilarious. AMD not competing is the reason why they're not profiting.
Why buy second best when you can buy the fastest? It's basic economics and basic psychology. The basic consumer mindset is "I want the best for my money." AMD releasing a budget GPU for $1,000 that doesn't actually beat the competition won't sell. It's funny how all they have to do is compete. They did compete with the 6900 XT last gen: it literally tied the 3090 when you look at a multi-game benchmark. Sure, pick one game and it loses, but you can pick another game and it wins. They were there. And then someone high up said no no no, wait wait wait, we can't make a GPU that fast because no one will buy it. Whoever that person is needs to be fired.
@goblinphreak2132
@goblinphreak2132 Год назад
AMD's biggest issue is listening to the broke third-world idiots on their Reddit who upgrade their PC once every 5 to 10 years, complaining that they won't spend more than $1,000 for a GPU. And people wonder why AMD chose to limit their reference model to $1,000.... AMD needs to stop listening to the broke third-world minority and start paying attention to the gamers who actually have money and will upgrade if the product performs.
@defeqel6537
@defeqel6537 Год назад
It wasn't that they didn't want to compete, if you look at what they actually said, ie. from the latest GamersNexus video, it was more along the lines of: when we started designing the chip a couple years back we set some targets in terms of costs and power draw, nVidia set different goals. It's not like AMD knew ahead of time what nVidia would do and just decided not to match it.
@goblinphreak2132
@goblinphreak2132 Год назад
@@defeqel6537 Except the early leaks showed a large GPU with eight cache chiplets, and what we got was a smaller GPU with six cache chiplets. Where is that other card that was leaked? Could it be 100% made up? Maybe. But the initial leak of core plus cache chiplets was that huge chip with eight chiplets for a total of nine. That card was clearly scrapped because of the price segment it would have been put in, and AMD chose not to go over a thousand. Looking at AMD's Reddit you can see all the broke bitches crying that they won't spend more than a thousand for a GPU. So of course AMD made their max GPU price a thousand. They're listening to the wrong people. Letting the broke and loud minority on Reddit guide your decisions is not a way to run a business. Whoever at AMD said no to competition and no to a big card needs to be fired. They clearly spend all their time on Reddit instead of in reality.
@morpheus9137
@morpheus9137 Год назад
People will buy a £1600 NVidia GPU in numbers, the same is not true for AMD, they are different companies with different products. I think they got it right personally. NVidia really threw a lot at 4090, huge cooler, high TDP, massive die, new node, this all costs a lot of money, money that NVidia can recoup, but AMD probably can't.
@JohnMillerLifting
@JohnMillerLifting Год назад
Reasons just like this are why I have zero brand loyalty. Yes, I want AMD to do well, but that's mainly just so they keep everyone else in check. I feel the same way about Intel. At the end of the day I don't buy anything until it's out and third parties have reviewed it. Only then will I get what I feel is good value for my money.
@brianhiggins2399
@brianhiggins2399 Год назад
I had a 6900 XT at launch and now I have a 7900 XTX, and I find it a big improvement over the 6900 XT at 4K. Also, AMD made a great improvement to ray tracing, as the 7900 XTX beats Nvidia's 3090 Ti in ray tracing, which is pretty good in my eyes.
@TrojanHell
@TrojanHell Год назад
Recently purchased a 6950 XT and I think I will sell/return it and upgrade too. The 6950 XT is amazing, especially compared to the 5500 XT I had before it, but it still doesn't give that sense of relief of not having to mess with any settings except to max them out, while getting at least 60fps at 4K.
@Derpynewb
@Derpynewb Год назад
Last gen ray tracing with current gen raster. The 7000 series is nothing special imo. It's like the 6000 series, except because Nvidia is being extra anal with pricing, everyone wants to stroke AMD.
@Phil_529
@Phil_529 Год назад
8fps in Portal RTX. Killing it with that 7900XTX.
@raeferwilson2599
@raeferwilson2599 Год назад
​@@Phil_529 Portal ROFLMAO
@Phil_529
@Phil_529 Год назад
@@raeferwilson2599 FSR2/3 ROFLMAO
@LannisterFromDaRock
@LannisterFromDaRock Год назад
It's especially sad because they are the only company atm who could work on synergizing their CPU+GPU units to achieve better results. :/
@AoyagiAichou
@AoyagiAichou Год назад
Remember AMD Fusion?
@defeqel6537
@defeqel6537 Год назад
funny enough, they are already doing that in consoles, and data center
@PaulSpades
@PaulSpades Год назад
meh, like all modern ARM devices and the consoles? We need a unified programming model and libraries, besides the hardware unified memory.
@ChrispyNut
@ChrispyNut Год назад
At least they're finally putting GPUs in all their desktop CPUs now. Was a baffling decision not to do this a decade ago. Soooo many systems are perfectly fine with an iGPU, which made Intel a far more obvious choice in a lot of those cases. I hope they still make "proper" "APUs" as well though, for when a little extra grunt's required, but a mid-range GPU's overkill (adding cost, space and additional point of failure).
@TheTardis157
@TheTardis157 Год назад
@@ChrispyNut I think they did that to keep costs down for people who wanted a discrete graphics card, instead of wasting money on an integrated GPU they will never use. They hands down make the best APUs though. I hope they keep making strong ones that can rival or beat entry-level cards.
@The_Noticer.
@The_Noticer. Год назад
AMD exists so Nvidia can claim it doesn't hold a monopoly, when in fact it does. But this will keep the antitrust off their backs. Which makes sense considering AMD/Nvidia/Intel all have the same major shareholders. Lo and behold, the newcomer Intel also prices to match. Who'd have thought. Just like in politics, it's a collection of players that pretend to be engaging in competition/debate but are in reality alike and have the same owner. And that also means that you cannot "vote with your wallet", because you're still paying either of these two disingenuous companies. Unless you forgo the sale completely.
@charliebrownnz1
@charliebrownnz1 Год назад
Great video - but I'm going to provide a steelman argument for AMD's current pricing: global economics. Economic projections for the global economy are terrible - spending on luxury goods is going to drop substantially - so the market for high-end consumer goods is likely going to shrink. People aren't likely to remember the really competitive products that were available at a point in time when they couldn't afford them. So AMD could just be staying in a holding pattern - don't rock the market, and wait for things to recover before pushing for huge market share. Why show your best hand when the rewards are terrible? As I said - this is a steelman argument. I don't necessarily agree with it, but there is an argument to be had.
@Mopantsu
@Mopantsu Год назад
Agreed. We have not even seen the worst of it yet. Massive economic upheaval is on the horizon due to printing money. QE/QT, BRICS, De-Dollarization, banks failing, wars, shortages. Crisis after crisis.
@charliebrownnz1
@charliebrownnz1 Год назад
@@Mopantsu And there is the Taiwan issue as well.
@trousersnake1486
@trousersnake1486 Год назад
I thought AMD really had a strong showing with the 6000 series, then they completely fumbled the 7000 series.
@Moon-ty2hn
@Moon-ty2hn Год назад
only two cards have been released yet, chill out
@bleack8701
@bleack8701 Год назад
@@Moon-ty2hn they've fumbled those cards and haven't released anything since. Where's the mid range? The low range? Nowhere, because they're waiting for nvidia. They're letting nvidia do whatever they want
@wahidpawana424
@wahidpawana424 Год назад
@@bleack8701 Yep, people are hungry for mid and low end. Instead of spearheading those markets with newer cards, they are waiting for Nvidia to gobble up the market with their leftover 30 series and then playing catch-up.
@Moon-ty2hn
@Moon-ty2hn Год назад
@@bleack8701 High-end always comes out many months before anything else. Meanwhile Nvidia releases an 8GB card in 2023 for $600. If you don't care about ray tracing, the XTX beats the 4080 and isn't too far off the 4090, for $100-300 and $600-1200 less respectively.
@Mopantsu
@Mopantsu Год назад
@@wahidpawana424 AMD still want to clear out the 6000 series, of which they have tons. Until they clear out the 6800 and 6700 stock, don't expect to see the 7800 and 7700 cards.
@xFluing
@xFluing Год назад
It was proven that the "600 watt" quote was grossly mistranslated. I think Steve from GN made it clear.
@we0921
@we0921 Год назад
AdoredTV upload notifications are always a pleasant surprise :)
@HuntaKiller91
@HuntaKiller91 Год назад
Imho their business strategy is subpar because of pricing. If the XTX sells for $900 now and their XT for around $650-700, they should sell well even with fewer features.
@elon6131
@elon6131 Год назад
These are not cheap to make.
@sinephase
@sinephase Год назад
if their profit is so low I don't think they can get away with pricing it cheaper, though
@morpheus9137
@morpheus9137 Год назад
Prices outside the US are typically 20-30% higher. Which does seem to make them expensive.
@really7187
@really7187 Год назад
@@morpheus9137 For a big part that's because prices outside the US include VAT, while listed prices in the US are without sales tax/VAT added. In the EU, VAT is on average 21%, whereas in the US sales tax is much lower, between 2.9 and 7-ish percent, and also different per state. There are of course countries that will end up paying even more because of bad exchange rates vs the dollar.
@jooch_exe
@jooch_exe Год назад
High end graphics cards used to be a market space where you could buy the future graphics technology. That's what SLI was all about, which happened to push power supplies to the next level.
@lowrivera
@lowrivera Год назад
I waited for an entire year before I finally built my first personal Windows machine in over 15 years. I got an XFX Radeon 7900 XT, as the performance increase for the price of the XTX wasn't worth it, IMO. I paid $840 for my XFX 7900 XT (early March 2023); the price has now dropped another $10. But I had zero intention of paying $900+ USD. All said, I built it exclusively for gaming. The performance paired with the 5900X has been amazing for me. Loving it.
@ChrisM541
@ChrisM541 Год назад
Damn annoying for the customer, but AMD are simply jumping onto Ngreedia's price bar, 100% unwilling to repeat their aggressive Zen 2 pricing strategy. It's clear AMD has been collaborating with Nvidia on pricing, since any fanboy argument over the benefits of RTX/DLSS/AI/etc could be countered if AMD were only to make a serious attempt to undercut Nvidia. Nothing wrong with the 7900 apart from pricing. Of course, the same applies to Nvidia's cards, though they're price-set at even more sphincter-clenching levels.
@KyleJohnsonVA
@KyleJohnsonVA Год назад
16:43 I gotta throw up the red flag here. In my own experience with ChatGPT I have found that it will absolutely throw out WRONG information if what you are asking about is not something it was trained on. It also is incapable of doing math. I'll ask it for information on something like calculating the kinetic energy of a projectile, and not only will it pull a completely incorrect formula out of its ass, it will also give you an "answer" that wouldn't even be correct if you followed the formula it presented you with.
@TM86880
@TM86880 Год назад
Jim, a pleasure to hear from you and see your point of view! It's really sad to see RTG lose all of the goodwill they could have earned with the Radeon 7000 launch, with issues like the vapor chamber problems, lack of proper drivers, etc. However, I'll still support them, as I don't find the offerings from Nvidia worth my $$$. I can't use ray tracing in any multiplayer games, so I find that technology "pointless" for my use case. Also, now that the crypto boom has died, asking customers like myself to purchase these cards at these inflated prices is beyond nonsensical! I would like RTG to really ensure that their cards have good use cases, such as proper support from software developers like Adobe etc. If they can get more developer support, I believe this can help in getting more customer buy-in. Looking forward to the next video drop!
@jerkojuric6137
@jerkojuric6137 Год назад
Great to hear you as always Jim! I have to say that as someone who built his first gaming PC with an AMD Athlon XP 2000+ Palomino and a GeForce4 Ti 4200, I've been pretty into this whole CPU/GPU saga for the last 20+ years. My take on this RDNA3 story is quite simple - AMD quit on the GPU crowd because it is not worth their time. And I am 100% behind their decision! They have the console market locked, and there is no threat from Nvidia or Intel in the foreseeable future... or ever.

A similar story is the mobile CPU division. They have had the best price/performance ratio and excellent efficiency for the last 3 generations (5 years) of mobile Ryzen CPUs and still have maybe 15% of the laptop market... maybe not even that much. The latest Intel mobile CPU draws more power than an R9 5950X3D, so right now AMD management is like - WTF people... are you insane??? The mobile RDNA2 6800M/6600M are also great GPUs, and I can bet that in the high-end segment of gaming laptops they could not get 1% of the market.

I think Nvidia spent more on marketing in the last 5 years than AMD spent on R&D for both CPU and GPU, and you just cannot fight that. The next RTX 5090 will be $2500, marketing will make it a great product, and people will buy it. In the meantime AMD will use every square mm of TSMC's wafers to make the products where it is appreciated - datacenters. Epyc and MI cards will rule in performance, efficiency and design (chiplets all around) and make 70-100% margins. For the client segment they will make GPUs comparable with Nvidia's under $1000 and be happy. For Nvidia I predict the same future as for German luxury car makers like BMW or Audi - they are running out of customers able to pay a 100% premium over the competition for 20% more performance. Bottom line - consumers quit on AMD in the mobile/desktop GPU segment, so now AMD has quit on them.
@MrPrimiR
@MrPrimiR Год назад
I wish you could have a private conversation with Lisa and carefully inform us what's going on behind the scenes at AMD -while also giving her your perspective. It's very frustrating b/c it feels like AMD is shooting itself in the foot on purpose with every decision.
@badass6300
@badass6300 Год назад
And then he can talk to the Illuminati and tell us what they are up to... You don't just talk to CEOs and if you do, they won't tell you anything of value.
@The_Noticer.
@The_Noticer. Год назад
You think Lisa is ignorant of this? What reality do you people live in. This is all cynical pretense to rack up the prices. They are pretending to compete when they're not. As Jim specifically said, they could very well release products to undermine Nvidia in price and performance. But they're not. They would rather just trail the 'competition' so Nvidia can always have the "fastest" and have the "halo product premium", because we've accepted that stupid premise. Then all AMD has to do is slot right behind that premium and boom, prices raised again for a generation. Repeat.
@gungrave10
@gungrave10 Год назад
​@@The_Noticer. Jim did warn us about buying Nvidia and letting them get away with price increases every year. Now it's happened, and we only have ourselves to blame.
@The_Noticer.
@The_Noticer. Год назад
@@gungrave10 You don't get it, dude. You cannot vote with your wallet when they're not even actually competing. You can see how slim their margins are, for whatever reason that is the case. So they're just doing this fake back-and-forth, slotting in between each other's product stacks instead of aggressively undercutting, because they want prices to be higher. You cannot "vote with your wallet" in this scenario. Which is true for more things than just this GPU thing. You have the same lack of choice when choosing politicians as well. They all bow down to US/IL here in the west. And it's all different flavors of fake dichotomy.
@Morphium14
@Morphium14 Год назад
@@The_Noticer. Yeah, I agree. And there has to be a price-fixing agreement between those two. Otherwise I can't explain why AMD doesn't price their cards more aggressively.
@sinephase
@sinephase Год назад
My thought for AMD giving such good deals to M$ and Sony was to get devs to optimize for AMD architecture but there's still few games that do better on AMD than nvidia cards.
@bigcjm
@bigcjm Год назад
Yeah, and the consoles, despite having 70-class power on release, were 2060 performance and have remained so.
@m_sedziwoj
@m_sedziwoj Год назад
Imagine what would happen if consoles were on Nvidia too. AMD GPUs would be trashed.
@worldtownfc
@worldtownfc Год назад
For consoles, titles are optimized for the custom Zen 2/RDNA 2 SoC, but we don't see that translate uniformly to PC ports. Nvidia has an army of software engineers getting their drivers to reduce any deficiencies in AMD-favored titles, and they have overbuilt super high-end cards like the GTX 1080 Ti (352-bit memory bus/GDDR5X) and RTX 3090/4090 (384-bit memory bus/GDDR6X) to brute-force away the difference. AMD is not willing to sponsor more game engines, which ties into the point in the video that Radeon Technologies Group is cutting costs to be more profitable. The next Tomb Raider game is going to use Unreal Engine 5 instead of the in-house Foundation engine from the recent prequel trilogy, which favored AMD. AMD could have sponsored the next Tomb Raider engine, but allegedly they gave that up due to high costs. AMD loses a key benchmarking pillar in Tomb Raider and, as a result, is going to have a tougher time with fewer benchmarks to win.
@badass6300
@badass6300 Год назад
The GTX 1080Ti was a slightly above mid range GPU: 471mm^2 out of 851mm^2 reticle limit. The GTX 1000 series was the most overpriced at MSRP generation ever, which is what enabled Nvidia to take commanding lead over AMD since then, though I'd argue rdna2 beat ampere, mainly due to the process node difference.
@worldtownfc
@worldtownfc Год назад
@@badass6300 Yeah, Pascal was a big price increase, but the 1080 Ti delivered the goods on ~2X performance at 4K vs. 980 Ti. NVIDIA was head and shoulders above AMD's Vega 56/64 and solidified premium pricing power for the RTX 2000 series. RDNA II was extremely competitive against Ampere, but AMD failed to seize the moment when cryptomining took over. Gamers were left to the scalpers or wait until the crypto crash to get RDNA II/Ampere GPUs at the tail end of the generation for decent prices.
@Xearin
@Xearin Год назад
Great stuff once more Jim. I still have a gut feeling AMD is high on the APU sauce. If they can shift everything into one package and reduce the need for dedicated graphics, it would be something to see them trying to make Nvidia obsolete while not having to compete in dedicated graphics.
@TheDiner50
@TheDiner50 Год назад
APU? At best that is a task for 3nm, super expensive laptops, or really low end. The Xbox Series X and PS5 are AMD's APUs. We know that the AM5 socket can barely hold 16 cores currently. Even if Zen 5 or Zen 6 crams 8 cores into a small 3nm die, the space left for the GPU is what? An RX 6600 XT? An RX 6750 XT? Using as much high-cost silicon as possible just to avoid a PCIe card? What are AM5 and PCIe 5.0 meant for then? Just to support more M.2 SSDs and Nvidia GPUs? Just to beta test PCIe 5.0 for servers? I've got a 5600G with Vega 11 graphics. It is amazing, and almost all the SFF soldered-down APU prebuilds from Lenovo and whoever are awesome. Still, they are awesome for someone running Linux or barely doing anything in Windows - work computers etc. If the 7600 XT and 7800 XT manage to avoid the disaster that was Navi 31, and have drivers as good as RDNA2's, that is basically a better use of silicon. Instead of cramming a big GPU into a CPU, just put a decent-sized GPU die on a card and make it affordable. Again, Navi 31's only real clear win was that it has enough VRAM to not look completely stupid.
@Speak_Out_and_Remove_All_Doubt
This might be my eternal optimism talking, but I actually read the current situation slightly differently. Nvidia is clearly not interested in bringing GPU prices down - or can't much, because of how big their dies are and how expensive 4nm is - but either way it means AMD can do their normal thing of pricing below Nvidia while this time actually making some profit for a change, which will hopefully mean they don't give up.

Also, how I'm reading all the rumours at the moment is that AMD might actually surprise us and launch RDNA 4 much earlier than expected and take the fight to Nvidia that way. I think AMD are pretty far down the line with RDNA 4 development. N31 had issues, and they had a choice of either spending a fair bit of money on a re-spin to try to fix those issues for a refresh release around Sept/Oct, or using that money and manpower to pull in the launch date of RDNA 4. And I believe they went with the latter! So even though it looks like Nvidia has already reacted to this by cancelling their own refresh too, it all depends how far along they are with Blackwell development and how much they can pull in the launch date. But if AMD can get an 8900 XTX to compete with Nvidia's current 4000 series for maybe 4-6 months, that would result in massive sales for Radeon at decent margins.
@badass6300
@badass6300 Год назад
The RTX 4070 Ti has a 295mm^2 die, smaller than the RX 6700 XT and on par with the GTX 1660 Ti and RTX 3060. The RTX 4080 is 389mm^2 and the RTX 3070 is 392mm^2, yet the 4080 costs more than double.
@arenzricodexd4409
@arenzricodexd4409 Год назад
....pipe dream
@PaulHawxby
@PaulHawxby Год назад
I can kinda understand AMDs thinking. The sheep buy on brand, and the Nvidia brand is strong. Even if they came out and blew away Nvidia at every price point their sales wouldn't reflect it. It sucks massively but the consumer is as much a problem as AMD.
@morpheus9137
@morpheus9137 Год назад
Agreed, AMD must first win market share at the low and mid range before attempting the high end. It's a long game. I don't think you can count the period before Lisa when saying they had 10 years. In reality they have had about 7 years with Lisa, and for some of that they had financial issues. The company almost went bankrupt. The balance sheet looks far better now.
@XenZenSen (a year ago)
Well, this is how people felt about Intel and AMD until Zen came out; it took until Zen+ to really turn some heads and Zen 2 to build hype. AMD needs a win. The 6900 XT was actually great. If they had succeeded with the 7900 XTX and dared to push the power further with a 7950 XTX, they could have cashed in on the halo effect. I'm getting an Nvidia card because AMD still has issues with VR.
@spoonikle (a year ago)
Geez, that HD 4850 got me hooked on AMD ever since. I guess I am the kind of girl who thinks they can change him…
@Dilo22 (a year ago)
The HD 4000 and 5000 series were truly the peak of GPUs.
@Loundsify (a year ago)
Loved my 4850, ran MW2 at 1080p ❤
@HighYield (a year ago)
Really enjoyed your analysis, let's hope AMD decides to compete in the high end again.
@Argoon1981 (a year ago)
They will not, and even if they did, it wouldn't do them any good; history has shown that again and again. Nvidia has reached a level of public perception that is very hard to beat. I have been following the GPU market for 25 years now.
@Fezzy976 (a year ago)
They did: the 6000 series was a true head-to-head with Nvidia, only around 5% faster in raster performance, and look how it turned out. Nvidia still sold more GPUs. The 7000 series was a dud due to the MCD architecture. Hopefully RDNA 4 is another RDNA 2 moment; if it's not, I too am done with AMD GPUs.
@jelipebands1700 (a year ago)
One thing Jim did not talk about is that AMD will continue to integrate their CPUs and GPUs together. Who will be buying low-end Nvidia GPUs when AMD and Intel APUs crush 1080p gaming?
@RobBCactive (a year ago)
Nvidia can profit from $1,500 and $2,000 cards because of the professional users who buy high-end gaming cards rather than the workstation ones with fully enabled drivers. There are only so many whale gamer buyers who want the best, and even when AMD produces the best card, it has to be repeated twice before perception changes. Buyers are NOT objective and have brand loyalty.

I think, though, that disaggregated chips are a core AMD strength, and Tom's new video on AMD and Nvidia strategy is closer to the mark; Jim is impatient and wants a gaming high-end challenge, while making profitable cards, doing long-term R&D, and leveraging the CPU/Xilinx synergies can create disruptive breakthrough products. Look at RDNA 2's Infinity Cache and those VRAM sizes: the engineering fitted the market's needs better, and new games are no longer running well on the RTX 30 series where Nvidia skimped on VRAM. MLiD's latest video "RDNA 4 Fights RTX 5000 with Complexity: Does AMD have a better Strategy than Nvidia?" seems more balanced, with no financials quoted without reference to general market conditions and write-downs made by competitors.

The bottom line is, the MCDs and V-cache allow using cheap 6nm and 7nm processes for things like VRAM buses and cache that don't scale, while the core GCD die on the best node can enjoy better yields and be made cheaper. RDNA 3 was the first step; there's a flaw that killed its performance goals, but there'll be more products. Nvidia is always pushing up costs and seeking the most lucrative market; once their binning advantage through large scale is lost, Nvidia, like Intel, will have technical challenges to overcome, ones that AMD has been tackling by stepwise refinement in a steady, methodical, iterative manner for more than 5 years with real products.
@holthuizenoemoet591 (a year ago)
This is a classic economic problem: if AMD builds a faster chip, Nvidia will respond, etc. It's similar to the prisoner's dilemma.
@AmurTiger (a year ago)
Companies aren't our friends, be it Nvidia, AMD or Intel. The flip side to that is that we can't expect companies to give gamers something without there being some evidence that it will help their bottom line. AMD's got a ton of evidence that if they push price/performance, gamers will largely ignore them and they'll just have a division making less money than it could. Until Nvidia sales get punished, be it by gamers shifting to AMD (the only viable alternative in discrete GPUs) or by a YoY decline in sales overall that's notably worse than general economic trends, there's no reason for AMD to believe that chasing market share with price is going to get them anywhere.

A chance for this to change came with the anemic launch sales Nvidia managed, which at least suggests that the stranglehold Nvidia has on the gaming consumer may be loosening. What we've seen from AMD hasn't been dramatic, certainly, but a steady whittling down of prices has it surpassing Nvidia in Mindfactory sales, a small sample to be sure. The reasoning is easy enough to understand: they want to test whether the market actually reacts to a lower price with higher sales, not lock themselves into a lower MSRP that doesn't net them the sales needed to make up for the profit lost by cutting margins. Not trying to make a top-tier ~$1600 card is a shame, absolutely, for those in that market, but I suspect they saw such a fight as a considerable gamble, and as you noted, their approach seems to be more focused on extracting cheaper costs out of chiplets than on maximizing performance. That, along with being less stingy with memory than Nvidia, still allows them to have compelling offers at some price points, hence the Mindfactory results. Certainly that's a big part of why I went for an AMD card: a 4090 was just way more performance than I needed, and my use of mods means I couldn't tolerate Nvidia's VRAM stinginess lower down the product stack.

If these positions were reversed, I'd have to look at getting Nvidia instead. I don't think AMD's position in the market is quite that hopeless, but neither do I expect them to swoop in and 'fix' the GPU market any time soon; they have just as much to gain from high GPU prices as Nvidia, after all, and only consistent rejection of those higher-priced offerings by gamers will bring it back to earth. At best we're seeing the first hints of that this past year or so. Also, as you pointed out, Jim, AMD's got a lot of other markets to focus on. Dedicating a ton of resources to decisively push Nvidia in their key market, when they still have only a quarter of Nvidia's market cap, doesn't make a lot of sense compared to focusing on the markets Intel is struggling to hold onto, and these days Intel is much closer to being a peer size-wise than Nvidia is. This doesn't mean anyone should be happy or satisfied with somewhat back-burner efforts from AMD, but they're not going anywhere either, and as a healthier company than they've been in years, they at least have the option to tackle Nvidia, even if they're a bit gun-shy about it thus far.
@DaTanMan99 (a year ago)
I've read through so many comments in this thread and yours is the best take by far. Consumers are so keen to jump on AMD for not chasing market share by offering GPUs for essentially zero profit, but they fail to realize, perhaps because they only know how to see things from their own perspective, that AMD's or any business's actions largely depend on what benefits them the most. History has shown that the last time AMD chased market share, they lost any market share gains because they made less profit than Nvidia in the long run. Profit is what allows companies to grow, not market share. Without profit, there's zero reinvestment into the company. So, AMD now knows that 1) chasing market share by trimming margins is essentially robbing Peter to pay Paul, and 2) Nvidia's mindshare is too dominant for them to match Nvidia on profit margin. What this means is that even if AMD and Nvidia had the 50-50 market share and revenue split certain consumers so desperately desire, Nvidia would be making more profit per dollar of revenue and thus swing back harder in subsequent generations. Lastly, opportunity cost is a BIG consideration in any company's decision making. Why spend $1 to hopefully make 10c in profit when you can spend $1 to make a more guaranteed 50c in profit? The former is investing in RTG against the behemoth that is Nvidia, and the latter is investing in CPUs and data center against the likes of Intel. Fighting Nvidia is trying to ice-skate uphill, so why fight gravity if you don't have to? AMD has finite resources, and they'll chase all other alternative revenue streams with better profit margins and return on investment first. If and only if AMD has exhausted all other revenue streams that generate better returns than going up against Nvidia will they have the incentive to do so.
@fturla___156 (a year ago)
You are correct. AMD isn't really trying to out-compete Nvidia. It's how and when they sell their GPU hardware that tells you they don't want the lead role in selling video cards. They are happy simply undercutting Nvidia, but at the maximum price they think the public will tolerate.
@andersjjensen (a year ago)
Oh, they want it. But they also want to make decent margins doing so, and that is currently not in the cards. RDNA 3 didn't pan out the way they thought it would.
@ArtisChronicles (a year ago)
Regardless of the role, it's the customer's duty to keep prices in check. That only happens if people aren't buying.
@andersjjensen (a year ago)
@ArtisChronicles Exactly. But people are. PC gaming has become a sport for rich kids and a pastime for the nerds who grew up with it and are now engineers in high-paying jobs. It is no longer the domain of the plebs; those get consoles.
@andersjjensen (a year ago)
@nowayslowman I'm 45, but nice try. The problem with just using an inflation calculator is that it doesn't account for average income, which has absolutely not kept up with inflation since then. We've had two financial recessions since then, and to nobody's surprise it's the lower middle class who've seen the worst recoveries from both. $650 was steep back then, but I knew many who shopped for GPUs in that range. Today I know practically nobody (who doesn't have a senior engineer salary like myself) who buys better than upper mid-range. But even if we take your inflation calculation at face value, your argument still doesn't hold up: the 8800 Ultra was the fastest model of its generation. Its $1250 in today's money pales next to the $2000 3090 Ti of last gen and the $1600 4090, and we still don't know whether the 4090 Ti is going to push above $2000 or not.
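The inflation adjustment being argued over above is just a consumer-price-index ratio. A minimal sketch of that arithmetic (the CPI values and the 8800 Ultra launch price below are illustrative assumptions for the example, not figures taken from the comment):

```python
def adjust_for_inflation(price_then: float, cpi_then: float, cpi_now: float) -> float:
    """Scale a historical price by the ratio of consumer price indices."""
    return price_then * (cpi_now / cpi_then)

# Illustrative inputs: the 8800 Ultra is assumed to have launched around $830 in
# 2007; CPI-U values of ~207 (2007) and ~305 (2023) are rough assumptions.
adjusted = adjust_for_inflation(830.0, 207.3, 304.7)
print(round(adjusted))  # about 1220 with these illustrative CPI values
```

With those inputs the result lands near the commenter's "$1250 in today's money" figure, which is the sense in which the calculator argument is being taken "at face value".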
@billy65bob (a year ago)
If they have truly given up, I can't even blame them. Whether they give a decent effort or do their best, what's the point if the public never rewards them when they're winning?
@thecooletompie (a year ago)
When did AMD win, though? Vega was a disaster, Navi 1 was a disaster, RDNA 2 was overpriced due to COVID and mining, and even if you ignore all that it lost to Nvidia in RT performance. RDNA 3 is disappointing. Where's the win? If you have to go back all the way to Polaris to find them winning in the low end, you know you're coping.
@jeremiahnatano4841 (a year ago)
@thecooletompie Winning isn't all about performance, man; it can also be about price, which AMD wins on in the majority of cases, especially towards the end of the COVID era.
@m_sedziwoj (a year ago)
Let me ask: when did AMD win 2-3 generations in a row, with a clean win? Never? So how do you think people could change their minds?
@thecooletompie (a year ago)
@jeremiahnatano4841 If you want margin, it's all about performance, though. But even ignoring that, the only win you will find is the 6000 series mid-range. Throughout much of its life the 5000 series had dysfunctional drivers, explaining the discount compared to Nvidia GPUs, and the Vega cards really struggled to match Nvidia as well (while they had their prices inflated due to mining). That leaves you, again, with only Polaris.
@bleack8701 (a year ago)
I'll tell you why I at least find it harder to buy an AMD GPU:
1. Over here they're more expensive on average compared to equivalently performing Nvidia GPUs.
2. They didn't even have much going on until RDNA 2, as far as I'm concerned. There needs to be some sort of consistency, so a good GPU here and there doesn't count. RDNA 2 and 3 are basically the first time they've been consistent in a while, in my opinion.
I've been eyeing up the 6800 XT, but pricing is so scuffed that I can't buy it. There's a 6900 XT and a 6800 XT available for the exact same price, and the 6700 is just 50 euro cheaper than those two. They either have to do something to fix the pricing or finally launch the mid-range GPUs for this generation.
@tiavor (a year ago)
Look at GN for the 600W statement; this was a wrong translation.
(a year ago)
Those graphs nicely show how good Polaris and first-gen Navi were. Well, I guess they concluded that they make more money making CPU chiplets than rather large GPUs. I guess I'll upgrade my aging Zen 1 chips with Zen 2 and 3 chips, and then keep my mix of Polaris and Navi 2 GPUs until they die. There is not much time for gaming; my large library of games will last me for many years, so there is no need for any further hardware upgrades.
@justbob8294 (a year ago)
It's like Radeon is aiming for the Mozilla Firefox business strategy: get so little market share that Google gives them money to stay in business, except they want Nvidia to do it. I would have bought the 7900 XTX if it was faster than the RTX 4090. Too bad it's more of a 4080 at best. It should have been named the 7800 non-XT.
@jurepecar9092 (a year ago)
Thank you for this video. I was not sure whether to get an AMD or Nvidia GPU for my next build, but this convinced me that AMD is the way to go. I couldn't care less about who has the best-performing card; I care who has the better price/performance product. That's what REALLY matters. Trust me, I work in HPC.
@oldtimergaming9514 (a year ago)
Same, price/performance. I can't justify tons more cash just for a small performance gain.
@Jimster481 (a year ago)
@oldtimergaming9514 The 7900 XTX definitely has that over Nvidia. I have one and it's a monster of a card. I think this whole video complaining about the 4090 is kind of pointless. From a business standpoint they know that the ultra-high-end buyers will buy Nvidia because they have the mindshare. People mostly bought the 3090 over the 6900 XT even though the 6900 XT was generally faster in everything that wasn't 4K, and most people weren't playing at 4K anyway. So this generation they offer a price drop on their highest-end card and a performance bump to go along with it. How is that a bad job from RTG? As a Radeon buyer, I am happy with this move. My friends with 4090s feel kind of dumb when we compare frame rates, because my machine is so economical and my FPS isn't very far behind. My entire gaming system cost around $1700 (including some recycled parts like my X370 motherboard), and a 4090 alone is easily over that $1700 price for an AIB card plus tax.
@word2RG (a year ago)
Finally. NVDA doesn't even WANT to sell you a GPU.
@tichaclin (a year ago)
I was really disappointed with the 7000 series launch. As we speak, if I have to spend $1000 on a GPU, I don't really mind adding an extra $700 to get a 4090.
@ARedditor397 (a year ago)
I 100% agree
@JensenHuangTensorEnthusiast
I'm about halfway through, and I'm assuming I'm mostly going to agree, but uh... AMD did not say that. See the Gamers Nexus video where they asked AMD: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-qlPSNOU2WEA.html Still bad, just less bad.
@omgponies111 (a year ago)
I think it all boils down to opportunity cost, especially when it comes to wafer availability. If AMD had the resources and wafers to do so, they would compete in the high end, but they decided to focus on the data center and the consoles instead. As for RTG not making much profit, it could be partly explained if the profit is made by other parts of AMD: if three divisions are involved in a product, it is not uncommon for one division to book the profit while the others get break-even.
@Messametti (a year ago)
I think this video comes one generation too early. This gen is, in my eyes, more or less a beta test for the chiplets, which AMD is playing safe. If the RX 8000 series doesn't have an ultra-high-end card, that will be the perfect time for this video.
@benjaminoechsli1941 (a year ago)
Agreed. As Jim said in a previous episode, "If they don't win with RDNA 3, they'll win with RDNA 4. I'd bet my career on it." The 5800X3D was a prototype of V-cache, with Zen 4 X3D offering more options and a refined tech. Years of "playing it safe" to avoid bankruptcy are not easily overcome, so it makes sense that AMD would avoid going all-in until the tech was ready. See you in a couple of years when the 8900 XTX beats the 5090 by 10, even 15%. If not? Well, let's hope Intel really is serious about disproving the rumors that Arc will be doing the bare minimum.
@adoredtv (a year ago)
@benjaminoechsli1941 They have all the technology they need, but unless they are willing to ramp up their coolers and power draw, Nvidia will just win with higher power.
@benjaminoechsli1941 (a year ago)
@adoredtv I don't know if you'll see this, but while we were discussing this video, a fellow enthusiast pointed out to me that the reason the profits are so low is that that "profit" is what's left after re-investing the majority of profits back into the division. If that's the case, then the division is getting funds it can use to make those better coolers, unlocking more of the potential of future Radeon generations. Is that right?
@adoredtv (a year ago)
@benjaminoechsli1941 That's just R&D, really. Building a "Founders Edition" cooler made financial sense to Nvidia because they saw a benefit in cutting out the middle-man. AMD doesn't have that benefit, ergo they haven't gone down that route. The thing is, it gives Nvidia another lever to pull. They can literally go as high as physics allows, but AMD is limiting itself to "sensible" TDPs that its partners can build cooling solutions for while still being profitable.
@benjaminoechsli1941 (a year ago)
@adoredtv You're right, I confused operating income with profit. What we're really seeing is that their operating income didn't change because they greatly increased their operational expenses. Look at how their headcount increased: 15,500 in 2021 to 25,000 in 2022 (some of which is due to the Xilinx merger, of course, but the company _is_ growing, not just coasting). You touch on an interesting point: the AIBs. AMD's partners are adapting their 3090 cooler designs for Team Red cards this gen, aren't they? That's saving both them and AMD money, and unlocks the potential already in the silicon. Sure, it's an example of AMD technically following Nvidia again, but this time it's exactly what you want: them going a bit nuts, pure performance.
@user-yc5fq9bv3u (a year ago)
08:48 The citation is wrong and is a double translation: English -> Japanese -> English (at least GN said so).
@rajivkishore (a year ago)
In India the price of the RTX 4080 and the 7900 XTX are the same, so I got the 4080. CPU-wise, AMD are killing it; I got the 7600X, but the mobo is too expensive. I got the ASRock X670E Pro RS for the Gen 5 GPU and SSD support.
@detroid89 (a year ago)
You could have got the B650, bro. They are starting to roll out more now.
@madb132 (a year ago)
@detroid89 B650 boards only support one PCIe 5 device. That's why they bought an X670 board.
@detroid89 (a year ago)
@madb132 Makes sense, mate 😁
@Altrop (a year ago)
Enjoy choking on that 16GB of VRAM. A UE5 developer said we're heading towards 24-32GB of VRAM use in just 2 years. Punked by Nvidia again.
@DantesDentist (a year ago)
Coreteks has been suggesting price fixing for ages, and I tend to agree. AMD have pulled an Intel/Nvidia: screw the customer for as much as possible.
@andersjjensen (a year ago)
Price fixing is when they sit down and agree. They don't even have to. Nvidia can read AMD's financial statements perfectly well and see that AMD isn't making money on graphics, so there needs to be no agreement. Nvidia figured out that AMD would be happy to make SOME money while Nvidia makes a shit-ton. And don't forget that they're a company. Companies don't "screw the customer for as much as possible"; they price at the optimal profit/volume point, because that's their job. If you want companies to be charities, you'll be disappointed for the rest of your life.
@Psychx_ (a year ago)
I wouldn't think AMD could double the number of CUs just like that. The command frontend has to run increasingly faster than the shaders in RDNA 3 the higher you push their clocks, just to keep them fed. Heck, in OC scenarios the frontend is touching 3.5GHz+ when the shaders haven't even reached 3.0. I'd argue that a completely redesigned frontend will be needed as CU counts increase further, and we'll probably see that in a future generation of RDNA. Doing that extra design, validation and driver work for a low-volume, halo RDNA 3 product wouldn't make any business sense and would probably result in a net loss of money; thus I can understand why they didn't approach it just now.
@mattparker7568 (a year ago)
They may need to do the redesign before the PS6 and the next Xbox come out. They will need to target 4K at 120fps or higher for TVs.
@Psychx_ (a year ago)
@mattparker7568 The current console generation is still rather early in its lifecycle; games haven't even taken full advantage of the hardware yet. Then there will probably be a backwards-compatible "Pro" series/refresh again. We won't see a true successor generation of home consoles before 2026-2028.
@GhOsThPk (a year ago)
Why should they, Adored? Let's be honest here. Take for example the RTX 3050 and RX 6600 XT situation: the 6600 XT is more than 50% faster than the RTX 3050 for less or about the same money. What do consumers buy, you ask? Yes, they buy the RTX 3050. The RX 6600 XT is faster than the RTX 3060 too; do you see the RX 6600 XT in the leading spot on the Steam Hardware Survey? No, it is the RTX 3060, which cost and still costs way more than the RX 6600 XT. Why should AMD heavily undercut NVIDIA, or even launch a card for, let's say, $1800 that beats the RTX 4090, if people won't buy it? This is also the consumers' fault at the end of the day. People want AMD to beat NVIDIA so they can buy NVIDIA cards cheaper. In the end, AMD decided it would be better to at least try to maximize margins with MCM if they can't reach market share, since consumers don't even consider their cards.
@veduci22 (a year ago)
It's because AMD is non-existent in the OEM space; just look at the miserable number of gaming prebuilts and notebooks with their GPUs...
@GhOsThPk (a year ago)
@veduci22 It's not only that, man, c'mon. Three weeks ago I tried to convince a friend of mine that the 6800 XT was a better deal than an RTX 3070. I explained it to him in the best way I possibly could, with benchmarks and so on. And even though the 6800 XT 16GB was CHEAPER, he still bought the RTX 3070 8GB (non-Ti). The 6800 XT is so much faster that you can use DLSS and even then the 6800 XT is still faster, and it has double the VRAM. It's IMPOSSIBLE for AMD to do anything with consumers like this. Literally impossible. Also take into consideration that not everyone has a friend like me who can personally help them find the better deal, and even then he chose the worse card, because NVIDIA. Imagine the average consumer... No way, man. I fully understand why they're price matching.
@Horus9339 (a year ago)
I would say that when Zen came out it was rather expensive and had some great new design ideas, but it was lacklustre and did not blast past competitors. I can see the same coming from their GPU path. Multi-chip GPUs are the way forward, and in two generational development cycles they could have a much more impressive outcome. Personally I expected AMD to go for lower prices (£800 max for the RX 7900 XTX) and keep the mid/high gaming segment happy to try AMD; sadly they fell in with the 'grab the cash' Nvidia scam. There is still time for AMD to turn this around, and that profit margin will grow if they win over a good 20-30% of the market. Only an opinion; let's face it, companies run on profit and, if they so wish, can tell consumers to do one if they do not like the price.
@wahidpawana424 (a year ago)
Maybe they should start working with software developers to create professional software, such as video editing tools, that makes full use of their cards' capabilities.
@badass6300 (a year ago)
Zen was cheap. The Ryzen 5 1600 was a $200 CPU that could run on an $80 motherboard, and it was as fast as the $600 i7-6800K that ran on $250+ motherboards.
@Horus9339 (a year ago)
@badass6300 I respectfully disagree. The only metric in which Ryzen beat Intel was power consumption; in the rest, the Intel system wiped the floor with Ryzen (bar price).
@badass6300 (a year ago)
@Horus9339 The Ryzen 5 1600X was faster in Cinebench, Handbrake, 3DMark Fire Strike and TrueCrypt, while being on par in x264 transcoding and 7-Zip, and 0-5% slower in games than the i7-6800K, which, mind you, had the mesh design instead of the ring-bus design, which resulted in lower gaming performance than the i7-7700K at the time. That is NOT wiping the floor in the slightest.
@TheDiner50 (a year ago)
@badass6300 Sounds about right. The i7-7700K was an improved i7-6700K. Zen 1 was really impressive and everything, just not at gaming and stability. Zen 1 was a clear winner over buying a new system just to get an i7-7700K or whatever expensive 6-8 core, but it was not worth dealing with, simple as that. If you were still on a 4790K or whatever 4-core, Zen 1 was still a hard pick, since on price and power it was not worth the hassle. I got a 6700K right around the time the 7700K and Zen 1 were a thing. Zen 1 was not really worth it for someone just wanting a good gaming CPU that was still good enough to run VMs etc. I mean, we know what a shit show USB and motherboards were on first-gen Ryzen. Intel had options that were maybe more expensive, but so what? Only now, after Zen 3 stopped being crazy out of stock and with AM5 about to launch, did I upgrade from that 6700K. The worst one can say about the 6700K is that it was still only 4 cores. Avoiding Intel's waste of sand and Zen 1 completely was nice. Zen 2 was really the Zen moment, let's be real, everyone. That is when Intel was outplayed for real. And then Zen 3 wiped the floor, if anyone still questioned whether Intel was done for or not XD. But yeah, for anything but gaming, Zen 1 was just held back by the problems of being so new and underfunded/undertested. Still, Intel was not far enough ahead of or behind Zen 1; you just picked your poison, and that was it. The extra I had to pay for the 5% improvement the 7700K promised was simply not worth it. DDR4 prices were really nice, and the only problem I had with my 6700K was that the M.2 SSD slot did not work. Boo hoo. I have not lost any sleep over the dead M.2 slot or over missing whatever OC headroom the 7700K could possibly have given me. And a Zen 1 system? I do not care; it never made more sense than just getting the 6700K. If you really needed the 6-8 cores, Intel had that too, for a premium. Zen 1 was good, but Intel was also still good.

Now I'm not sure if Intel or AMD is going to be my next CPU. Being honest, I really can't tell if either deserves any bias anymore. Both have made it clear that their names mean nothing without first being angry about their greed. Pick your poison again.
@-MaXuS- (a year ago)
First and foremost 😌 I always appreciate the content you make available for us mere plebeian content consumers to watch and listen to, so many thanks for that, m8. You truly have a talent for highlighting certain unique aspects and circumstances of a topic, issue or question, which often provides more context, nuance, insight and/or clarity from a different angle or perspective. This, in my opinion, almost always contributes genuine informative value, delivered in a way that's incredibly interesting and unique. I mean, from the first video of yours I happened upon, jeez, some 5 years ago I just realized, to the latest ones, I sincerely thank you for the work you do for us to enjoy. 🙏🙌 Last but not least, don't take the crap that all those dimwit fanboys give you when they imagine you're attacking the brand they have such a bizarre relationship with that they start whining, moaning and hurling loads of crap at you and about you. Fuck 'em 😎 You're the best! Peace 👌✌️🖖
@tst6735 (a year ago)
Damn good post. Greetings from Norway
@lordstevewilson1331 (a year ago)
8:27 That translation is wrong, as confirmed by AMD
@lemonapocalypse414 (a year ago)
So now Intel is our only hope? Oof.
@77wolfblade (a year ago)
oh ****
@lukerestlessstudios (a year ago)
I do agree that AMD's strategy is not quite there. They need to get very aggressive with performance targets and pricing to do to Nvidia what they did to Intel with Zen. However, I do want to give them some credit, as the better aftermarket 7900 XTX cards are actually pretty good, and they are now starting to show better raster performance than a 4080 in a lot of titles for at least $100 cheaper, if not more. You can sometimes grab one for less than $1,000 USD open-box at Micro Center.
@andersjjensen (a year ago)
AMD has been able to give Intel a beating because AMD uses TSMC and Intel has been having node problems. AMD and Nvidia will always have node parity (as long as Nvidia doesn't go back to Samsung), and for that reason they will have similar performance at similar die sizes, all else being equal. But AMD can't command Nvidia prices, even at performance parity, because of mindshare, which means AMD will have worse profit no matter what they do. The RDNA 3 architecture didn't pan out too well; it should have been closer to a 60% gain rather than 42%. We don't know if this is a side effect of the chiplet approach or of the base architecture, but in any case, until AMD fixes whatever it is, they will be at a disadvantage to Nvidia. If AMD manages to completely fix it, and perfectly leverage the idea of making only the compute part on the latest node and stitching together the rest with cheaper nodes, then they stand a realistic chance of going toe to toe with Nvidia while still making a tolerable margin. They're going to have to consistently beat Nvidia at the top end, and in price-to-performance down the entire stack, for AT LEAST two entire generations to break the Nvidia mindshare.
@CommandoTM (a year ago)
Regarding the other comments pointing out the apparent mistranslation as covered by Gamers Nexus: the overall spirit of what they said doesn't change. It is still just another excuse. Radeon does NOT have any will to compete. They are not incapable; they are unwilling. And it shows: for the past 3-5 gens they have always launched AFTER. They DON'T want to set the stage; they just WAIT for Nvidia and afterwards slot their cards into the pricing and performance gaps. In my opinion, the last time they really tried to compete was in the R9 290X and Fury generations. Even with the 6900 XT, it didn't feel like they were serious.
@davidgunther8428 (a year ago)
If you're arguing that Radeon has low profits, I don't think releasing the 7900 XTX for $200 less than it was would help that out.
@Dianaranda123 (a year ago)
I was actually seriously considering buying an AMD GPU this time around, but then I saw they priced their XTX card at 1000€ and immediately went, fuck that, I ain't gonna lay down 1000 eddies for a GPU, no way, no how. So AMD shot themselves in the foot with their dumb price-following of GreedVidia. Definitely a hugely missed chance. I don't know what I'll do now, probably buy a 4070 -_- and maintain my current workflow in Daz3D and Iray.
@OlaJustin 1 year ago
Edit: First, love that you're back! I hope life is going great! That's the thing, ain't it: the wafers. Why would they want to sell big GPU chips for cheap when they can sell tiny CPU chips for big $$$? The Radeon Group is stuck between a rock and a hard place: AMD's data center business and Nvidia. The time of cheap-ish, big-chip, pure gaming GPUs in PCs was over the second they got useful for things that actually make money.
@TheDiner50 1 year ago
BS. Consoles are made on TSMC, right? I can still get an Xbox Series X or PS5 for $430 now, after the shortages and all that * that was 2020-2022. Somehow AMD is able to sell consoles the CPU and GPU that make them up. Far less money, or rather margin, for AMD. Sure, a secure source of money. But for real, if consoles are still at 2020 prices after inflation and TSMC price hikes? AMD could also make a price-to-performance beast that makes the 4070 and 4080 12GB look downright criminal. And the same with the 3080 10GB and everything actual people buy. The 4090 and 3090 are just greed. Everything under that is made to force upgrades to nowhere.
@OlaJustin 1 year ago
@@TheDiner50 I don't think you got my meaning. I never said AMD can't make a nicely priced performance GPU for the PC, I just don't think they will, because it's economically stupid to do so as long as TSMC is on top of the world without competition. Console chips aren't on the latest nodes and have nothing in common with a top-of-the-line GPU or CPU.
@andersjjensen 1 year ago
@@TheDiner50 You just saw how little money AMD makes on selling console chips. And those volume numbers are negotiated years in advance. So it's not a risk for AMD to commit to them.
@skaltura 1 year ago
So the PC gaming segment / dGPUs is just marketing for them; we are the bottom-of-the-barrel guys demanding the most for the least, the annoying customer base. It's the businesses that make it profitable: both Sony & MS, and the HPC & datacenter sectors. So a lot of Radeon revenue is actually in the embedded and datacenter segments. It would be interesting to see a breakdown of how much of their revenue is GPUs and how much is CPUs.
@badass6300 1 year ago
CPUs are a lot more, but GPUs are growing with the MI200/300
@singular9 1 year ago
Imagine making money from your "marketing division", which is what GPUs are.
@badass6300 1 year ago
@@singular9 Console exclusives do that too. They are marketing for the console, and then they can pay for themselves directly.
@spacechannelfiver 1 year ago
Given the state of PC games derived from cross-platform releases, a high-end GPU is a waste of money; you'd be better served by a console. PC gaming is in rude health, but in the indie space, where it's competing with the Switch, and in legacy support. The 4090 is a very good deal as a professional workstation card, but the rest of the lineup below it is crippled by VRAM, and the Radeons can't even run general-purpose AI work due to the stranglehold CUDA has. CDNA makes sense for certain subsets of computation where you are buying 10,000 units and throwing them at an FP64 problem, modelling weather etc.
@sandelu635 1 year ago
Regarding the 4090 and AMD competing against it: you have to take into account that AD102 is used in the RTX 6000 Ada (and other similar products) at $7,000, if you can even find it at that low a price. If you can't afford that, you can use a 4090 in many cases. AMD does not have this dual use / dual market for a large die. I bet that in this generation close to 50% of all AD102 dies are not bought by gamers.
@Xysionite 1 year ago
I think AMD and Nvidia have completely different philosophies. AMD wants their graphics to be pervasive: they have APUs, they go into Teslas, console SoCs, and handhelds, and they have recently renewed their licensing agreement with Samsung. Nvidia has chosen AI as a target and wants to be the best at that while at the same time holding the PC graphics market. Make no mistake though, they are making less revenue in PC gaming as well.
@leucome 1 year ago
AMD is likely to get back on track with AI/machine learning. Things like Stable Diffusion run about as fast on the 7900 as they do on an Nvidia 4070/4080. That's a 300% to 400% performance improvement over the 6000-series GPUs.
@pvalpha 1 year ago
Thanks for this Jim. I'm more than a bit depressed by what I'm seeing. As for the current start of the AI "inflection point" (it's real, but GPT is starting at LOL levels): GPT is crypto now for AMD/Nvidia/etc. Where this will really take hold is in the related merging of various coding mindsets around GPT/GAN/Diffusion/etc. Large language models are just a start; the compute requirements are going to crater over the next few years (LOL for AMD/Nvidia if they think data crunching will be the cash cow). But it is interesting. Having a capable agent that can interact with you and act on your personal needs, advocating for your benefit? I think that's where the real endgame of this generation of learning systems ends up. People who trust corporate systems and their biased agents are going to get hard-burned by the rise. People who learn how this stuff works and build their own, or work on community projects to build agents for community consumption? That's where we will see the benefits. At least, I think that's where the evolution plateau is going to stick after the logarithmic utility increase we're going to see. I'm not sure it's going to work out for big businesses the way they hope; there are too many people with the skills to use what exists to build what's next in the wild. But this is me being a wishful thinker. I can definitely see the large tech companies biting early and finding out the AI they consumed was poison and not worth the cost.
@Cinnabuns2009 1 year ago
I wish AMD would plan a GPU that has 4-6x the RT performance and completely disregards raster performance, favoring RT for chip real estate. When AMD first mentioned they were moving to chiplets I was freakin' stoked, because I was thinking what everyone else was: dual GPUs would pwn and the performance would double. But that's not what they did. When I saw the single-GCD and 6x MCD design, I was actually really confused and sad. They've gotten a lot out of the architecture in this generation, though, and it should afford some advantages in future iterations; that's good to see. It's just not what I wanted to see.
@Lync512 1 year ago
Nvidia didn't screw over the GPU market. Nvidia has been doing its job exceptionally well as a corporation maximizing its profits. The only job of a company is to make money and maximize value for its shareholders. AMD screwed the market over by not providing proper competition, by failing to offer a superior and cheaper product to challenge Nvidia's dominance. Competition exists to keep companies in check naturally, and AMD has failed. Corporations aren't your friend. And Nvidia never pretended to be your friend.
@johnknightiii1351 1 year ago
I've been using Radeons and Ryzen chips for years trying to support the underdog, but I'm tired of the sandbagging. Going to jump ship again like I did when the Core 2 Duo was released. I'm waiting to see what Zen 5 and RDNA4 bring. I have a 5950X and a 6900 XT that I purchased at launch, and I'm tired of the games they seem to be playing.
@Mpdarkguy 1 year ago
As a Linux user I'm basically stuck with AMD, or maybe Intel if they have a high-end offering down the line. It's OK for my personal demands so far, but I hope I can still buy something decent in the future.
@77wolfblade 1 year ago
Yeah, Intel cards seem to run well with APIs like Vulkan.
@madb132 1 year ago
You have made the right choice as far as Linux goes. As the world is accepting and turning over to Linux, it can only get better.
@davidgunther8428 1 year ago
I'm curious what's going on with Navi 32, the expected 7800XT chip. I would hope to have heard something about it by now, but maybe they're trying to figure out what didn't go quite right with Navi 31, first.
@tomtomkowski7653 1 year ago
The 7900 XT is only 34% faster than the 6800 XT, so I expect nothing from the 7800 XT. What will it be? 6900 XT performance for the current 6900 XT price? RDNA3 is a flop.
@garwynrosser8907 1 year ago
It used to be in business that "the customer is always right". Now it's "the shareholders are always right".
@Spikeypup 1 year ago
Hiya Jim, it's been a while, I know, but as usual I hope this finds you well. I'm watching this and about halfway through, but what's gnawing at me, my good friend, is this: let's suppose AMD can always get close to Nvidia, but with way better power specs and efficiency... and a better price. Is it that important that they "win" against NV's top card? Is it for bragging rights, or is there something bigger at play that I'm missing? I'm just curious what your take is; I hope you don't think I'm trying to be abrasive, I'm truly curious. My feeling is that if they get close to NV but at a great price and efficiency, I'm still a happy camper. :) What say you, my good man?
@sinephase 1 year ago
Yes, marketing is huge and gets their name into people's minds. They should be more explicit about being in Xbox and PlayStation as well, if you ask me. I bet most people don't even know about it.
@anthonym9626 1 year ago
Beating Nvidia's top dog would grant them a lot of mindshare in the consumer market, as long as it had a decent cooler and stuff.
@elon6131 1 year ago
The issue isn't just the lack of a top-spec card anymore. They just keep getting further away in every segment, thanks to Nvidia's software advantage. That's not even mentioning that Nvidia currently has a massive production cost advantage on the 4080 vs Navi 31 (Navi 31 is something like 50% more expensive to make, or something stupid like that, despite the much more expensive node Nvidia uses). Nvidia's good at designing GPUs too.
@adoredtv 1 year ago
Yep I get you bud, and I thought long about this while making the video. Everybody is different and for me, the minute you give up or stop competing is my limit. I feel embarrassed that I'd put so much into RTG for them to surrender so meekly. I'll talk about it more in a later video. ;)
@lamikal2515 1 year ago
@@adoredtv That's the thing I do not understand, Jim. The CPU division seems (emphasis on *seems*) to fight as hard as it can for the desktop and laptop markets. It's almost like AMD is suffering from schizophrenia. Or maybe RTG just plain doesn't care about desktop graphics anymore and is focusing exclusively on HPC.
@abesmissioncontrol2013 1 year ago
23:24 I never purchased AMD hardware because of some weird duty to "save the company." That's just bizarre. Radeon has consistently offered better value at my shopping price-point for my last three gpu purchases (R7 260X, RX 470, RX 5700XT). That's why I purchased Radeon gpus.
@adoredtv 1 year ago
There was a real danger of AMD going bust at one time, and I'm sure a lot of AMD fans bought products simply because of that. Talking about real shite like Bulldozer and Vega.
@rightwingsafetysquad9872 1 year ago
It's frustrating to see AMD not attempt to compete at the high end. But consider that AMD put considerable engineering effort into chipletizing the GPU. Next year they could conceivably double the 7900 XTX with almost no effort. If they don't beat the 7900 XTX by at least 50% in the next 12-18 months, we should give up on them. If AMD is smart, they'll recognize that their hold on consoles isn't as strong as it was 3 years ago. Nvidia has very compelling SoCs. Qualcomm has the IP to produce such an SoC, though they don't currently have one. Intel almost has it. A dying company like Imagination might just bet the farm and win a console design bid.
@mmarcinigielski8374 1 year ago
300 mm² is the GPU size, but it already pulls 400+ W when uncapped.
@Nelthalin 1 year ago
I do agree with what you say, but if we have to give up on AMD, and the future of Intel is so uncertain and their current products aren't usable without accepting a lot of issues, what is left? I'm for sure NOT going to buy an Nvidia card; I banned them after GPP, that was the final straw for me. With prices going up I don't like the PC hobby as much as I used to. Maybe I should just stick to buying second-hand hardware 2 years down the road to keep things affordable, and hopefully the AMD cards will be good enough to run the games I want to play. So far I notice I'm sticking to older titles anyway. But more than 50% of the PC hobby was the hardware, the new architectures, tweaking the stuff. I feel that is slowly dying out. I really don't get why AMD is not trying, at least in the lower-end segment. Don't they have enough wafers to do that, or what is up with that? The 7600 series seems to perform well in the laptop space, so if they can make a lot of those and price them well, they can have their margins and gain some market share. Most people don't care about $400+ GPUs anyway; they are too expensive. Polaris did really well; the refreshes were too mediocre, and it needed a real follow-up that never came. I understand that these high-end N31 GPUs are not that interesting for them, but selling a product like N33 in volume might be? They are way cheaper to design and manufacture, don't need stupidly big coolers to keep cool, and address a way bigger market. I still hope they do something like that, but like you said, they just don't seem that willing.
@deilusi 1 year ago
To be honest, RDNA3 had some teething issues, so I'm not surprised they picked a less aggressive path. Imagine if they had invested in that big die, only to see RDNA3 have a hardware clog they couldn't solve. AMD doesn't have the budget for that. They might have sat this one generation out to strike harder on the next one, when the drivers and architecture get more stable. I really hope they do more next gen (or in an update to this one). About price and marketing: Intel laid off all the lying bastards, and I feel like some of them have attached themselves to AMD now. IMHO it's a yellow card for Radeon, but it's not as bad as the video makes it out to be. It might be a mistake they never make again. It's a shame they did it, but not all hope is lost yet, until they repeat it.
@gamingtemplar9893 1 year ago
For the last 3 months, after watching your "Does AMD want to win?" video and Coreteks' video about possible price fixing (which I personally think is not exactly the case, but close), I've repeated this theory that AMD does not want to compete with Nvidia, as a matter of FACT. My theory, which I've already explained, is more of an economic/organizational theory: corporations work in a way that avoids any possible competition. Usually this is done with the help of government lawmaking, but also, as in this case, corporations can make a deal: "Jensen, I buy ATI, we play competition, then I make the Radeon brand trash and leave the market to you." "What do you get in return, AMD?" "Jensen, I get the console market; IBM will no longer produce the console CPUs, and you leave the GPUs to me. You win the PC GPU market, I get the consoles." This is what happened. Corporations work this way because this is the BEST outcome for both AMD and Nvidia. If you think either of them could just eat the other, why not do it? The reason is that it is really, really hard to do. Look at Intel and AMD in the CPU market now: the risks of competing are very high. AMD almost disappeared; now Intel is in trouble. Why? Because they are actually competing. What Intel could do is move to another market where there is no AMD or Nvidia to fight, and if that happens... you will have 3 de facto monopolies. This is a way for these corporations to be actual monopolies while looking as if they are competing. This is possible in this highly, extremely regulated market. Also, AMD's sales are tied to government, not sure if just the American government or others too; I read that data somewhere. We don't have a free world; this is the fundamental issue. Intellectual property, for example, sounds nice and needed, but I have bad news: corporations have existed since these laws exist, since patents were created. You might find them necessary, but they make things worse. Anyway, unpopular opinion.
@badass6300 1 year ago
Intellectual property, licenses and such are holding humanity back by decades.
@Hybred 1 year ago
I don't believe AMD will be completely uncompetitive forever. I think as Intel gets better and better, they will steal AMD's market share and AMD will start competing again; thus Nvidia will too, a bit.
@TheHighborn 1 year ago
8:45 That's a mistranslation, AFAIK. They said it would have been 600W (a 4090 competitor). EDIT: It might be me who's wrong. EDIT 2: 16:00 I think AMD chose this strategy because even if they sell GPUs with similar performance for significantly cheaper, people still won't consider them. I have sysadmins and user-support people in my company who just fucking hate AMD for their past failures, and because of their views it's company policy not to buy AMD stuff, nor do they ever recommend AMD to family and friends. We're strategic partners with Lenovo, and all the current computers are Intel 11th-gen CPUs. Total clown world. Because of driver issues 10 years ago, people in IT are still bitter and won't even touch AMD. And that's the mindset of the average consumer: that AMD is the bad, cheap, budget option, therefore they won't even consider it. If you want to know how bad it is, watch a few content creators on YouTube. They call ray tracing "RTX". That's really all you need to know.
@badass6300 1 year ago
Yes, it would be 600W, because they are using 5nm and 6nm, and the connection to the cache modules consumes a lot of power.
@ARedditor397 1 year ago
I can confirm this from another company: they don't like AMD GPUs either, and the employees prefer Nvidia because of the driver issues.
@badass6300 1 year ago
@@ARedditor397 Nvidia has had more driver issues than AMD in the past few years; there are even videos from people who did the research and showed this. Nvidia is usually preferred professionally due to CUDA.
@ARedditor397 1 year ago
@@badass6300 cope
@badass6300 1 year ago
@@ARedditor397 You can call facts cope as much as you want. But you can also Google it, or search on YouTube to see for yourself: Nvidia had about 5% more driver problems than AMD over the last couple of years.
@XYang2023 1 year ago
I just purchased a 4090 last week, and I also put a 4090 video on my channel. The 4090 is clearly more than gaming. With AMD's severely lacking support on the software side, even if there were a 4090 competitor from AMD at some pricing discount, no one in their right mind would purchase such a product. For example, for machine learning, almost everyone working in the field chooses Nvidia's cards. I learned this the hard way with the Vega FE. Back then, they promised a lot with software support for TensorFlow, but what I found was that it was just not usable. In the end, I purchased a 1080 Ti instead.
@WayStedYou 1 year ago
This is why I keep telling people AMD isn't going to go hard on graphics cards; they just make way more money by making chiplets to sell everywhere.
@NoobGamingXXX 1 year ago
They would need better cooling and better materials to handle any significant spike in power. That doesn't come with a snap of the fingers; more R&D = more money, and a lot of it, if they can do it at all. It was always like this: Nvidia releases something, then AMD releases months or a year later and says "look, we have nearly the same performance as the second-best Nvidia card." What they don't want to say is "we are a whole cycle behind and we can't even field the fastest card in the market." Those VP talks are wishful thinking, begging fans to believe "yeah, we could do it, but our super amazing card, which doesn't exist, would cost a lot, and we are a poor man's video card company." The day they lose the console market, or even one of Sony or Microsoft, the GPU department is going to close.
@imglidinhere 1 year ago
So you think AMD is giving up because they *aren't* launching a $1600 GPU?
@onomatopoeia162003 1 year ago
That would be for their actual work station cards.
@Stars11222 1 year ago
Thank you for saying what I started seeing when Vega launched. It always felt weird how they were hyped up but would always aim for the "cool/good guy" spot right under Nvidia. At the end of the day it reminds me of how a female wolf will act scared and hide under the alpha during a fight, but in reality is just protecting her neck from attacks. These are both companies; they will only go for profit. They are not friends or family, they are an entity, a corporate face, and that's all they will ever be. Unless something forces them to aim higher and truly innovate and push the limits, we will see the same thing we have seen since the RX 480 series: things working just well enough to keep people interested. I genuinely hope the chiplet design will age like Ryzen did, but even if it does, you said it right: they will stay just under and never go over, or in other words, not bite the hand that feeds them. Overall I will still buy AMD because of the partner brand Sapphire. I would have gone with Nvidia, but their practices disgusted EVGA so much they left, so I see no reason yet to go with them. At the end of the day, all I want to see is path tracing take over and for us to properly have good rays; I just have no clue when these brands will be pushed to go for the kill on that one. Thanks for putting into words what needed to be said. One thing I would like you to do, and continue on, for both Nvidia and AMD, is breakdowns of architecture. The way you present the inner workings of GPUs and CPUs is fantastic and provides great insight into what is normally a box of black magic for many people, including myself. Keep up the good work Jim; who knows what we may find technology-wise that has yet to surprise us. There is always something right around the corner!
@DeltaSierra426 1 year ago
Way, way off, my man. This is exactly the problem that AMD struggles with: being nowhere near the market leader, they 1) are probably going to follow the leader's pricing (and as you mentioned, PC gaming isn't generating them much profit, if any at all, so Nvidia being greedy has actually helped them turn a profit this time around [we'll have to wait for the 2023 H1 financials to see how much exactly], ESPECIALLY since Nvidia came out several months before RTX 4000 saying to expect price increases); 2) don't know how outlandish Nvidia will go on their "top-end" gaming GPUs, and the 4090 is in old Titan territory, i.e. a niche market; 3) AMD has a reputation (deserved or not, it doesn't matter) for hot, loud, and inefficient cards, and we can see how they've really tried to change that narrative as a claimed performance-per-watt leader, etc. I still think a 600W TGP/TBP GPU is absolute lunacy, even if real-world usage is much lower. Again, with Nvidia the firmly established market leader, they set the standard on what is "normal." 4) Why does it matter "if" AMD could have made a better GPU than the 4090? Most of the gaming world slots in well below that. The value perspective is great on Radeons, and they're wise enough to know that's the position they have to hold, being 2nd in the market. Beating Nvidia in both rasterization and ray tracing is an expensive endeavor that would probably only result in lower margins or even going into the red. It's a fool's game. Still, if they can chip into market share each generation, and I'd argue they did with the 7900s, that's about all they can do. I appreciate your hyper-deep analysis and speculation, but it's still only from the lens of one person, and someone that might not appreciate all the complexities of the business aspects.
We forget that these are corporations with a fine focus on making money, not necessarily on being "the best" or doing everything that a certain class of customer wants (gamer, general PC enthusiast, mainstream, and so on). You mentioned the video being sad/disheartening. It's a pretty classic case of expectations vs. perceptions, IMO. Not that you are entirely wrong, but you leaned on a lot of circumstantial evidence and predicates.
@FelipeFritschF2 1 year ago
It'd certainly be very ironic if Intel's next Arc GPU generation manages to be competitive with Nvidia, and that gets them and AMD to drop prices, just like Ryzen "magically" got Intel to halve their CPU prices overnight.
@jerrywatson1958 1 year ago
Thanks for this, I will buy last gen gpus for this year. The prices are much better and they perform just fine for my gaming setup.
@gamingtemplar9893 1 year ago
Awesome video, I have the same feelings since the last time AMD presented the cards.
@abowden556 1 year ago
Maybe Intel will save us. LMAO. Even when their margins blow Nvidia's away, they STILL refuse to compete for market share.
@brainthesizeofplanet 1 year ago
Well, actually you have to attribute some of the APU money to the Radeon department, as APUs use the graphics cores.
@Premosilva 1 year ago
Thank you for a great analysis.
@Speak_Out_and_Remove_All_Doubt
With the caveat of "for Radeon", I believe the 7900 cards have sold pretty well. And if you are right that the original plan was to sell them at around $699 and $549, then surely Radeon profits will get a much-needed boost this year, possibly giving the Radeon department the budget/justification to do something they maybe wouldn't have done if it weren't profitable anymore?
@m_sedziwoj 1 year ago
Do you know how much money Amazon made for years? Zero. Why? Because to grow a market you must put in effort and not take the easy money. If AMD doesn't have a clear win 2-3 generations in a row, they'll never change their market share.
@PaulSpades 1 year ago
I don't think so. TSMC seems to have milked most of the profits from everything past their 7nm tech, and they need all of it to stay ahead over the next few nodes. There are hundreds of billions of dollars invested by Intel, Samsung and TSMC in this mad dash to come up with tech for shrinking these damn transistors. It's going to get even worse if they don't get 3D gates working, and both SRAM and DRAM memory are not getting significantly cheaper.
@m_sedziwoj 1 year ago
@@PaulSpades Have you looked at Nvidia's and AMD's financial reports (which are audited, and any falsification would end in court)? They have 50-60% margins, yeah...
@saad_ghannam 1 year ago
If I had a nickel for every time someone suggested AMD should shut Radeon down, I would have enough money to buy RTG
@svettis 1 year ago
I understand what you're saying, but it feels like you're contradicting yourself. Revenue is up, profits are stagnant. That means their margins are down, which means they are pricing their latest products with lower margins, so it seems they are trying to compete on price. Selling the 7900 XTX at $699 would have generated a loss, it seems. Another huge problem for RTG/AMD is that the mindset of GPU consumers is to always buy Nvidia no matter what. Look at the 1650 cards: they sold in droves even though there were AMD alternatives with better price and performance. It was the same in the CPU space, but AMD got lucky with Intel not being able to execute for a decade, forcing consumers to re-evaluate. Nvidia keeps executing, so the mindset won't change. I honestly don't think a $799 7900 XTX would have changed sales all that much; it would only have hurt AMD's bottom line.
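The "revenue up, profits stagnant" point above is just margin arithmetic, and it can be sketched with hypothetical round numbers (these are illustrative figures, not AMD's actual financials): if absolute profit stays flat while revenue grows, the margin must have fallen.

```python
# Hypothetical figures: flat profit on rising revenue implies a lower margin.
rev_before, profit_before = 1_000, 200   # 20% margin
rev_after,  profit_after  = 1_600, 200   # same profit on more revenue

margin_before = profit_before / rev_before
margin_after = profit_after / rev_after
print(f"margin went from {margin_before:.1%} to {margin_after:.1%}")
```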