He's being reasonable imo. AMD aiming for the low end keeps allowing Nvidia to pull further ahead; it doesn't incentivize the industry to switch to their stuff or improve it, so it just keeps getting less and less appealing even if it is cheap. And their product stack is awkward: every other card is either "save up for Nvidia" or "save money and get Intel's ray tracing and encoding."
I'm so tired of Nvidia dumbing down GPUs. IMO, the 1080 Ti was the peak. The 30 series was OK, but damn, they are doing us dirty with the 40 series: way overpriced, and they rely heavily on frame-generation software to make up for the hardware shortcomings.
RTX 30 and 40 were both quite sizable update gens; it's just that the damn chip shortage made them realize they can ask an even higher premium because people will still buy. Good improvements, just not worth the money they used to be.
Don't you think the RTX 4090 has the biggest generational leap in terms of efficiency and raw performance? It is pricey, but it's obvious that if you are the only one producing a GPU that powerful, you will sell it at a premium.
AMD doesn't need to compete with 90-class cards. If they are similar to the 7000 series but drop prices a bit across the whole stack, they'll be set. For example: if the top card is $899 instead of $999 and the second one is $749 instead of $899, they will eat Nvidia's market share, assuming everything is priced like the 40xx series. If you can get 5080 performance for $899 instead of $1,199, I think more people might be compelled to try AMD.
And hopefully there will be efficiency in the high end similar to the RX 6000 series or even better, because last time I checked, the 7900 XT draws way more power than the 6900 XT.
@@gamingcomputers7485 Power draw is virtually the same. My 6800 XT Nitro used 330W OC+UV, my Halo 6900 XT OC+UV used 350W, while my XFX 7900 XT OC+UV uses 375-400W. But stock, the 7900 XT uses 310-340W.
@@BigAndTattooed This time around I believe they will price their GPUs better to gain market share and recognition. They need that; they have no choice, because Intel is coming up behind them.
Yeah man, I'm rocking a Fury X right now on Bazzite Linux, and it's still killing it. I can get 240Hz in all my favorite games with some settings tweaks. I'll eventually move on to my Vega 64 once I get it fixed, but my Fury X can run games easily on Linux.
Yeah, but CrossFire isn't around anymore, so buying two cards for the performance of one isn't feasible. Or are you just saying you were happy with your purchase?
It's amusing how many comments make it seem like they're being forced to buy expensive NVIDIA cards when there are more affordable AMD options available.
Focusing on the mid-tier GPU market is a smart move for AMD. A similar strategy is how they snuck up on Intel: they developed the APU and Ryzen from the ashes to corner the console and handheld markets, and now they've got Intel on the ropes in the PC and server markets.
The issue with AMD is that they were trying to compete on the high end in the same price range, when what they could have done is undercut Nvidia much harder. The 7900 XTX right now is around $900, but what if it had launched at that price, or even less at $850? That would make a card like the 4070 Ti look a bit weaker in comparison and fix up the market a bit.
If they could just get a well-optimized design that doesn't cost too much to make and price it aggressively, they could take over market share in the low end, where most graphics cards get bought anyway. It just sucks that Nvidia usually whacks them with a stick right after they discount, negating their discount, and since AMD doesn't offer THAT much more, Nvidia it is. The age-old saying of "I wish AMD would drop prices, so Nvidia drops prices, so I can buy Nvidia" still lives.
That's assuming they even could, as in without incurring a loss. We don't know what margins they have, and at a minimum they would have worse economies of scale compared to Nvidia.
It doesn't work because people don't buy it. AMD would have to offer at least a 20% discount compared to the same-performance Nvidia card for people to buy it.
@@ghoulbuster1 Yeah, that would have made it worth it for a lot of people. If the 7900 XTX had launched at $799, it would have been a huge seller, especially with the lower cards priced accordingly. I still got a 7900 XTX because I just wanted to give AMD a try, and the GPU is a beast; I easily max out games at high refresh rates, but I barely ever touch FSR. It's just not good enough to compete with DLSS, and ray tracing just isn't worth it on AMD. Great card, but AMD really could have pulled in way more market share if they had priced it better.
No, and I have to assume he knows that, so I'm expecting better prices soon. If he can sell a product that is about as fast as a 4070 for, say, $350, that might be enough to trigger a lot of 1080 Ti, 1660, and 2060 owners to consider an upgrade. Of course, Nvidia switching the 4070 to plain GDDR6 may also help AMD get the market share they crave.
Lol, that's not how business works. Products have a certain cost associated with them. They have to make a certain profit on each unit to fund the next generation and to run the business. AMD doesn't have Intel/Nvidia amounts of money. Not even close.
I agree, AMD needs to price their cards significantly lower than their Nvidia counterparts to give users a good enough incentive to side with them. Simply being about 10% cheaper than the counterpart is not a good enough reason to give up the advantages of having an Nvidia card (DLSS is still better than FSR, ray tracing is better, path tracing, AI, power efficiency). I think about 40% cheaper than the counterpart would make me switch.
With clock speeds 25% higher than the 7800 XT, faster VRAM, a whole bunch more ray tracing units, a couple of tensor AI cores, and 4 more CUs, it's going to be very interesting to see what the 8800 will be capable of. If the rumors are true, that hunk of silicon should benchmark somewhere between a 4070 Ti and a 4080 in pure rasterization, go blow for blow with Nvidia in ray tracing, and have 16GB of VRAM with an MSRP possibly as low as $499 USD. Totally not holding my breath, though.
Am I the only one wanting to hear the deep BELOWWWW on a regular basis lol? It was always your thing, then you moved away from it, now you give little bits here and there in some vids. Dude, been following you since the beginning, and I have to be honest... I miss it, and need more. Outside of that, keep doing your thing! I always enjoy the content, I hope it's starting to pay off for you $$$ ;) Anyways, I appreciate you!
It feels like AMD is all in on APUs. I think they are aiming to make consumer graphics cards obsolete and eat Nvidia up by offering laptops and handhelds with better battery life and performance at a cheaper price than the Intel-plus-Nvidia combo that's the current standard.
Imagine, if you will, a chip with 7800X3D CPU performance coupled with an APU/GPU with 7900 XT performance. How appealing would that be to consumers!! I'd buy that in a heartbeat!!! That's what I'm running now!
For me, it's simple: if Nvidia isn't going to be upfront on the packaging about which 4070 has GDDR6 or GDDR6X memory, then I won't suggest 4070s when shopping for myself or for friends/family. I'm not going to do the extra digging to figure out which is the lesser one when either the pricing or the packaging should reflect it. Buy a 4070 Super/Ti or just buy a 7900 GRE/XT and skip the regular 4070s; let others deal with the shopping headache.
I think UDNA is a great move, back to basics. Since their portfolio has grown, maybe they can implement some new tech that will help them get over the hump.
Preventing consumers from making an informed decision by not telling them which version of a product is in the box will end badly. I remember back with the Pentium D, Intel had too many Pentium D Extreme chips, so they actually just started binning them as regular Pentium Ds. So you would literally buy 10 Pentium Ds, hope to get lucky and find the mythical Pentium D Extreme, and send the rest back. That was like the golden age of overclocking value for me.
The thing is, it's not the DIY market that AMD is or should be aiming for. They should get their cards into prebuilts, into game shows, into conferences. They need contracts; that's where people just "get" an AMD GPU as part of their PCs. That's why they wanted to push into mobile. They need more AMD Advantage presence, and more presence in handhelds helps. But really, they need contracts to be in cyber cafés and prebuilt PCs. Having had those discussions with gamers: they don't care what's in their PCs. They want to know they can play with their friends or online. And those rooms are PACKED.
To be fair, the difference between GDDR6 and GDDR6X is inconsequential. We're talking about a percent or two at most; no one would be able to tell any noticeable difference in their games. But still, the way Nvidia handled it is scummy. The difference in performance is so tiny I don't know why they weren't honest about it.
There's something really messed up with the GPU market. Not wanting to go for the 10% who can only afford Porsches and Ferraris is idiotic... the only reason it's the top 10% is the stupidly high price. If you made a GPU that's faster than the x090 at the price of an x080, you can bet your ass you'd be at 50% market share. Doing the same thing as the competition basically gets you nowhere. They look at Nvidia and say, OK, if they can sell their mid-tier GPU at $500, we should sell ours at $490... even though it could easily be sold for less. But nope, they want 50% market share at the same price... Lisa should really give them a kick in the balls...
Intel should have learned that no matter how big you get, you CAN and WILL fall if you get overconfident just because you're a few steps ahead of your competitor!
Look at what happened with the RX 480; they tried this already. History is going to repeat itself, and they are going to fail down the line again: they'll get greedy after a small win with some budget cards and then start losing the battle again. Plus, the RX 480 wasn't even a clear win over the 1060 when they had the chance to compete against Nvidia. Now that the technological differences are more apparent with Nvidia's AI abilities, AMD has no chance to compete even in the budget range.
Nvidia can build high-end GPUs for the posh boys; AMD building great midrange GPUs sounds like the best idea. More people can afford that, especially with prices consistently going up.
UDNA is happening because gaming GPUs (RDNA) actually work better than professional GPUs (CDNA) for AI, and they can't have consumers holding more potent AI cards than professionals; nor can they compromise the professional cards by using RDNA, since those still need the pro feature set. So the unification is for sure the right move! Though I'd have kept it under wraps until launch, as I suspect business purchasers will now feel pushed to Nvidia for this generation 😬
AMD going for the mid-range is exactly what they need to do; it's the right move. Nvidia's top tier is going insane in pricing, wattage, and possibly melting cables.
@@del669 They had the right idea with RDNA 3: dunk on Nvidia for insane prices and power. But they didn't deliver on a killer segmentation (regardless of the XTX perf snafu/fuck-up). Had RDNA 3 been sensibly priced at the top end, with the RX 7900 XTX matching the RX 6900 XT's price and ALL CARDS priced similarly to RDNA 2 (and named properly...), they could have had a banger line-up. And skipping the 800 and 900 series for RDNA 4 would only remove GPUs above 600€ and be a clear signal for the prebuilt and game café markets.
AMD should have taken the mid market from NV a long time ago, but they were obsessed with selling mid-range GPUs as competition for NV's Titan cards. NV's artificial market segmentation has left an entire market looking for a unified architecture. So AMD is announcing what it should have done 5 years ago, when the timing was perfect and NV was fleecing everyone.
It's the ratio; a GPU isn't as important to a person as a car. A person can spend tons on their home, but spending tons on first-class air tickets (which obviously cost less than a house) is still a luxury.
Nvidia is just getting bolder and bolder with this BS. Honestly, there needs to be some kind of government crackdown. It's just getting old at this point. They are one of the most unethical companies I have ever seen, and I follow them pretty closely.
I went AMD with the 7900 XTX. Never looking back. Nvidia is an AI company now; AMD is the consumer-friendly option that got their shitty drivers figured out. Nvidia can't compete with MCM. Watch the money, guys; the money talks.
If they dropped the AI bull, AMD could focus on both segments: create a lower-latency Infinity Fabric and add more cores to take down the RTX 4090. But it won't happen; I mean, look at the 9000 series CPUs. You have to update Windows, configure it, and then probably mess with BIOS settings to get something significantly faster than the 7800X3D for gaming.
This is horrible news 😢. I have a 6900 XT, and I am now running into issues where I can't run some of these new games at 4K high settings, which means it may be time to upgrade. I really hope my only option isn't to go with Nvidia 😪. Their prices are overinflated, and their decisions around RAM definitely make me not want to support them. The 7900 XTX is not a big enough upgrade to warrant the investment 😪. I was really banking on the 8000 series. Maybe I'm overreacting; I guess I'll have to make that decision when it actually releases. I'm still gonna hold onto hope 😢
Please tell me what issues you are having. I have a 7950X, 32GB of RAM, and a 6900 XT, and I have no problem playing at 4K high settings. My guess is that you're on Windows; I use Linux Mint.
@@CoryMillard Yeah, you probably have fewer background processes and such. My thing is Unreal Engine 5 games; it runs everything else flawlessly. But you can forget about ray tracing 😅
I was actually wondering when AMD would unify the two graphics architectures again, considering that CDNA has GPU chiplets working perfectly and RDNA doesn't. Ryzen benefits from the fact that it uses the same chiplets as Epyc. Let's not fool ourselves: the bulk of R&D money is going into datacenter, and client is benefiting from sharing the same architecture. Right now that's not happening in the GPU space, so it makes sense that they would unify it again. It might be less R&D to get CDNA chiplets working for gaming than to get RDNA chiplets working. Who knows?
As long as AMD's top-tier GPU delivers 4080-ish or next-gen 5080-ish performance for a competitive price, I couldn't care less whether they have a higher-class model to compete with Nvidia. It just does not make any sense to buy those kinds of GPUs (4090). It's stupid on every level, under any circumstance, for a private consumer.
AMD going for the entry- and mid-level cards is a pretty good decision in my opinion. If they can bring at least a 4070 (Ti)-equivalent card for sub-$400, it'd already be a massive win for gamers, and a decent 1080p card for less than $200 even more so. They can target the 80 and maybe 90 tiers once they're comfortable doing so within a reasonable timeframe, and those would need to be at a decent price point as well, since many people simply do not have a spare $1,000 lying around.
That being said, it'd be a massive win for gamers in general if we could go back to reasonable GPU pricing. I still remember getting my GTX 670 for 350€ and my GTX 980 for 420€ back then; an 80-class card for less than 600-700€ would be a dream.
Regarding AMD and the high end for the 8000 series, I find the strategy sensible. We are at a level where GPU performance is plentiful, but game optimization has had gaps for some years now, and technologies are still being refined. A third negative point: instead of optimizing, the tendency of some manufacturers is to open up the wattage to gain performance naturally and mechanically (the easy way), while the user bears indirect costs in a beefier cooling system and higher electricity bills. Taking a pause at one level, as AMD is doing, to optimize what already exists (yields, technologies like AFMF 2) seems an interesting strategic solution. The same goes for video game publishers, who could let their developers do likewise and stop releasing mediocre, unfinished games. The hardware exists; a period of consolidation would be a good omen. My humble opinion; the future will tell!
The unified DNA might not be flip-flopping but, on the contrary, a step forward: developing an architecture that can compete with CUDA while running on consumer PC GPUs.
AMD won't gain that much market share in PC dGPUs if they don't go for the high end too. I think there are too many Nvidia fans, or casual buyers who just check what the fastest card out there is and then buy anything with the same brand sticker. Game devs also logically optimize for the most-used hardware. It is hard for AMD to gain 2x or more market share over their current 14 percent.
Thanks for changing the thumbnail to show the 8900 XTX intact instead of ripped apart, signaling that it was going to be discontinued. Great way to make sure you get that extra view!
It's easy for CPUs to use multiple cores because most of the threads have nothing to do with each other; they don't have to be synchronized perfectly. One core handles the download, one core handles Excel, one core handles the browser, one core handles the video that's playing, etc. As I understand it, it's much harder to write game code that utilizes multiple cores (even on a CPU), because there's always that one thread that has to track the world as a whole. All the individual events in the game have to be synchronized perfectly with each other, and the more individual threads there are, the harder it is to keep them all in sync. Note that I said harder, not that it can't be done.
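A minimal C++ sketch of the distinction drawn above (all names and numbers are hypothetical, purely for illustration): the "independent" threads never touch each other's data, while every "gameplay" thread has to serialize on one shared world state, which is exactly where the scaling difficulty comes from.

```cpp
#include <functional>
#include <iostream>
#include <mutex>
#include <thread>
#include <vector>

// Hypothetical shared "world" that every gameplay thread must touch.
struct World {
    long long tick = 0;
    std::mutex mtx;  // the single point of synchronization
};

// Independent work (download, spreadsheet, video): no shared state, no locks,
// so adding cores scales almost perfectly.
void independentTask(int id) {
    long long sum = 0;
    for (int i = 0; i < 1'000'000; ++i) sum += i;  // just burns CPU
    std::cout << "task " << id << " done\n";
}

// Game-style work: every event must stay consistent with the shared world,
// so all threads queue up on the same mutex.
void gameplayTask(World& world) {
    for (int i = 0; i < 1000; ++i) {
        std::lock_guard<std::mutex> lock(world.mtx);  // contention point
        ++world.tick;
    }
}

int main() {
    std::vector<std::thread> threads;
    for (int id = 0; id < 4; ++id) threads.emplace_back(independentTask, id);

    World world;
    for (int i = 0; i < 4; ++i) threads.emplace_back(gameplayTask, std::ref(world));

    for (auto& t : threads) t.join();
    std::cout << "world tick = " << world.tick << "\n";  // always 4000
    return 0;
}
```

The independent tasks get faster the more cores you throw at them; the gameplay tasks don't, because the lock forces them back into single file. That's the comment's point in code form.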
This is both good and bad: good because people get better-value options in the entry and mid-range GPU space, bad because no competition in the high end means NV can continue to charge stupid prices. 🙃
It's good if they have cards with good performance per dollar, since most people buy in this range. An RTX 4090 costs 2,000 to 2,500 euros in France, which is crazy 😂😂
The only reason I ditched Ngreedia and Shittel is that they have become soooo arrogant that they just don't care about their customers. They think they can do whatever they want and get away with it (which sadly is true, because of some idiot fanboys). Nvidia may be better than AMD in the GPU division, but until they start treating their customers right, I'll never buy from them again.
I do not agree with AMD's strategy, because we have a saying: strike while the iron is hot. AMD is timid; they must be aggressive. This is the right opportunity to strike at their competitors.
@@HenneDS03 Those aren't sales; those are cumulative results from voluntary surveys that serious gaming enthusiasts would be more inclined to participate in.
@@HenneDS03 Being a voluntary survey, it's entirely plausible that hardcore gaming enthusiasts are 9x more inclined to participate, and many of those would have a 4090. It's not hard to imagine. I had a career working with stats for over 10 years; understanding the biases built into the data-gathering methods is paramount to understanding what the data is and is not telling you. The inaccuracies of Steam data have been demonstrated many times in the past.
If AMD wants to remain in the video card business, they are going to have to either beat the 5080, or mass-produce the RX 7900 and dump it on the market at $600 or less. This, IMHO, is their only chance.
AMD not targeting the high end doesn't bother me. Gamers got what they deserved: even when AMD had better performance, gamers were like, "Well, Nvidia has the better feature set," or treated Nvidia as a status symbol, etc. Often those feature sets were things gamers didn't even use. So them pivoting away from the high end is not surprising. My guess is that they are going to focus where they can, and that is gaming handhelds and APUs, where Nvidia can't easily compete since Nvidia doesn't have an x86 license and ARM for gaming is still not well supported. AMD needs market share, and while it has most of the console market, Xbox and PS5 development is so fundamentally different from PC that AMD hardware does not get the love it needs from game developers. I think people expecting a mid-range focus have it wrong: the Steam Deck created a new market, and AMD is going to focus on that.
Nvidia has essentially abandoned the mid and lower-end market in favor of the crazy-priced expensive crap that 98% of gamers cannot afford, calling it feeding the AI market. Where are the 4050 and 4030? Where are the 5050 and 5030? Even the 3000 series had a 3050, but that was the last 50-class card. AMD is doing the right thing by going after the bulk of the market, and the Intel and Nvidia fanboys will blast them for it.
Nvidia not having strong upper-tier competition while AMD snipes the low through mid tiers may just play out favorably for AMD. They don't have to "win," just gain ground until Nvidia prices themselves out of their own markets with MSRPs and wattage. $3k 4070s that guzzle 600W every two years, anyone? What's next, turning driver updates and RMAs into opportunities to buy a subscription?
4:40 Radeon is just SMART! At most 10% of gamers buy a high-end GPU, while 20% buy a low-end GPU... but over 70% buy a mid-range GPU! So the money-making GPU will always be the mid-range GPU! I will also never buy a high-end GPU in my life. So good job, Radeon! No need for an ultra-high-end GPU!
I'll never buy a $1,000 GPU, let alone a $2,000 one. I, for one, welcome AMD's new direction of making GPUs affordable again; it will also push Nvidia toward better value. It sounds to me like AMD figures their multi-chip GPUs need more work, probably to launch with RDNA 5, and meanwhile they want to focus on market share until then. It's all good.
04:36 "No reason why they can't do both"? What about the massive R&D cost of creating a card that is NOT going to sell in a market basically owned by Nvidia, especially at a price that makes sense for AMD?
I don't mind AMD focusing on mid-to-high-range cards, since used 3080s, 6900s, 3090s, and 6950s are not selling well here in third-world countries. My friend, who has been reselling well-maintained GPUs from Korean and Chinese bitcoin farms, is surprisingly running out of 6800s faster than 3070s.
I’d really like to see AMD knock Nvidia off the “King of the Hill” mountain on high-end GPUs, but this article basically confirms that’s not happening anytime soon. Very disappointing news. Nvidia has had no real competition, so they can charge what they want, and we pay.
I really hope this means we are finally going to get some good-quality gaming cards for small form factor (SFF) cases: faster than the RX 6400, with 8GB-12GB of VRAM.
I bought my 7900 XTX a year ago because of the rumors that an 8000-series flagship wouldn't exist. So the rumors were true after all; AMD is going low-profile for the years to come.
I don't see it as flip-flopping. I see it as them recognizing the advantage of UDNA over the alternatives. Sometimes it takes time to figure those things out; you've gotta have the data.
Great, so we can have 7900 XTX speeds at a $500 midrange price, right? RIGHT? At least bring back multi-GPU support, ffs. I haven't bought an Nvidia card since the GTX 480, and I don't really want to go back.
Nvidia has too much capital to spend on their gaming sector for AMD to challenge such a giant, not to mention Nvidia's ridiculous anticompetitive practices. They have users brainwashed into thinking team Red cards are unstable, when they're actually quite stable. Team Green's anticompetitive practices simply discourage devs from doing additional optimizations for AMD architectures. If you've been asleep, this makes no sense to you.
AMD competing with the RTX ##90 makes no sense financially, because even where AMD is competitive with Nvidia at the low to mid range on all metrics (think RTX 4060 Ti vs RX 7700 XT), PC gamers and OEMs are still buying Nvidia GPUs. In most stores around me, and even online, the OEM build ratio is close to 95% Nvidia vs 5% AMD. At, say, Best Buy, I pull up pre-built gaming PCs, and of the first 100 PCs to come up, all except 2 have Nvidia GPUs, and those 2 non-Nvidia PCs have an RX 580 and an Arc 380. Meanwhile, the Nvidia builds include RTX 4070 Ti and RTX 4080 configurations alongside the massive number of RTX 4060 builds. We also know from the past behaviour of PC gamers, before ray tracing and upscalers, that the only reason a lot of PC gamers and reviewers wanted AMD to price competitively and aggressively against Nvidia at each performance tier was so that Nvidia would cut their prices, and those same PC gamers would then go out and buy Nvidia GPUs.

As for UDNA, it actually does make sense. CDNA was initially very much datacentre and AI, loosely based on Vega 20. In fact, the Radeon VII was an extremely powerful compute GPU that kept up with the RTX 3080 in compute, and in crypto mining it was only beaten by the RTX 3090 before the ETH merge. Even then, RDNA is still capable of TensorFlow and other AI workloads, just at a much cut-down scale compared to the now scaled-up CDNA. Now that we are moving to 4nm and 3nm MCM chips and AI is more mainstream, it really makes no sense to separate the two anymore. CDNA with the MI300 GPUs is fully MCM; if RDNA is going MCM, it simply makes economic sense to go down the same almost-SoC route as Ryzen, Threadripper, and Epyc. In that case there is no need for separate architectures and separate chiplet designs.

And just to clarify what may have been a misspeak: CUDA is not the GPU itself. CUDA is a compute software/API stack the GPU runs on. AMD has been developing its own OpenCL-based alternative called ROCm ever since the Vega 10 GPUs. ROCm, now on version 6.2.0, is fully compatible with Vega and RDNA 2 and 3; there was an issue with RDNA 1, but that may have been resolved by now. CUDA vs ROCm/OpenCL is also why I personally think reviewers doing productivity-workflow comparisons of AMD vs Intel vs Nvidia GPUs should not include software like Blender and Adobe Premiere that only supports CUDA, since both AMD and Intel GPUs have to do HIP-style conversions of the CUDA code into something they understand before running commands in those applications. That is a big part of why AMD and Intel GPUs tend to be slower than similar-tier Nvidia GPUs there.
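To make the HIP point above concrete: AMD's HIP API deliberately mirrors CUDA almost call-for-call, which is what makes mechanical CUDA-to-HIP translation possible in the first place. Here's a minimal sketch (a trivial vector-add kernel, error handling omitted for brevity); the CUDA equivalents are noted in the comments.

```cpp
#include <hip/hip_runtime.h>  // CUDA equivalent: <cuda_runtime.h>
#include <cstdio>

// Same __global__ kernel syntax as CUDA; this compiles unchanged under both.
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    float *ha = new float[n], *hb = new float[n], *hc = new float[n];
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    float *da, *db, *dc;
    hipMalloc((void**)&da, bytes);  // CUDA: cudaMalloc
    hipMalloc((void**)&db, bytes);
    hipMalloc((void**)&dc, bytes);
    hipMemcpy(da, ha, bytes, hipMemcpyHostToDevice);  // CUDA: cudaMemcpy
    hipMemcpy(db, hb, bytes, hipMemcpyHostToDevice);

    vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);  // same launch syntax

    hipMemcpy(hc, dc, bytes, hipMemcpyDeviceToHost);
    printf("hc[0] = %f\n", hc[0]);  // expect 3.0

    hipFree(da); hipFree(db); hipFree(dc);  // CUDA: cudaFree
    delete[] ha; delete[] hb; delete[] hc;
    return 0;
}
```

Tools like hipify automate essentially this renaming, which is the kind of conversion step the comment above is referring to.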
Focusing on two fronts is not a good idea. Look at the automotive industry: if Porsche suddenly decided to cater to both low-end economy cars and high-end luxury cars, would it be wise for them to split their manufacturing and R&D across two markets? Not really, because materials and workers would be divided between two mindsets, one catering to the luxury class and the other to the commons. Two separate divisions would be created, splitting manufacturing and R&D in two, and it would not be a 50/50 split; depending on who leads, it might be 90/10 or 70/30 either way. Those groups would stop working in cohesion and instead vie for a bigger cut, creating competition within the company: the people making the luxury GPUs would struggle against the low-end side. Instead of focusing on the competition outside the company, the struggle becomes internal. And that is worst for the customers in both camps, luxury and low-end, because if there is infighting between two ideologies vying for revenue allocation, we the customers lose out in the end.
Pretending AMD isn't second best at GPUs is cope. Them stating they can't compete with Nvidia isn't a problem. Everyone knows they have the best CPUs, but being upset that they decided not to go head-to-head with the RTX kings is almost insanity. Get off the AMD struggle bus if you want the BEST AI cards available. If all you're doing is playing games, then please go spend hundreds on a late-generation AMD GPU, and we'll see you in a few years when you're still having the same problems.
AMD is probably facing financial strain from trying so hard to catch up with NVIDIA. I think they are right to aim for the 80%; you know the 80/20 rule, right? 20% of the effort gets you 80% of the result, but the remaining 20% takes 80% of the effort. I think they just don't have more resources to waste on that remaining 20%. The change in architecture is probably within the expected lifespan of the AM5 socket, so no big deal there either; if they make a single CPU that is great for productivity and still great at gaming, then there's nothing to complain about.