It's funny how you people are comparing old, undesirable products with something newly researched and newly made that's also in high demand. You are the problem here. If you looked at anything car-related in the aftermarket, parts go for thousands of euros. A simple maintenance bill for a car is half a grand. So what are you people on about?
I think focusing on their X600, X700 and X800 lines instead of trying to fight Nvidia with the X900 sounds like a winning strat. If your budget is the infinity sign you'll usually buy Nvidia anyway, but if they can beat every single Team Green card in the $200-600 range, that would be neat.
@@detecta I fear even AMD has low-key abandoned that class of cards; it usually makes more sense to wait for the x600 or even x650 XT to drop in price.
Most basic things are like 20% more expensive than 5 years ago, even the things that weren't unobtainable for those two years. Expecting things to cost the same as they did a decade ago is delusional.
@@kaminekoch.7465 Yeah, well, salaries didn't go up at all, or only a bit, so companies can't expect people to be prepared to pay more. It's one of the reasons why AMD's sales have dropped badly.
They were testing a new cooler that can handle up to 600W. The 4090 FE cooler is already rated for up to 600W, if I remember correctly. During 40-series development Nvidia was actually testing a cooler that could handle up to 900W.
@@reahs4815 The 4080S is about $300 more expensive (where I live) for the cheapest model compared to the 7900XTX, while the 4090 is $1000 more. Imo, either go 7900XTX or jump straight to a 4090. The 4080 Super doesn't make much sense; even if you're doing productivity, you may as well push to a 4090.
I wouldn’t be surprised if AMD GPU users are less likely to report their hardware. I feel like AMD users skew more tech savvy and skeptical of tech companies using their data
I don't think FSR and the other technologies are going to get more focus than they already have. Once they killed high-end RDNA4 they put people on RDNA5; we'll be lucky if they announce it in 2025, but it's probably 2026.
Honestly if they can keep upping the rasterizing performance and catch up in RT I'd much rather see that. DLSS is very impressive but it still has ghosting and blur issues, which is a pet-peeve of mine (and a lot of other gamers). I use my 7900XTX to run stuff natively, if that means no RT I'm more than fine with it because I really do not like upscaling artifacts. I tried DLSS3 on someone else's setup and still was annoyed by it.
Those same Steam records also show a clear trend of AMD cards, including the 7000 series, increasing percentage-wise month over month, while there are many decreases in Nvidia's higher end except for the top-end cards. It shows a clear trend that people are not only buying AMD cards but in some cases also switching from Nvidia to AMD. There are many negative percentage values for Nvidia's 2000 and 3000 series, with the only clear win being the 4000 series, while AMD has increases across the board for the 5000, 6000 and 7000 series.
But Nvidia is up 56% in gaming revenue. Are you saying that's only because of the 4000 series? If that's the case, then the 5000 series will bury team red.
Idk why Vex liked it, knowing the copium it is. The earnings calls don't lie, the Steam samples are random, and cards that are over 5 years old are still shown more. It's not "changing from Nvidia", it's "The 4060 is dogshit, so I'll take what I can get.", or "I want one of the best GPUs, but I don't have over $1k."
Kinda smart ngl. Their highest-end card is like $800 less and only 21% slower at 4K, but it can't reach the 4090, like ever. So if they make all their lower-end cards just like that, same performance as Nvidia but like 20-50% cheaper? Could be better for us all.
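The value argument in that comment can be sketched as perf-per-dollar arithmetic. This is a rough illustration using the comment's own figures ($800 cheaper, 21% slower); the $1600 flagship price is an assumption for illustration, not a quoted MSRP:

```python
# Rough perf-per-dollar sketch using the comment's figures.
# The $1600 flagship price is an assumed illustrative number.

flagship_price = 1600.0            # assumed 4090-class price
amd_price = flagship_price - 800   # "$800 less" per the comment

flagship_perf = 1.00               # normalized 4K performance
amd_perf = 1.00 - 0.21             # "21% slower"

flagship_value = flagship_perf / flagship_price
amd_value = amd_perf / amd_price

print(f"Flagship: {flagship_value * 1000:.2f} perf per $1000")
print(f"AMD card: {amd_value * 1000:.2f} perf per $1000")
print(f"Ratio: {amd_value / flagship_value:.2f}x the perf per dollar")
```

With these assumed numbers the cheaper card comes out at 1.58x the performance per dollar, which is the whole pitch of the "same performance, 20-50% cheaper" strategy.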
They don't have the same margins on the low end cards, so they're going to need to do some innovation on that front first. Maybe if they can pull off an actual chiplet design that splits up the graphics die.
@@AAjax Pretty sure they're already on a design that is cheaper to manufacture than Nvidia's design, which means that if they keep the current offset to Nvidia they'll have better returns. Nvidia already had to backpedal 4070 and 4080 prices hard due to AMD. I think AMD abandoned their flagship because it would be very expensive to make but wouldn't sell well (like the 7900XTX). They'd rather take as much marketshare of the low/mid range, press their efficiency advantage and catch up in raytracing. If they succeed they can potentially match or even overshoot Nvidia's flagship in a generation or two and then slash Nvidia's prices. At which point Nvidia might choose to go all-in on AI instead, if that keeps being where the money is.
They did the same with RDNA1 (5700) and Polaris/GCN5 (480/580). It makes business sense when they can't come within shooting distance of the priciest NVIDIA chips.
@@Ranguvar13 The biggest problem with that is that even though there are a lot of tech YouTubers and you can find statistics and information everywhere, people still base their mid-range purchase on how well the enthusiast side of a company does. The number of times I've seen people discussing AMD vs Nvidia using the flagship models when they all had mid-range cards themselves... and then defending their purchase based on that discussion...
AMD stopped "fighting" Nvidia on price at the low and mid range back in the RTX 30-series era; they raised their prices to sit below Nvidia's without completely undercutting them, so they get a better margin. I wouldn't hope for them to fight on price again; they just seem focused on keeping their margins high and selling at the range they want.
They are. That's why they abandoned their flagship card. It's too expensive to make, and if it doesn't beat or at least match Nvidia's flagship, it's just not worth it for them. They tried with the 7900XTX, but it was screamingly expensive to make and didn't sell that well. Efficiency and decent performance at very good prices should be AMD's game, I agree.
That's what they want you to think. GPUs in general are horrible value today. The only true value cards are like $120 or less. You can find 1080s for $100; that's the best value out there.
@@knusperkeks2748 My 6700 XT does just fine. So you're OK with the prices today then? Sounds like you're trying to cope with the fact you're getting fleeced by billion- and trillion-dollar companies.
@@knusperkeks2748 There was a time when high end never used to cost anything close to what it does today. Hope you enjoy paying 10k in the relatively near future. Also thanks for outing yourself as a boomer.
I guess they are planning to go back to the RDNA 1 era. The issue is that AMD still struggles with ray tracing; RDNA4 is probably a test to see if their new ray tracing engine can rival Nvidia's. Plus, there's a strong bias against AMD regarding their drivers. Those biases are very, very hard to get rid of.
That's not the only thing. They don't just suck at ray tracing but at everything else excluding raster performance; that's the only thing their cards are good at. In everything else, from TAA implementation to features, even Intel is better than AMD. Not just better: Intel smokes them. Also, their cards have way lower resale value, so anybody who builds a new PC every year or so stays away from them, and there's nothing AMD can do about it. They'd actually be better off if they dropped GPU manufacturing entirely and just focused on CPUs; after all, that's where they get most of their revenue from.
Tbh, GPUs are very hard to get right compared to CPUs, which don't have as many exclusive features. In fact, AMD CPUs have AVX-512 support and Intel's don't anymore. Quick Sync is almost negligible with how good GPUs are nowadays. AMD needs a better driver team. Their publicly maintained Linux drivers always outperform Windows, and it shows AMD GPUs have the hardware performance to match Nvidia... but their software engineers are lagging behind Nvidia's.
I absolutely love my 7900XT. Overclocks like a beast and can handle cyberpunk at 3840x1620 with 60+ graphical mods and still get 90+fps average. Best purchase ever
@@arcrides6841 Cyberpunk is very well made in rasterization. If you really want to see a difference with ray tracing, you'll have to use path tracing, and no card other than the 4090 can do that at the highest settings above 1440p at any reasonable frame rate.
@@arcrides6841 You love ray tracing? Good! I don't. Love DLSS? Yeah, it's fake frames, and I'm not a fan of this 16-pin fascination. The 7900XT gives me enough real frames! No CPU overhead problem! Believe me, I still have my GTX 1060 6GB and I love it. If your GPU can run your games pretty well, you're happy at the end of the day. That's enough. Red, green or blue doesn't matter... though RGB matters.
6800XT here, my first AMD card since... the HD 5850. No wait, I had a 470 for a couple of months in the middle somewhere that I was faffing with repairs on (I used to repair GPUs for people; I have some older AMD cards now, but I got them cheap for old PCs after I got the 6800XT for my main machine), after having Nvidia cards for ages (GTX 770, 1070). When I was looking for a card it was during the late end of the covid lockdowns, so my choices were the 3000-series cards (with 4K on the horizon) or second-hand 6000-series cards. The 3000-series cards were going second-hand for the same price as new: the 3060 Ti was $1000 NZD at the time, the 3070 was 1400, the 3080 was 2100 and the 3090 was 2400, and god knows what a 3090 Ti cost; I saw one for sale for over 3k NZD. I ended up buying a 3060 Ti and was pretty unhappy with it for the price, and found ray tracing annoying (it darkened corners and made stuff I want to see invisible, made reflections I don't look at prettier, but otherwise didn't do much to make things look better to me, at a huge performance cost). A second-hand Red Dragon 6800XT came up for $850, so the only cards in my price range were the 3060 Ti and the 6800XT. After looking at the stock speeds, and working out whether I would ever really use ray tracing with a 3060 Ti, I checked benchmarks and frame rates and realized the 6800XT smashes a 3060 Ti. I then checked overclocking stuff and realized the 6800XT had massive potential to overclock like a beast, then be modded and overclock some more, and the Red Dragon had so much high-quality, fully populated VRM goodness. So I sold the 3060 Ti for the same price I got it for and bought the 6800XT. A water block for it was $100, and a second-hand D5 pump and rad went for $40, so I had a water-cooled 6800XT. Then I liquid-metaled it, then used MPT to mod the power table to let it pull more, then put an EVC2SE voltage controller on it.
Now it gets 22k in Time Spy, and I've found that if I don't give a shit about ray tracing, it floats between just above a 3090 and just above a 3090 Ti in performance depending on the game. I have no regrets getting the 6800XT. Would I get another AMD card? Yeah, yeah I would. Still wouldn't fanboy for 'em though; if Nvidia comes out with something that competes for the price, I'd go with them instead. But 8GB VRAM shitbuckets for stupid money that don't even perform that well? Yeah, no thanks. Then right after that, cards that cost even more, still have only 8GB VRAM, and BARELY perform better than the last gen? Yeah, no thanks.
A video game crash is happening because of the absolute idiocy of "game passes" and monopolization. Because of this, the quality of games is decreasing, so there's literally no need to keep making ever more powerful graphics cards when you can start focusing on efficiency, memory, power consumption, etc.
The people who bought 1200w power supplies are no longer looking silly. 600w GPU is crazy, throw in a 300w CPU and that's 900w just from 2 components lol. With overclocking you'd probably crash the PC from running out of wattage, not to mention the components would be on fire regardless of what cooling you use.
One of the best GPUs ever made for low budget I'm still trying to tell Nvidia fan boys about it, but they get BUTT HURT when I show them it mops up a 3060ti for less money lol
The last time they refused to compete in the high-end segment, it was actually a good move for them: the RX 400 series was just a great-value mid-range lineup. In the mid range, a greater amount of VRAM and higher raster performance is still a great selling point. In the high end you have more VRAM and raster performance than you need anyway, so only features like RT and upscaling become the deciding factor, and there Nvidia wins effortlessly. So I would ultimately see it as a positive change. It's better to just refuse to compete when you can't win rather than lose money on a product that cannot compete and thus will fail to bring profit.
What players want: low power consumption, native 60fps@1080p, $200. What brands and YouTubers push: high power consumption, DLSS fake fps@1440p/4K, $2000. This is why.
I have a 7900XTX for 1080p (because I could get it for really cheap). I can run everything natively without any upscaling, and even though DLSS runs circles around FSR for quality right now, nothing beats native. I hate ghosting and the blurry mess you often get with high speed movement when upscaling. Not to mention the card isn't even trying so it's actually running games at ultra settings using less power than my 1070 did (unfair comparison, I know) at medium settings.
Nvidia pushing RT is what's ridiculous. Even today, it drops frame rate by half to get only slightly better lighting than pure rasterization. I picked the 7900 GRE over the 4070 Super just for this reason. Better rasterization performance and more VRAM for less money. Screw RT.
nah, at this point it's time to let 1080p@60fps go (not for 200$ tho), I would rather have 1440p@144fps or higher any day of the week. (i am an outlier though, as higher fps is unnecessary unless you got a 360hz screen like my stupid ass but I stand by that 1440p@60-144fps is what you would want in the current day)
Built my niece a PC for Christmas, built 2 more, upgraded my desktop (about to do so again), and upgraded my gaming rig. So far: 3 Intel Arc GPUs (A380, A580, A750) and 2 AMD (RX 7600, RX 7600 XT). The XT is now in my gaming rig and does RT just fine, after the RX 6600 and then the RX 7600; the 7600 is in my desktop. So just in the last year I've grabbed 2 RX 580s, 1 RX 5500 XT, the 7600 and 7600 XT, and 3 Intel GPUs: a little over 1300 bucks total for 8 GPUs. Still less than one 4090. All my PC builds, plus my old board in a test-rig frame, run AMD CPUs.
Well, this new gen is done for if it's true that the 5080 will be "just" at the same level as the 4090 but cost $1200-1300; that's just insane. The second-best GPU of a new gen has always been much better than the previous gen's flagship. If AMD can hopefully make a mid/high-end card for $649 that matches the 7900XTX's raw performance with better ray tracing, that would be the way to go.
Prices are ridiculously high; most people won't buy graphics cards at the current prices. Which is a bit of an issue, because they increase the prices because they don't get enough sales, but the higher the prices go, the fewer people buy them. I for one only buy AMD; Nvidia in my country is about 30-40% more expensive.
Nvidia's top cards are only around 5-9 FPS faster, which doesn't justify a $2000 price point when AMD is only asking $900 for their top card, which can play pretty much all games at 4K@60fps. Nvidia uses majorly overhyped advertising because they target kids who have no experience to know any better that they're paying an extreme premium for 8fps. They also use their latest gimmick, ray tracing, like they did in the past with HairWorks. To be honest, their gimmicks work because Nvidia targets kids and uninformed people, so they get tricked into buying an overpriced GPU, and with Nvidia paying YouTubers to hype up their GPUs, we're likely to see Nvidia's stock continue to rise until people start to see through the gimmicks.
AMD can't compete and people act surprised? It took them until RDNA4 to implement the ray tracing hardware Nvidia had back on the 20 series. Then Intel came in and beat them in the low end on their first try... with much better upscaling.
@@baronvonslambert It's more about hype; stuff like "giga omega better graphics" just makes headlines and gives users FOMO. At least on the lower-end cards.
@@baronvonslambert It's a marketing trick because people are fukin' stupid. People thought the PS5 would do 8K gaming because Sony said "8K" when releasing the PS5, and 8K is on the PS5 box. That's how stupid people are.
I don't know why so many people still chase Nvidia. The GPUs are far too expensive for what they do. I currently have an RX 7700XT and my god am I happy with it. I have no problems with either the drivers or the GPU. The price of the GPU was bad before, but now it's a blast. I don't think AMD will stop; they keep getting better and have better prices.
Brand recognition, like the iPhone. AMD doesn't have that kind of brand recognition. Even when an AMD GPU is better than the Nvidia at a cheaper price, most people will buy the Nvidia because it's Nvidia. AMD has to find something to innovate, like Nvidia did with RT and DLSS, to increase their brand recognition. Even though RT is dogshit because it eats even 50% of performance, people still fall for it. People are stupid, so if AMD finds something Nvidia doesn't have, people will fall for it.
AMD isn't going anywhere. They just won't build stupidly big chips with 25 billion more transistors than anything else, like Nvidia did with the 4090. My 7900XT isn't on the Steam hardware survey; nobody I know lets Steam interrogate their system either.
I have a 6900XT, paid about 680 new with shipping. Also bought the 12-core AMD CPU (non-3D) and 64GB of RAM. I won't need another computer for 8-10 years; companies expecting people to buy new computers every year or so is dumb. Dedicated RT hardware is dumb; every gimmick Nvidia does is dumb. Software-based RT is already almost as good. Per usual, Nvidia bribed game studios to include gimmicky crap that only they support, which they came up with for the sole reason of saying "look, they don't have it." They've done this a few times in the past; the last one I can think of is the hair graphics thing. I don't think AMD is truly trying to compete at the highest end. The board partners have said their cards could handle more watts, but AMD locks them out of juicing them. Yeah, the 4090 is about 20% better, but it uses 150+ watts more to do it. That's a hell of a lot; it really speaks to how efficient AMD is. Also, AMD's focus has been on CPUs, mainly server, for a while. There's not a great reason for either company to devote their best people to GPUs when they just don't bring in the money like their other stuff. Also, keep in mind most prebuilt computers don't even offer AMD GPUs as options; only a couple have offered them the last couple of years.
@@vigilantbruiser1119 I'm sure nobody is forcing you to use a better monitor. The vast majority of people will have a better experience and most likely no issues with eyesight when using a higher resolution and/or higher refresh rate monitor, so I don't really see why we should collectively stick to 1080p@60Hz. But we're probably gonna be stuck with that for quite some time anyway, since monitors with a high resolution and high refresh rate combined are unfortunately still incredibly expensive.
If AMD wants to steal the market, they gotta start innovating and make something new that's headline and hypeworthy. Instead of following Nvidia they gotta make their own "Rtx" or "Dlss" moment
Or they could ditch the high-end segment entirely and follow the OnePlus/Chinese-phone policy: bring incredible value for money in the low-to-mid-level segments. I'm talking 4060-4070 Ti levels of performance at half, or at least two-thirds, of the price. That would also make headlines.
@@naqibfarhan4356 They don't have to be at the high end and beat Nvidia's top cards; they just gotta do something to build better hype. Basically they gotta advertise better and use something like RTX for that advertising hype, like Nvidia did.
14:43 Nvidia has traditionally not been first with new technology. It has always been a back and forth between the two. ATI created tessellation in 2001; Nvidia was behind for a good while before they came out with an even greater solution. In 2002 ATI created the first Direct3D 9 compatible GPU. In 2015 AMD gave us async compute, paving the way for DX12 and Vulkan, and that same year gave us the Fury X, the first GPU featuring HBM memory. Most recently, in 2021, AMD gave us the world's first chiplet-based GPU, the MI200. Nvidia gave us stream processors and CUDA in 2007, which were both exceptionally important developments. Most recently Nvidia is pushing ray accelerators and AI, which are both hugely successful.
None of what AMD pioneered was unachievable by Nvidia, and AMD never succeeded in replicating the success of CUDA. Which is why, even if Nvidia somehow started to lose the gaming market (the Steam hardware survey still shows Nvidia with 76% of the market, while AMD is not even at 20% but at 16%, and Intel at a bit less than 8%), its hold on the professional and now the new AI market will let it overcome AMD when needed. I don't think AMD can even compete with Nvidia in the AI market, given the cost of R&D and the massive gap in size between them. I also forgot that DLSS and RT were and still are Nvidia's domain, where AMD still struggles to replicate what Nvidia achieved first.
@@itachiaurion3198 Something to think about here is Nvidia's current success is largely attributed to the current interest in AI. the 40 series have been selling terribly compared to previous generations and that has everything to do with the fact that Nvidia cards are absurdly overpriced even compared to the 30 series which were notorious for being a terrible value. The 10 series GPUs were a serious high mark for them, they were fast and reasonably priced while offering a substantial increase in performance over the previous generation. The current generation of cards just don't make sense for gamers, I would argue their pricing is more in line with enterprise applications which is coincidentally why Nvidia's stocks have been skyrocketing. This is a short term trend though, Nvidia has the AI market cornered right now, but it's only a matter of time before AMD catches up or, most likely, purpose built cards made for AI become the industry standard. If and when that happens Nvidia will be in serious trouble, what the 40 series has done more than anything is sully their name in the eyes of the public and make them seem downright greedy.
@@Todd_Coward If AMD can't even play catch-up with the best Nvidia has to offer for gaming, how can they catch up in a more difficult market? Maybe Intel will try something for AI, but I don't see AMD finally succeeding at overcoming Nvidia. Unlike Intel, Nvidia will not wait 5 to 10 years for AMD to catch up before waking up. Their prices may be too high, but they are still in the lead technologically and don't seem likely to let AMD catch up anytime soon. Their success comes from the fact that they have been the sole leader in the 3D rendering space since at least 2015, and all of their GPUs sell like hotcakes, even in the gaming market, with the 40-series as the exception, and even that I doubt. The most recent Steam hardware survey shows that the top 15 GPU spots are all Nvidia, with the RTX 3060 in the lead and 10-, 20-, 40- and other 30-series cards in that top. Then we have AMD and Intel integrated graphics in 16th and 17th place. Heck, even the 4090 is nearly at 1% of the survey, and it's not even the best-placed 40-series card. The first named AMD GPU is the Radeon RX 580 with 0.83%; Nvidia is still at a very comfortable 76% of the market in those Steam hardware results. Even if Steam were somehow the HQ of all the Nvidia shills, AMD can't have more than 30 or 40% of the PC gaming market from those results. Even if the prices don't make sense, it seems players bite the bullet and buy the GPUs anyway. I really don't see AMD pulling a Hail Mary and finally overcoming Nvidia in the 3D market while they are still behind Intel in the CPU market and far behind in GPUs.
Lol... funny that the Intel Iris Xe integrated GPU is one of the most popular GPUs, lmao! Dunno, but it makes me feel quite proud since I own one, haha! Great video btw 👍👍!
Got a great idea! AMD should just focus on CPUs so Nvidia can concentrate on pricing GPUs however they like, because fanbois will pay $3500 for the shiny new AITX5070 with 12GB VRAM, and they will love it and rejoice that no inferior products are cluttering up their hallowed shelves. Good job Vex, you won.
Pretty soon, at Nvidia's power consumption, you'll need a dedicated circuit just for your PC; most homes only have 15 to 20 amp circuits with multiple outlets on one circuit. The average person doesn't understand wattage vs. amperage: at 120V, 120 watts = 1 amp. This is a subject you should make a video on. So a 4090 is pulling about 5 amps, a 4K monitor up to 2 amps (some less), the CPU maybe 1 amp, so on a 15 amp breaker you're already at over 50% of what it can handle before tripping. Also, sustained heat on a breaker weakens it. We're at the point where, if your microwave and PC are on the same circuit and someone uses the microwave while your PC is on, the breaker will trip.
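The breaker arithmetic in that comment can be sketched in a few lines. This is a rough illustration assuming 120 V mains (US-style residential wiring); the wattages are illustrative guesses in the spirit of the comment, not measurements:

```python
# Rough sketch of the breaker math above, assuming 120 V mains.
# All load wattages are illustrative assumptions, not measured values.

def amps(watts: float, volts: float = 120.0) -> float:
    """Current drawn by a load, from P = V * I, so I = P / V."""
    return watts / volts

# Hypothetical worst-case loads while gaming
loads_w = {
    "GPU (4090-class)": 600,   # assumed transient peak
    "CPU": 120,
    "4K monitor": 240,
}

breaker_a = 15.0
total_a = sum(amps(w) for w in loads_w.values())

for name, w in loads_w.items():
    print(f"{name}: {w} W -> {amps(w):.1f} A")
print(f"Total: {total_a:.1f} A on a {breaker_a:.0f} A breaker "
      f"({100 * total_a / breaker_a:.0f}% loaded)")
```

With these assumed peaks the total comes to 8 A, a bit over half of a 15 A breaker, which is roughly the comment's point before you add a microwave to the same circuit.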
They could certainly get ray tracing improvements to the moon in one go if they wanted to, same as Intel did. They also have the ability to kind of reverse-engineer what's already been done; they probably know to a very high degree what they need to do to get up to par.
Market share doesn't lie. Nvidia is vastly superior in every way. People pay a premium for the feature set; this is why Nvidia holds 80% market share. It's not always about how cheap a card is if the feature set and software are terrible. FSR is a joke. The reality is AMD will never catch Nvidia in software or hardware; they are so far behind it's an impossibility. Even Intel, who is new at this, has a superior upscaler to FSR. It's hilarious. AMD doesn't ever push new tech. When is the last time AMD introduced something that changed the market? Set a new standard? Never. AMD just reacts to every new tech Nvidia brings forward in the most gimped way possible. It's a joke. In b4 AMD fanboys not willing to accept reality. Cost doesn't mean anything; the market shows this.
Yeah I just took my 3070 out and replaced it with 7800 XT. While the green machine has some good features, I am putting my money on AMD. NVIDIA needs a wake up call.
I found a YouTube channel from my country that brainwashes people, saying AMD cards have problems and that if you don't want problems, buy Nvidia. And all the people trust him and go, "yeah bro, thanks for telling us and informing us." I feel so bad for those people who trust him. (Most people in my country are poor and want to buy a cheap card just to play old games like Tekken 6-7 and GTA IV-V.)
DLSS changed the game. I am playing AAA games on my 2022 midrange laptop with a 3050, on high graphics, at a constant solid 60fps, drawing only 45 watts of power. You people don't understand how game-changing Nvidia's hardware-accelerated AI super sampling is. It has basically extended the life of entry-level Nvidia GPUs.
Just goes to show how effective BS marketing is on the average person. DLSS 3.0 is a horrible technology that increases framerate while INCREASING latency; the whole point of high framerates (above something like 90fps) is to get lower latency. Meanwhile, ray tracing looks worse than rasterization in all except maybe 3 Nvidia-sponsored titles, because devs just use shitty plugins they don't actually understand, just like with TAA. But your average dude-bro with no understanding of tech sees the shiny new tech and wants it regardless. A friend of mine bought a 4070 a few weeks ago and tried RTX once, then immediately turned it off because it halved his fps for a visual downgrade over raster. He at least had the excuse of owning a Shield that he actively uses. Nvidia has perfected the Apple strategy of making things look new without any actual improvements (rounded vs. sharp edges).
Did you go to business school or something? Your titles and thumbnails are on another level, man... Every title or thumbnail you make is nearly impossible not to click for someone interested in the PC space. I am actually being serious: you are very talented at what you do, and I would like to get better at it myself.
I love my 6800XT, but the first one was a dud and had to be RMA'd. The newer one is doing great, though a little louder than it should be. This is the first all-AMD system I have built in over 13 years. Usually I build with Intel/Nvidia, and in all honesty, I will be going back to that combination on my next build in a few years.
We had rumors of the 4090 going to 600w as well. Lots of them. But that's the power limit with an OC on some expensive models like the Strix. I'm sure they'll stick to under 500w.
It feels like this happened way, way back in the Vega vs. 1000-series era, when the Vega 64 and Radeon VII were trying to close the gap to the non-Ti 1080, and gave up since they couldn't close the gap to the superior 1080 Ti. A few years later, now in 2024, they're like... it's time.
I paid $750 in December 2021 for my RTX 3060. I can't justify changing for anything else right now. I have to keep this card at least until the end of 2025. Only then will I consider AMD... If they are still in the GPU segment. On second thought, I still have haunting memories of the RX 570 I bought several years ago: onscreen stutters, continuous coil whine, intense heat and fans blowing like a Jumbo jet is about to take off. So maybe not AMD.
AMD being AMD... You can't charge 90% of what Nvidia charges when your product is clearly inferior. Nvidia has better upscaling, better ray tracing, better frame generation, and they have convinced a lot of people that that stuff matters. AMD could fight back with more VRAM and better prices, but they refuse to. Their low-end cards have the same pathetic 8GB of VRAM that Nvidia's cards have, and they cost about the same; but Nvidia has a better upscaler and better ray tracing. Does that matter on this class of card? Well, no, not really, but if the cost is the same, why wouldn't I? This has always been AMD's problem. They have never been the first to implement a new feature; they are always copying Nvidia, and the copy is never as good as Nvidia's, and they don't offer a big enough discount for customers to accept an inferior copy of whatever Nvidia's doing. AMD isn't as good as Nvidia. If they want to compete they need to: 1. Offer more VRAM than Nvidia AT EVERY PRICE TIER. 2. Stop with the fucking clamshell design and actually offer the wider memory bus that Nvidia refuses to offer, to up memory bandwidth. 3. TAKE THE FUCKING RAY TRACING SHIT OFF THE LOW-END CARDS THAT ARE NOT POWERFUL ENOUGH TO EVEN TURN IT ON. 4. Accept a smaller margin on their products so they can LOWER THE FUCKING PRICES, cuz no one is going to buy AMD for 90% of Nvidia's price; it has to be more like 75%. See, the 4060 is a clamshell 8GB VRAM card on a 128-bit bus and it costs about $300 USD. The 7600 is a clamshell 8GB VRAM card on a 128-bit bus and it costs about $270 USD; that's a 10% discount. NO ONE IS GOING TO BUY AMD FOR A 10% DISCOUNT. Now imagine the 7600 had 12GB of VRAM, NOT ON A FUCKING CLAMSHELL DESIGN, with a real 192-bit memory bus, and the price was $225. Now would you be interested? OK, so the upscaler still isn't as good, and the ray tracing still sucks, but the memory is much better and so are the memory bandwidth and the price...
"But AMD couldn't do that and make a profit on the card." YES THEY COULD, YOU GODDAMN MORONS. They wouldn't make nearly as much profit as they would like; the margin would be much smaller, but they would absolutely still make a profit, and they would take over the market segment that the majority of people buy in. Most people cannot afford a 4090; if you look at the Steam survey, the market is dominated by the more affordable cards. The card Nvidia offers in this segment is a joke, the 4060 is crap; the problem is the AMD offering, the 7600, is just as crap and not any cheaper, so why wouldn't I buy a 4060? And 90% of people do. AMD is stupid. Nvidia isn't even trying in the budget GPU segment; the 4060 is a joke. All AMD had to do was make the 7600 not dogshit... and they couldn't do it. So Nvidia wins again without even trying, cuz AMD is stupid.
AMD has always been the worst option, ever since the ATI days, and always will be. AMD has introduced features of its own at times (Tensilica DSP, TrueAudio Next), but those are always useless and nobody wants them. AMD does the same thing with their Ryzen CPUs: absolute junk, while Intel always has the far superior options.
Their problem is marketing, and the fact that they don't keep everything closed source and behave like an unethical company the way Nvidia does. Most of AMD's technologies are open source. Vulkan started from Mantle, which cost AMD millions in R&D, yet they gave it to Khronos for free, and Khronos turned it into the Vulkan we all now enjoy. Also, when you turn RT off, the value for money on AMD's side is insanely better than Nvidia's right now, which means for the same price you can buy an AMD card that, without RT, can actually run games at native resolution without having to upscale.
AMD can't catch up to Nvidia; it's like stopping a big wave with only your hands. They need to focus on the things they have and Nvidia doesn't, like Vulkan or Fluid Motion Frames. They will never catch up to Nvidia otherwise.
@@stysner4580 The unethical company lets me play with graphics far, faaaar superior to AMD's. AMD doesn't do the open-source thing because it's led by Jesus; AMD needs to, because otherwise no one would buy their cards. Their GPUs are nearly the same price as Nvidia's, and AMD is also selling them at €1000+. Btw, I get more features and improvements on the Nvidia side than on the AMD side, and that's why I pay more for Nvidia.
I'm still holding on to my RX 5700 XT and I'm in no hurry to upgrade. I left Nvidia behind 20 years ago when ATI released their Radeon X800, and I have not regretted that choice at all. I got an all-round more stable card, and I have never had a blue screen of death since, with any Radeon card or any driver. I don't use ray tracing, so that's not a problem for me. I'd rather have a slightly slower, stable card than an insanely overpriced, unstable one. Not a lot has changed in 20 years, sure, but apparently a lot of the issues I had back then are still present in the current drivers.
Then your world is stuck on AMD only. Step out of that AMD cocoon and see the reality of what a GPU really is. Then you will understand why Nvidia dominates the GPU world.
@@arenzricodexd4409 You're talking sh1t. I've owned both AMD and Nvidia GPUs. For someone who doesn't want to spend a lot of money on a midrange GPU, AMD is the best in price/performance. I owned a 2070 Super (it's in my daughter's PC), and now in my PC I have an MSI 6800 non-XT paired with an AMD 7700 non-X CPU, and they work beautifully together.
Sorry to hear AMD is slumping in sales; I'm happy with my 7800 XT. It's good that AMD is there, because it helps keep graphics card prices lower. If Nvidia were the only company, they would raise prices on us gamers. There's also Intel; who knows, maybe they'll pick up in two years or so with their new silicon foundry. I also like what that guy building his own GPU, "Fury", is doing. But the bottom line is that FPGAs have to become more affordable and powerful before we can have decent open-source GPUs that would drive prices down further for gamers. Thanks for the video on AMD, Vex.
If you want something for VR you have basically no choice. And my previous GTX 1080 made a lasting impression on me; my AMD 4890 did not, since it basically broke. That is why I bought a $1,000 4080.
The 7900 GRE is worth the slightly higher price tag over the 7800 XT, especially after overclocking its gimped memory, although you can't go wrong with either choice.
We will never get back to the days when the x60 was ~$200, the x70 $350, and the x80 $500. Prices decreased from 2007 to 2012 but then rose dramatically again around 2018. Even accounting for inflation, an x80 should be ~$800-$900.
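A rough inflation check on those tier prices can be sketched like this. The ~1.4x multiplier is an assumed cumulative US CPI factor from roughly 2012 to the mid-2020s, not an official figure; swap in real CPI data for a precise answer:

```python
# Rough inflation adjustment for the 2012-era tier prices above.
# CPI_FACTOR is an ASSUMED cumulative inflation multiplier (~2012 to
# mid-2020s), not an official statistic.
CPI_FACTOR = 1.4

tiers_2012 = {"x60": 200, "x70": 350, "x80": 500}
for tier, price in tiers_2012.items():
    print(f"{tier}: ${price} then ~= ${price * CPI_FACTOR:.0f} now")
```

With that assumed factor, the 2012-era $500 x80 lands near $700, so an $800-$900 x80 today implies prices have outrun plain CPI inflation rather than merely tracked it.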
They would lose money by developing high-end GPUs that don't sell enough! The point is, development eats money, and that money has to come back through product sales... and AMD GPUs don't sell! Nvidia's market share is 78% and AMD's is 21%, so Nvidia earns back almost 4 times as much from the same development spend!
The 3090 was a more polished RTX card than my 2080 Ti, and the 4090 is RTX perfected. The problem, however, is that horrible power connector. The 5090 will maximize FPS with ray tracing and DLSS enabled at 4K; it's gonna be a monster. I'm sure the power connector will be better this time around.
With the power estimates, I'll probably only buy from an AIB that does triple or quad 8-pin, honestly. I don't want 12VHPWR on the 5090, shit is a time bomb.
@@cloud380 I’m not thinking what you’re thinking I’m thinking. I watch people skip it constantly by dismissing its little pop up it does when the survey rolls around. Ofc I always participate in it.
AMD wants the same amount of money for around the same performance with significantly fewer features. You can see which AMD cards sell well: the 7800 XT, and that's it.
This rumor comes up every new generation. The only time it held true was when they stopped making high-end cards after the Radeon VII. I think they will keep competing with the x80 cards, which is good.
The 6900 XT was my first AMD GPU, I’d always gone team green before. I upgraded to the 7900 XTX, and had intended to upgrade to whatever the top of the line 8000 series card ended up being. I suppose now that isn’t going to happen… guess the 7900 will have to carry me through an extra generation or two 🤷🏻♂️
I rejected the current generation lol, I just built a PC with a 5800X3D and a 6650XT. Less than $1,500, plays everything @ 1440p minus RT. Good enough for me!
I needed to replace my 2080, so I compared prices and it came down to the 4080 vs the 7900 XTX. The 7900 XTX was better on paper and better price-wise, and I still wanted a 4080. But I got lucky: the 7900 XTX came in stock with a discount, so after two weeks of going back and forth I bit the bullet and got the 7900. I do NOT regret it. What this experience taught me is that Nvidia is the default video card maker not because they're THAT much better, but because they own the mental real estate, just like grannies back in the day called every console a Nintendo, even if it was a Sega, so everyone wanted a Nintendo...
I hadn't noticed that the real danger to AMD was Intel's GPUs growing quite fast in the market, more than AMD trying to join Nvidia on the high-end GPU route. Somehow it reminded me of the old days when AMD was fighting Intel for market share on the processor side. Thanks for the video, you earned a new subscriber. Have a nice day ☺
To be fair to AMD, Intel and Nvidia have engaged in extremely anti-competitive business practices in the past to push AMD out of the market; AdoredTV has some videos on that history. Nvidia, for instance, has been caught modifying drivers just to score higher in press benchmarks, and has pushed technologies like HairWorks into sponsored games that cripple performance on AMD cards with no path for AMD to optimize.
Nah, if you search even a little you'll find AMD is reportedly planning a 9900 XT(X) / 9950 XT(X) with roughly twice the power of the current top-end GPU. The thing is, they aren't ready yet, so (in my opinion) they're going to replicate what happened from Vega to the current XT generation. How? By putting Nvidia in a corner and releasing it around six months after the 8000 series, during spring 2025 (maybe summer). I'll call it again: they're going to sacrifice the next generation, which will be really short-lived, or an intermediate one, and then release something that gives Nvidia a hell of a day.
GPUs like the 4060 Ti or RX 7700 XT will be bottlenecked by your CPU, so you'd have to upgrade to at least an i7-13700K. With your 12500F, the bottleneck with a 4060 Ti is 25%, and with a 7700 XT it's 23%; at that level you should upgrade the CPU. With an i7-13700K, neither the 4060 Ti nor the 7700 XT is bottlenecked, but the 16GB 4060 Ti is overpriced for the performance, so the RX 7700 XT is the better choice because it's cheaper. But if you want the 4060 Ti because you like Nvidia, get that. If you want an RTX 4070, you'd have to get an i9-14900KS, and even with that CPU there's a 6.5% bottleneck, but that's acceptable.
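The "bottleneck %" figures quoted above come from online calculators that use opaque heuristics; the following is only an illustrative toy model, under the assumption that the percentage means GPU headroom left unused when the CPU caps the frame rate lower than the GPU could deliver:

```python
# Toy model of a "bottleneck percentage"; NOT how any particular
# calculator actually computes it, just one plausible interpretation.
def bottleneck_pct(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """Percent of the GPU's potential frame rate left unused."""
    if cpu_fps_cap >= gpu_fps_cap:
        return 0.0  # GPU-limited: no CPU bottleneck
    return (1 - cpu_fps_cap / gpu_fps_cap) * 100

# Hypothetical numbers: a CPU capping at 120 fps with a GPU able to push 160
print(bottleneck_pct(120, 160))  # 25.0
print(bottleneck_pct(200, 160))  # 0.0
```

Even on this simple model, the percentage depends entirely on the game and settings chosen, which is why such calculator figures should be taken with a grain of salt.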
I'd say this generation of GPUs is largely a skip. I'd even hazard a guess that the only reason any sales are happening at current price tags is that people can no longer hold out with the GTX 10 series card that carried them through The Great Crypto GPU Famine(TM).

If I were recommending GPUs for someone building now, depending on budget it would probably go like this (based on prices here in the UK, my own experience — I own the first two and the two neighboring the 4080 Super — and research): used 3060 12GB (price is silly now) -> used 6700 XT -> used 6800 -> 7900 GRE -> 4070 Super -> 7900 XTX -> 4080 Super -> 4090. I'd say anything else in between is a waste of money, with the sole exception of the 4060, because it's the best price/performance low-profile GPU on the market. This obviously changes as price tags move around.

Below the 4070 Super, the Nvidia features might as well not exist because the performance just isn't there, so raw raster performance matters more. Arguably, if you don't care about RT, the 7900 XTX is the best bang for buck out there right now, along with the 7900 GRE on a more stringent budget. DLSS is visibly better than FSR, but mostly at 1440p and under, and that probably won't matter much longer with XeSS stepping up and being broadly compatible. Frame generation is a joke for the most part: you need 60fps to begin with for a decent experience, so it isn't a feature for the peasants, contrary to its misleading marketing; it's great for hitting that extra 40fps to match your monitor's refresh rate, not for making obsolete e-waste last a little longer. The 4090 is the obvious pick when money is no object and you want all the bells and whistles, though it already struggles at 4K with maxed RT, so gaming-wise it doesn't seem future-proof in any capacity; there's very little headroom left. Lastly, if you look globally, GPU prices vary WILDLY by region.
In some regions AMD is much more expensive or simply unavailable. AMD produces fewer GPUs, has worse distribution outside the USA and a few European countries, and is totally absent from the mobile market as far as dGPUs go. All in all, just about every GPU at every tier this generation is, quite frankly, grossly overpriced. Realistically, even if AMD released an 8000 series with redesigned RT hardware on par with (or closer to) Nvidia, with 7900 XTX raw performance, at $600, they still wouldn't gain any considerable market share... but it would be a damn nice card to purchase.
A friend of mine got a 7900 XTX only to find it was slower than his old RTX 3070. He used DDU in safe mode to remove the drivers and everything; the 7900 XTX was still slower. Maybe it was a defective unit, but after all the troubleshooting he just returned it and got a 4090 instead.
I reluctantly upgraded from a 6900XT to the 7900XTX recently and from the 5900X to the 7800X3D. I was not expecting the lift in performance that I got. The Alienware 34" QDOLED helped. Waiting for RDNA5 makes more sense but health issues influenced my rash decision. HDR in The Division 2 is more important to me over Ray Tracing. I also picked up a 4TB Gen 5 M.2 on a really good sale. Very happy I made the jump.
Bro, they'll probably cook nshitia the same way they cooked Intel's CPUs in gaming :D Anyway, I have an RX 6800 and I'm not looking to upgrade, maybe in 5 years or more :D I've had a 3080 Ti and a 4070, but idk, I never used RT much, and AMD cards cost way less for the same or better performance, so it was a no-brainer to go AMD.
A lot of us may still be rocking older AMD cards. I have an RX 590 that plays Starfield and DD2. While I don't really need a new card, I've been considering either a 7000 series card or an 8000 when they come out.
This makes sense tbh. I feel like if you're gonna buy a top end card, you're gonna want all those flagship features unless the price difference is huge, or you're running Linux where AMD is just better. DLSS in particular is essential atm for that top ray tracing performance. But at the lower end? You're not gonna be running ray tracing anyway, so DLSS isn't as necessary, and FSR is decent when it's an option. Plus, you can build up some brand loyalty with the entry level market who may then go on to buy another more expensive AMD GPU later on when they upgrade.
It makes sense. I don't personally care about ray tracing, but I'm probably a minority. If someone has $700 to spend, or they have $1,000 to spend, then they probably have the money to spend a bit more. So once you're in that price bracket then why not spend that bit more to get RT and such? If AMD wants to compete at the high end then, well, they have to compete at the high end. And they simply can't currently do that. They don't have the RT features to do it. If someone's going to buy high end then they're going to have the money to buy the most features, obviously. They're not looking to save a bit, they're not willing to sacrifice a few features to get good cost-to-performance. They're by definition looking to spend a lot.