My favorite video we've made on this topic is with Gordon from PC World. Check it out here - super entertaining and insightful! ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE--wGd6Dsm_lo.html Updated the title: It was "NVIDIA Has Flooded the Market" (meaning to overrun or overwhelm with options, as they've done). Some pointed out that this has a specific economic meaning with a specific definition. Updated for that reason. Thanks! Support our work! Our Limited Edition CyberSkeleton V2 foil T-shirt will soon no longer be available! It directly helps fund our research: store.gamersnexus.net/products/limited-edition-foil-cyberskeleton2-cotton-tshirt Watch our video about NVIDIA's AI and how it's on another planet: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-0_zScV_cVug.html
Your Walmart comparison is way WORSE than you state. Both AMD cards are sponsored advertisements and not native selections. Without the sellers paying, those two would not even be there.
This phenomenon is referred to as "shelf terrorism" in certain circles. It's very common with candy brands in the United States. For example, Reese's makes about a dozen variants of essentially the same candy in order to expand their presence on store shelves. If a store designates five shelves for candy, Reese's products can easily take up an entire shelf. But all of that candy is virtually the same thing. It's the laziest illusion of choice you'll ever see. And stores can't refuse any Reese's variant products without being cut off from the primary product, which simply isn't an option.
I noticed this happening in the big supermarkets with hot sauces. They all ended up with the three big brands, Tabasco, Nando's, and another one I can't remember; the superior local brands were forced off the shelf.
Kinda is, but also because Nvidia produces around 3-4 times more GPUs than AMD. With consoles already in the equation, AMD doesn't have the allocation to keep up with the gigantic amount of stock that Nvidia has, so AIBs have like 14 models of the same GPU on the Nvidia side, while AMD has like 14 models for a given GPU across all their AIBs combined. Add that in NA GPUs do sell at MSRP, and most Nvidia and AMD models have very similar pricing, so people will orbit toward the Nvidia SKU when it's just barely more expensive.
Even in the comment sections, AMD has less than 75%. But the thing is that most people don't give a damn about what commenters have to say online. They go to the store and buy something, which leans Nvidia. The most they'll check online is a review from a big outlet. I only bother with other commenters because sometimes they're so blatantly wrong and/or I have the time to waste.
That's because board partners and retailers each only get single-digit gross profit margins. There's no room for retailers or board partners to compete on price while still making enough gross profit to cover overhead costs.
There is no such thing as 'good value' when a monopoly/oligopoly runs the market; it's guaranteed that you are being exploited and extorted @@ask_carbon
Google pushes Reddit on their search pages thanks to the AI deal with Reddit. Nvidia pays to put their products first at Amazon, Best Buy, etc. Nvidia can also buy more shelf space at retailers.
Agreed; my current GPU is an EVGA RTX 3080 and I don't really plan on upgrading any time soon, and there isn't anything (that I see currently) that's worth upgrading to anyway. AMD is still lagging behind and Nvidia can go to hell.
@@TheHunt890 I mean in overall software/updates/etc; if I was to upgrade (or if my current broke and I had no choice) the 7900 XTX would be my first choice but ultimately I’ll wait to see what the next generation brings.
Recommended or featured are sometimes (as we understand it) paid spots as well, or preferred spots on Amazon where money may change hands. Sorting by price seems like it should be fairly objective as long as it's just a "dumb" filter that literally only sorts by price.
@@GamersNexus The trouble with sorting by price is that there's usually a couple dozen pages of $1.00 crap. Even searching with the GPU category filter, you'll usually end up with multiple pages of cheap accessories... then a bunch of used/refurbished products that are suspicious, *and then* at some point you'll start seeing cheap GPUs (that are multiple generations out of relevance). Search & sort allows customers to save money, and there's no profit in allowing customers to find what they want at a price they can afford! (God I feel dumb for writing that, but you KNOW it's been said by c-suite staff)
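For what it's worth, the kind of "dumb" sort being described here is trivial to implement, which is part of why its absence feels deliberate. A minimal Python sketch, using made-up listing data rather than any retailer's real catalog or ranking code:

```python
# A "dumb" price sort with a basic relevance filter: no paid placement,
# no "featured" weighting, just the category label and the price tag.
# The listings below are invented examples for illustration only.
listings = [
    {"title": "GPU support bracket", "price": 9.99,   "category": "accessory"},
    {"title": "HDMI 2.1 cable, 2 m", "price": 12.49,  "category": "accessory"},
    {"title": "Used GT 710 2GB",     "price": 24.99,  "category": "gpu"},
    {"title": "RX 7800 XT 16GB",     "price": 499.99, "category": "gpu"},
    {"title": "RTX 4070 12GB",       "price": 549.99, "category": "gpu"},
]

# Keep only actual GPUs, then sort purely by ascending price.
gpus = [item for item in listings if item["category"] == "gpu"]
for item in sorted(gpus, key=lambda x: x["price"]):
    print(f"{item['price']:>8.2f}  {item['title']}")
```

The hard part in practice isn't the sort; it's the honest category labeling, which retailers have little incentive to maintain.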
Sorting by popularity amplifies that popularity. Things that are popular stay popular because they're popular (if that makes sense). However, that popularity doesn't necessarily equal quality or bang for buck.
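That feedback loop is easy to demonstrate. Here's a toy Pólya-urn-style simulation in Python, with entirely made-up names and numbers, where each buyer picks a card with probability proportional to its current sales count:

```python
import random

# Toy "popular stays popular" loop: every buyer follows the popularity
# ranking, choosing a card in proportion to how many it has already sold.
# All names and numbers are invented; this only illustrates the dynamic.
random.seed(1)
sales = {"Card A": 1, "Card B": 1, "Card C": 1}

for _ in range(10_000):
    names = list(sales)
    pick = random.choices(names, weights=[sales[n] for n in names])[0]
    sales[pick] += 1

# The cards are identical in "quality", yet early random wins snowball
# into large, persistent share differences.
print(sales)
```

Run it with different seeds and the winner changes, but there usually is one, which is the point: a popularity sort can manufacture its own winners.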
@@PincKen Is it really that amazing when they have generational upgrades every 8 months? Nvidia should focus more on card longevity/replaceable parts/right to repair 'cause the way it's going right now this industry isn't worth the eWaste. Y'all are having a party with the digital creation and Bitcoin money laundering bullshit but at the end of the day it's gonna be you who has to clean up this mess. I bought a 1080 Ti in 2017 and idc how derelict it gets from lack of support, I won't be supporting this black hole industry again for another 15 years. Say what you will on the futility of an individual vs. hordes of hype-infused consumers, but y'all have been warned.
I’ll stop calling GN a RU-vid-Channel or media outlet. You’ve become journalists. The attention to detail and the standards you and your team adhere to are outstanding.
Don't insult GN by comparing them to the mudslingers & corporate mouthpieces that call themselves journalists! Gamers Nexus is competent & has actual integrity! 😂❤
And software. Almost everything mandates CUDA, and there's no good, reliable way to run CUDA on AMD, Intel, or others (ZLUDA is on the way, but isn't reliable yet, and reportedly abandoned, though I doubt forks will stay abandoned for long)
I went with 7900 XTXs when the AI hype started; I've fought through every workaround and model conversion. I should have just bought the 4090s: my 7900s were half the price but, years later, I've spent that much (in time), and then some. Having the 7900s around for gaming boxes kind of makes up for the loss, but it still sucks that AMD cannot catch up, as anyone who's seen geohot struggle can confirm: they have the hardware, they just (as usual with AMD) don't know how to write drivers for it.
The same thing goes for productivity and especially rendering. Vendor lock-in for NVIDIA GPUs was a guarantee once CUDA became a thing. NVIDIA was chopping away at Radeon GPU sales well before machine learning took off, due to the applications and features their cards had outside of games.
Sales background here: the mere-exposure effect is applicable to almost anything! It's why persistent ads are so effective. You may not want that thing at the moment of seeing the ad. You may even actively dislike it or curse it. But when consumers come to the moment where they want a product in that category, they'll choose the one they've been overexposed to WAY more often. It's why targeted ads are so effective and sought after! Edit: I have now adjusted some abbreviation, spelling, and grammar errors to better meet the RU-vid comment section perfection standard.
@@The-Cat No, that works on everyone that's not a cheap bastard The only way to avoid it is to actually test all your options to find your preferred "sku" if you are deciding on a consumable (like food) or to do price/performance analysis before every buy for persistent items (remember to ignore userbenchmark's descriptions and only use the numbers if you use that website, either they get paid or they really like the taste of green dick 😅 ) Most people don't do any of that, so most people will just pick the easiest thing Shouldn't unfamiliar things draw the eye more? Yes, but people are also lazy (blame evolution) and you know nothing about a new thing, but that card you heard about is 69% faster 'cause triple RGB! Faster than what with triple what? Who cares, stop asking complicated questions Do you really wanna waste all that time READING D: about this new thing? Of course not, just pick the 69% faster one and let's get out of here
In the early 2000s, a private computer shop owner told me that he had to sell a certain percentage of Intel CPUs or else he would not be able to order as many as he needed/wanted. The giant companies are all bullies, and their success and huge market shares are not due only to having the best or most wanted product. I remember the Radeon 9700 Pro after ATI bought ArtX. That card was far, VERY FAR, ahead of anything nVidia had, and their FX cards were pathetic in comparison. The 9700 Pro had serious second-gen shader (DX9) capabilities, while the FX series was OK for the demos on their website and that's about it. The low end, the FX 5200, was a pure joke. Guess what? Most people didn't want the Radeon; they wanted the shitty card with the green logo. There were none in stores, so I had to buy mine online from NCIX. I had a 9800 and that thing lasted me 4 years before I started to feel the need for another one. I was running Far Cry maxed out at 1024x768 while others were only dreaming about it.
I remember that too. Nvidia really screwed up with that FX line. I switched from an older Nvidia card to the 9800 Pro myself... It was far superior to the Nvidia cards. There have been a few times where AMD has managed to score a coup with a superior card or an awesome exclusive feature like Eyefinity, but lately, unfortunately, Nvidia has been dominating the market.
The 5800 Ultra, aka the dustbuster, aka the card Nvidia literally pretends never existed. The 9700 Pro caught 'em with their pants down. An interesting time for sure. I remember them using driver-level cheats in benchmarks and STILL being slower. ATI (now AMD) was dominant for a while there. Everyone bought ATI cards for a few generations. It's what AMD NEEDS to do again. They need to be going after the 5090, not the 5080. Take the top spot. It trickles to the bottom. During that same 9700 Pro era people were also buying 9500 Pros. There was also a mod you could do to potentially unlock extra VRAM bandwidth on the 9500 Pro to turn it into a 9600 Pro or something. I forget.
To be fair, most people don't need a card more performant than those $800 cards, so they expand the market at the top instead of the bottom, since AMD can't compete at the top.
Honestly, you don't need to spend that much to play video games. Unless you make actual money from your GPU, spending this much is ludicrous in my honest opinion.
Really can't believe it? Titan cards were always pricey, and the 4090 IS a Titan card, only named differently. It's not for your Fortnite or all that other shit you play. People need a good card for work, and Nvidia provides it, not the other company. It's people's choice to buy it, so don't go blaming the high price of a productivity card just because your ass is poor.
In New Zealand, we have only ONE major PC parts retailer that sells any real selection of AMD Radeon GPUs, with 23 Radeon and 109 NVIDIA SKUs. One other PC retailer has 6 Radeon and over 100 NVIDIA. Two of the more "premium" PC parts retailers don't even sell consumer AMD Radeon cards (they used to). What I find funny, though, is that despite AMD's massive uphill battle to have the Radeon GPUs seen, buried under pages and pages of NVIDIA products, they still manage to make up around half of the top 10 sellers at the largest retailer... Makes me wonder why the smaller retailers don't sell them...
@@r0N1n_SD That does make sense. But to not even stock a single AMD card, even one of the more popular SKUs that regularly trade blows with NVIDIA's best sellers at our largest retailer, seems sus
Well, if you have 200 sales split between 23 cards vs 1000 sales split between 109 cards, it's not surprising that some of the AMD cards would show up as top sellers: the averages are nearly identical (about 8.7 vs 9.2 sales per SKU), and since demand concentrates in a handful of popular models, AMD's few hot SKUs can crack a top 10 even though NVIDIA sells 5x as many cards overall. The overall sales volume probably just isn't enough for the smaller shops to justify carrying the less popular cards.
Remember that AMD was 100% convinced that the 7900 XT would be a good seller at $900. Let's not keep pondering whether they are strategic geniuses when it comes to product segmentation.
With the position they're in, they should be undercutting all their prices by a flat $100, period, if they don't wanna get horsewhipped out of the GPU space altogether
@@thrace_bot1012 They should've pulled out when they released the RX 480. Historically they've been weak sellers, even with superior performance numbers. There's a video by AdoredTV that goes over their history. Keep in mind this was before the 6000 series ever came out.
@@ArtisChronicles Yeah, and there is a sickness called Stockholm syndrome... Thinking these prices are normal is absurd; just look at their profit. They milk everybody as much as they want and people clap for them.
UK-based mainstream retailer Argos has 41 GPUs listed on its app / site and only 5 are AMD… They offer one RX 6600, one 7800 XT, one 7900 XT, and two 7900 XTX… But they have 15 variants of the 4060 😳
Back in the 90s, advertising focused on subliminal methods was illegal here in Poland. But over time it became more and more prevalent and harder to fight in court (mostly due to corruption in our court system), and today it's just the norm.
A factor that is often overlooked is how the third world fits into this. Now I can't speak for every country, but at least for mine, Nvidia is easily the most available brand for GPUs and that is because finding AMD is very hard to do and usually way overpriced as there aren't any local official retailers, just resellers who import AMD products, contrary to Nvidia who has reached out to ensure having their products available in local stores. So everyone I know here, me included, has only ever used Nvidia even if there was a desire to try something else out, entirely due to availability alone. And I don't know if I can claim that that's on Nvidia flooding our local market as much as it is on AMD just failing to even attempt to expand.
Even in Canada AMD has made very little effort to price their GPUs competitively. In most price segments Nvidia is often a bit cheaper or the same price with more features. We're like an 8 hour flight from AMD headquarters and they can't even get Radeon GPUs prices right here. They're screwed internationally 🤦♀️
@@MaxwellTornado No, the implication is that Jensen will unleash the basilisk in Nvidias basement and bring about the end of humanity if AMD's market share ever surpasses them. Why else would they be going so deep into AI? They must feed the beast.
I just want to say I rarely ever comment on videos. I want to thank the GN team for the work on this video. I recently graduated with my bachelor's degree in marketing. I have done research for classes, and my undergrad research covered this situation. It is refreshing to see a channel being analytical in not only performance but also market presence. It is becoming increasingly difficult for competitors to find space in a market that is saturated with NVIDIA. NVIDIA has been able to diversify their portfolio with their many SKUs at various price points, for good or bad. This is a perpetual motion machine: when you think GPU, most people think NVIDIA. NVIDIA has mastered leveraging their brand and their partnerships. Regardless of how people feel about NVIDIA, they are everywhere and people will continue buying them. NVIDIA is no longer a graphics card or graphics development company; they have become focused on AI and deep learning. How this affects consumers is that they are being upcharged for features they don't need, like the RTX AI chatbot or NVIDIA Broadcast. It'll be interesting to see how the market continues. Thanks again, GN crew, loved the video.
My takeaway from this is that Nvidia is simply more professional and consistent in its marketing: it keeps the partners that matter (i.e. Dell rather than EVGA), it doesn't confuse the market with random naming changes, it always has something relevant for a wide range of price points so that it's rarely a no-brainer to wait for the next generation rather than buying something today, and they and their partners make sure they are competitive in every market. And that's before we consider technical USPs such as frame generation, ray tracing, video encoding, CUDA, and so on. Nvidia knows the prices people will pay for its products, and makes sure it has a suitable product for them, which keeps the money rolling in and keeps the machine running and growing. I do wonder if Nvidia will one day spin off its gaming division, as it doesn't really fit with the professional divisions that now make up the majority of its revenue. Of course, I'd expect that division to carry on licensing Nvidia technology, but it would have to stand on its own two feet to secure manufacturing capacity for its products. On one hand, I think that would make life easier for Intel's and AMD's gaming GPU divisions, but on the other hand, I think it would play out as a net disappointment for gamers, in the shape of slower progress and even higher prices.
It was so much simpler when you had just a few 'gaming' and 'professional' graphics cards: low, mid, and enthusiast class, and then you had the "I need to show a picture on a monitor" entry-level GPUs. Now each level has 4-5 GPUs...
It's the same in smaller stores in western Europe. It's basic economics, why buy something in bulk, risking a lot of capital, to put something on your shelves that almost no one will buy?
Still love that idea! He spoke very highly of the latency reduction being perceivable and I want to see it! We're in regular contact and will likely do that next time he visits or I visit him!
**Nvidia naming their cards:** "Ok so we're going to increase the first two numbers by 10 each generation. So the 1080 becomes the 2080." "And then the 3080 after?" "Correct." **AMD naming their cards:** "So our current card is the RX 580 and the one before that was the RX 480. Is our next flagship going to be the RX 680?" "No. Vega 64." "Oh... Is our next one going to be Vega 65? Vega 128?" "No. Radeon VII." "Oh... Is our next card going to be Radeon VIII? Radeon IX?" "No. RX 5700 XT." *Flips table and leaves in frustration.* Edit (Part 2): "Hey, so... I know I killed a few people here by crushing them with an office table, but it was a tough time and I got better. But what you guys did with the 6000 series launch was excellent. The 6900 XT was a great enthusiast card. So... You guys are releasing the follow-up soon? The 7900 XT?" "Oh yeah we are, but it's actually called the 7900 XTX." "... What?" "Yeah we added another X to the name." "I thought you guys were releasing the 7900 XT?" "Oh we are, but it's actually a 6800 XT follow-up." "I thought that's what the 7800 XT was for." "Well we have a 7800 XT, but it's not all that much more powerful than the 6800 XT, so we're actually making it a follow-up to the 6800." **In his despair, the man realizes that all of his years of therapy have been wasted.**
I don't get why people care so much about naming schemes. Literally just look up any of the thousands of videos on the GPU; not being computer literate isn't an excuse to not research anything at all.
Consistent naming schemes are one of the reasons people buy Nvidia: you roughly know what you can expect performance-wise. With AMD? You have to look up reviews first. Most consumers simply are not well informed.
In Europe, in Poland, Nvidia and Intel provide cards for pre-launch testing. However, local AMD branches usually do not have any cards to send for testing. Therefore, testers and reviewers have to order/buy Radeons from online stores only after launch, but by then the real availability of Radeons in stores is practically negligible anyway.
Wow, the psychology is powerful. I haven't bought an Nvidia GPU in 12 years, but all of a sudden I feel like I'm missing out on something. However, the marketing psych has its hard limits. I'd rather miss out on the Nvidia cards than miss out on the $2000.00 in my bank account.
I don't want value. I want really high performance. I'm not excited about the high prices, but I am excited that Nvidia has decided that super high performance is something they want to offer.
I mean, it has happened that the most expensive GPU ironically has the best value. It's very rare, but it has happened. Current prices are still garbage tho.
Yeah, no one who drops 2k on a card is crying about value. That's more for people on a budget. I didn't get my 4090 because I was looking for value, I was looking for max performance. I've been a value GPU buyer many times.
@@sebastianandres6802 This is factually incorrect. The 4090 is not 5.3 times better than the 4060, but it sure is 5.3x the price. The best value cards are typically in the $200-400 range, and most people don't need to buy anything higher than that unless they really want high fps at 4K.
I prefer a less power-hungry card. I like the 750 Ti, 1050 Ti, 1650, and now the 3050: all within 75W with all the features of their bigger brothers. I wish AMD would do the same, as their only current sub-75W card, the 6400, lacks any encoding support. The last AMD card under 75W was the 560, about on par with the 1050 Ti.
NVIDIA’s name actually comes from the Roman Goddess Invidia (goddess of envy) with her Greek counterpart(s) being Nemesis/Phtonus (vengeance, envy, jealousy, etc.)
It also works in Spanish, since envidia means envy, and they made it easy for Spanish speakers like myself to say it. Ah eme de (AMD) is a bit harder to say, and that sometimes influences purchasing decisions.
You know what... you make an amazing point that I've never put into words. GN doesn't make sensationalized headlines or clickbait. If their title is sensational... it's because the story is sensational.
I used a GT 1030 for 5 years because that was all I could afford. I upgraded in August 2022 to an RTX 3050 because it was 230 dollars during mining and the RX 6000 series cards were super expensive. I'm a casual gamer and mostly play single-player games, but I do more programming and use some software suites for my hobbies, studying, and job. I need CUDA, so I'm happy with my card.
The 1080 Ti remains a very competitive card. There will never be another release like it from NVIDIA, because, unfortunately, they learned the wrong lesson from it - that if you release something too good, people hold onto it for a decade instead of buying your crappy new stuff. I've been buying up secondhand 1080 Tis, so that if one of mine does actually fail, I've got spares.
Same here. But I plan to upgrade my monitor soon, so I am looking at GPU prices. There is nothing that screams value, unlike in CPU sphere where the AMD 3D parts are crazy good. Made the 7800X3D an easy buy.
@@Steror I am running 5120x1440 on an OC'ed 1080 Ti. I don't play any of the crappy new AAA games; but, even the demanding stuff I play has been no issue on the 1080 Ti.
3dfx ran fine; their problem was they brought manufacturing in-house instead of remaining fabless, and they ran out of money. The whole Nvidia argument at the time didn't hold water; the entire GeForce 1-2 generation was completely unnecessary and you could use a TNT2 or Voodoo 3 for ALL GAMES until DX8. Nvidia is still using the tech demo model to this day to sell hardware, currently with some tiny number of RTX games like Cyberpunk. That's their marketing strategy. NV = envy. Green. So don't wear a green hat, and yes, that's also a reference to something relevant.
Ah yes, 3dfx: proprietary API, 16-bit graphics, a horrible product lineup, and unwillingness to change. If THEY ran the market today we would still be on DX9.
@@JohnDoe-ip3oq"GeForce 1-2 completely unnecessary" GeForce 1 I could agree with but the GeForce 2 was a powerhouse that ran laps around the TNT2. It was a very impactful card.
@@sulphurous2656 At least ATI had a handle on production costs, and AMD was able to bolster ATI's weaknesses. It took time, but AMD had the management ability to bring ATI into the fold and make them into something more than ATI themselves could afford to be. When 3dfx bought STB, STB's costs were out of control, and 3dfx had no ability to rein that in, nor to improve STB's channel presence for sales. Not to mention having screwed over the many board partners that had been burying STB in the marketplace for years, who now had reasons to adopt NVidia and continue to use their STB-burying practices against 3dfx's card sales.
Another fun fact is to take a trip into the world of Linux. Nvidia is a market minority when it comes to Linux gamers, and even looking back a couple of years, you will see that Nvidia holds a minor position. This is due to driver issues and is one of the reasons Valve went to AMD for their Steam Deck. While it is a small market, it is interesting to see that such a dynamic does exist.
The only downside of AMD vs Nvidia on Linux is you're limited to 4K 30 Hz over HDMI. It doesn't matter for anyone who uses DisplayPort, but it was the first thing I noticed.
@@tyler6602 There are a few others, Gamescope requires special handling if you use that, hardware video acceleration doesn't work for any browsers outside of Firefox if you don't tweak. And of course the more recent Wayland debacle and mess with syncing and flashing game renders. This is my Nvidia list so far.
What's funny is AMD also used to have terrible linux drivers, there was an era where the roles were reversed even (though nvidia was never great, amd was just worse back in the late 00s)
@@DeinonychusCowboy AMD didn't support Linux for a long time. They did briefly, and then they dropped out when they scored a big contract with Microsoft. They licked the boot. Then when that was over, they came back. Say what you will about Nvidia, but their support has never varied. Once they got on board they stayed on board. Nvidia was the first PC hardware manufacturer to ever support Linux, too. DEC was the first, but DEC didn't make PCs. And yes, hardware acceleration back in the day with Nvidia was great. I was rocking Quake with my MX 200. That was in the late 90s. I started running Linux in the mid 90s. They were heady times. World Domination!
This goes all the way back to when AMD bought ATI. AMD didn't buy ATI for their discrete GPUs (contrary to popular belief). They bought ATI for their chipset technology. AMD's entire focus was on competing with Intel with their APUs. Discrete GPUs were an afterthought for AMD. And here we are today. No marketing, no competition through innovation, and no more CrossFire. I blame all of this mess on ATI for selling. The Canadian government should have blocked the sale.
I think you overlooked one important segment of customers. The video mostly looks at DIY customers shopping for a GPU, which most probably isn't the majority of the market; the biggest part of the market would be prebuilts, and I don't believe I've ever found a Radeon prebuilt. For cheaper options it's the 1650, for better the 3050, and at the high end the 4070 and up. Even if you ask someone who builds PCs professionally, they will just go with "an RTX GPU has Reflex that makes your aiming in games better"; even ShadowPlay for recording your gameplay, or the Nvidia overlay with filters, is a selling point. Also, my console friend liked the hands-off approach of automated settings adjustment. So even though Radeon has a way better frame limiter in Radeon Chill (great for screen tearing on lower-end displays) and Eyefinity that works... even in Steve's recent video (the HU one) he had a problem with the RTX 40 series not playing nice with 3 displays, so he went back to the 6950 XT as a fix... it's all for the tinkering type of people who like their custom resolutions and all that. Radeon lacks a lot of very casual-oriented quality-of-life features that give a hands-off experience.
@@mightylink65 They are kind of similar, but when I'm looking at prebuilts they seem to even exclude Ryzen. I checked the whole "Shroud-related prebuilts" lineup a few days ago, since I was interested in whether those are expensive rebrands of existing products, and all Maingear PCs are Intel + Nvidia; I think there's a single PC at Starforge Systems that had a Ryzen CPU. So if Ryzen is hard to come by, then Radeon won't be there unless they crush Nvidia for 2-3 generations in performance... and we know how that goes.
Because they don't want to sound alarm bells to investors. A game of Vulcan chess is being played. Anyway, prices will eventually drop slowly. The 4060 12-gig will be sub-$400 in three months. They're gonna call it a "hot summer sale" or some such. Or they'll provide a code for Helldivers 2 with the purchase of a card.
@@tonycrabtree3416 What? NVIDIA doesn’t control prices on used cards. They can’t just decide used prices stay high in order to not “raise alarm bells to investors”
Well, I put my money where my mouth is and bought a 7900 XTX last year. Yes, I wish it had better ray tracing, but other than that it just flat out destroys every single game for quite a bit less than the 4080. By retail price it is only a little cheaper, but in reality it was quite a bit cheaper (about $200, I think). Statistically, a lot of the people criticizing Nvidia in the comments still have an Nvidia card. It's like people are afraid that if they buy an AMD card it will be inferior. I can assure you they perform just like the reviews say they do, which is much better than Nvidia frames per dollar in anything except ray tracing.
I did the same exact thing except I don’t care about Raytracing in addition to the fact that AMD has superior software support on Linux compared to Nvidia. (I daily drive Arch Linux and Debian.)
This is why I went from my NVIDIA 20 series GPU to Radeon 7000, and from an Intel CPU to Ryzen. There was more value per dollar when it came to just playing the game. Stuff like raytracing and DLSS offered no additional value except compromises just to get prettier lighting and reflections that I wouldn’t even notice after five minutes into a gaming session.
I put my money where my mouth is as well. Went from a GTX 1070 (excellent GPU) to a 6700 XT after I began using Linux. I will be using AMD from here on out, as the driver situation on Linux is fantastic. I'm also willing to give Intel a try once their GPU department matures
Nvidia's valuation is at an insane high. AMD is the perpetual underdog; green team can't stop winning at the moment (I'm personally an AMD owner since it's cheap)
@@thebcwonder4850 For exactly the same reason they decided to focus on Datacenter CPUs and let Desktop be more or less a byproduct of that: That's where the really big bucks are. Intel is selling at break-even in Datacenter, but AMD is getting almost their entire profit from Datacenter. AMD full well knows that getting their GPGPU/AI software stack in order, and have a scalable compute architecture, is where the big money is. If they get that, then a couple of extra size dies for the gaming market is chump change in development money for another side hustle. In semiconductors scale is everything. It costs the same to develop an architecture and driver stack whether you sell one or one hundred million. That is why everyone wants to be big in this space. As soon as the fabs go brrr you can replicate to infinity, but the step up to that point is expensive as hell.
I bought an RX 6600 3 months ago to replace my GTX 980. I wanted to try AMD for the first time, and honestly I don't see any difference. In fact, the AMD control panel has a lot more features. I am super happy with my AMD GPU. For 1080p the RX 6600 is king. 100W, plus a Ryzen 5700X3D. Top.
@@mikymouse7525 There's a ~65% improvement; I don't consider that a side-grade, though personally I made the more substantial upgrade from a 970 to a 7600. Either way, it means newer games and higher fidelity become more accessible.
The funny thing is that even this video was more about Nvidia's market share being bigger than about the overall landscape of GPU market share. You kind of inadvertently proved your own point subconsciously. ;P
After owning 3 generations of Nvidia cards, I went with a 7800 XT this time. I think Nvidia is becoming the Apple of GPUs, where you have to pay for the name and the brand instead of getting the best bang for your buck. I definitely don't regret my purchase.
Putting together my first ever build tomorrow. Despite a potentially unlimited budget I went with the 7900xtx along with a 7800x3D. Went for the value play at the end. Don’t think I’ll be sacrificing much anyways 🤷🏻♂️
As SOON as AMD fixes their upscaling and ray tracing deficiencies I’m buying an AMD card, the rasterised performance is clearly excellent, but you couldn’t pay me to use FSR😅
I don't know, man… If you go to my electronics retailer, set a hard 500 dollar price cap and ignore every ripoff, you pretty much end up with the 67/7600(xt) and maybe the occasional 6800 on sale I don't see much NV presence there
That I personally don't care about. The part that is annoying is people online straight up making things up and trying to make AMD cards sound worse than they are. Some of the claims I've seen on shit hole websites like Reddit and I'm sitting here with my 7900XTX and wondering how they came to those conclusions other than blind fan boyism.
The 7900XTX I bought, which was a PowerColor, was the most unstable graphics card I've ever had and the only one I've ever refunded in my life. Insane coil whine, constant crashes, and terrible frame rates in comparison to Nvidia cards of similar price. It really sucks, because I wish AMD would get better. I really don't care what has to happen; I just want them to start kicking arse, whether they significantly reduce their prices or start incorporating the public to help them create better drivers like the Linux community does.
My 7900 XTX runs at 48°C max in Cyberpunk scaled to 4K with mods and ray tracing. On my Linux box I can break things without frame dips when explosions hit. It's a robust card when bought from the right manufacturers. Sapphire and XFX are solid makers.
@@hoyschelsilversteinberg4521 My first AMD card was an RX 480 and I was surprised how well it worked. I'd heard a lot of bad things about AMD but none of it was true. After that I bought a Vega 64 and now I'm on a 6900 XT. None of the cards had problems. I've been using AMD for 8 years; before that I was just an Nvidia fanboy, but they blew it with their low VRAM and I switched to AMD. I've also never had any problems with AMD drivers that I haven't had with Nvidia.
I still remember upgrading my GPU every two years and thinking the difference was insane. That ended with my leap from a 970 to a 1070. No modern cards have impressed me
The 3060 Ti is a marked increase for about 250 watts. The 4060 Ti is pretty much the same thing with 3rd-gen DLSS and 225-watt usage. I'm still kind of waiting for the next PCIe 4.0 card that offers more performance for the same price. I bought two 3060 Tis for an average of $550 CAD each at the end of the scalping. I could have gotten away with slightly cheaper a year or two later, but I would have been without a card. GTX 1080 (Ti, sometimes) performance with ray tracing.
@@roboman2444 The 970 had 4 gigs of GDDR5. My 1070 has 8, which, believe it or not, was intended for 1440p gaming or 1080p at 144 Hz. The latter of which it did really well for more than half a decade.
4:30 I like that little thing at the bottom where it explains a little more of what specifically you're talking about. I think I saw something like that in another video but I can't quite remember. Still, it's pretty nice to see, especially when there are some topics/points that go over my head because I don't know what the heck they are, even if you do your best to explain them haha
As an uninformed consumer, I feel the AMD naming and numbering scheme has historically played a large part in my purchases and in my general awareness (or lack thereof) of the AMD offerings. Every time I was shopping for GPUs I'd take one look at AMD and think, "I have no idea what any of this means, why can't they keep it simple like Nvidia." Yes I could have done my research and learn what the current AMD cards are and how they stack up, but I didn't feel like dealing with that headache.
I agree, and this is as an AMD adherent who has been purchasing AMD GPUs since they acquired ATI. For a long time it wasn't even "bigger number better", there were generational numbers where the "new generation" were worse than the "old generation" (HD 7000 vs HD 6000.) They're doing the same thing now with their APUs, calling them 8000s compared to the 7000 series CPUs. They may be trying to imply that their 7000 series CPUs should be paired with their 7000 series GPUs, but it's not working, and they need to do better to market their GPUs as an independent product line.
My 3070 Ti was a direct replacement for my 1070 Ti, which was a direct replacement for my AMD R9 290X 8GB. These graphics cards lasted and did what I needed. Some people turn over their GPUs more often. My 290X replaced my two ATI Radeon 6700 1GB cards. This was the problem for me: availability when I needed a GPU.
Another possible cause is due to the market share. If more people have Nvidia cards then more developers will use Nvidia specific features, like framegen or DLSS 3, then due to this more people will want Nvidia GPUs because they don’t want to miss out on these features, so that is another cycle too. I know there are games that are quite heavily AMD biased but it isn’t as common. Add to that the fact that AMDs features may be available on older cards or are able to be used on Nvidia’s cards and that further increases peoples want for Nvidia cards, if they choose AMD then they lose out on Nvidia’s features, if they choose Nvidia then they get both Nvidia’s and AMD’s features to some extent.
Part of the problem is that, because of its position, buyers have allowed nVidia to dictate what the must-have new feature is. Nobody cared about real-time ray tracing until nVidia made it the new hotness. Developers played along (probably with that combination of incentives and pressure nVidia puts on all its partners) and started supporting it. Then AMD did what they and basically every reviewer said they needed to do and added ray tracing to their products. All that did was validate what nVidia was doing and lock them into an endless, pointless catch-up cycle, when they can NEVER catch up with the lead nVidia has. Then they did it again with DLSS and FSR, and again with frame generation. They have to know this locks them into the position of second fiddle. If they introduced a new feature that nVidia didn't have that actually improved gameplay, people would notice.
That is literally not true because the Vega 64 was faster than the GTX 1080 Ti in ray tracing for half the price (look it up), and absolutely no one cared. Nvidia dictates what the next feature is because every time AMD did it, they got blasted by everyone for being "anticompetitive". Remember, it's only ok when Nvidia does it!
Why do some people assume NOBODY cares about RT? Can you provide research about it? No, you just guess by reading some comments. You're talking nonsense. It's more like Nvidia creates something by investing money and AMD tries to copy it, but much worse. You're talking like all software solutions are already perfect and invented and no one needs anything anymore. If people like AMD were in charge, there would be no innovations and you would be stuck with all that old AMD shit forever. RT is another step for graphics; a decade ago it was just impossible for an average card to handle, but Nvidia is pouring money into it to make it possible, because RT is REAL-time simulation of shadows and reflections, not some old shitty gimmick that was just an illusion before. It's the next step to better graphics, whether you want it or not. I am tired of bots who say "I don't care about RT and DLSS and just want pure rasterization." Glad the market stats still show that not all people are empty-headed; some see things more broadly than the average pure-raster boy.
@@milesfarber Where did you see the Vega 64 beating the 1080 Ti in RT? In the past, Crytek released their Neon Noir demo that uses software RT, and in that test the GTX 1070 ended up being faster than the Vega 56.
The idea that Nvidia "controlled" the demand for these features is already shaky with raytracing, but you're kidding yourself if you think their literal freefps.dll wasn't gonna rock the boat for anyone that didn't have it. Sometimes the train gets lobbied out for the car, sometimes the horse is overdue for retirement.
I always preferred AMD's more compact selection of GPUs vs all the subtypes of Nvidia's; it made me feel I didn't need a PhD to understand the difference between their own products.
Why, it's SO EASY! Just know that the 4070 is worse than the 4070 Ti Super, and the 4070 Super and 4070 Ti are somewhere in between, all within $200 of each other. How could that POSSIBLY confuse any consumer? /s
This really hit home to me when looking at gtx460s back in the day. The 1gb (and 2gb) variant had more render output units and significantly more bandwidth than the 768mb variant, even though they were called the exact same and you could easily confuse the two when purchasing. To make matters worse, Nvidia later released another 460, with 1gb of ram, but the same bandwidth and cut-down rop count as the 768mb version. At least for that one they clocked it higher to try and make up the performance difference. It would've been very easy for Nvidia to name the lower specced 460 something like the "gtx 455". They already did that with the 465, which was a cut-down 470 with less ram. Nvidia did this behavior again recently with the 1060. Their naming scheme is confusing and unintuitive. At least with AMD, they seem to be mostly consistent. (bigger number = faster, with an "xtx" addition for the highest end).
This is weird to me because when I got my current GPU (RX 6750 XT) last month, I didn't even bother looking at Nvidia options before I got it, because I knew they'd be overpriced. As someone who is only interested in gaming and sometimes recording, Nvidia is just a bad choice, because for a higher price I was gonna get less VRAM with the only benefit being that the gimmicks I won't really use would run slightly better. Yeah, I will turn ray tracing on for 5 minutes to check it out, but I can already do that with my AMD GPU anyway, and the VRAM amount is far more important than a gimmick. All the Nvidia cards I could get for the price of the 6750 XT had only 8GB.
So true. I consider a realistic price for a GPU to be around the cost of current-gen consoles. People who spend $1500-2500 on a GPU don't realize how niche a group they are. So for me, when I bought my 6900 XT for $500, nothing else really touched it. I don't need to use FSR, but if I do, I find it good enough, and all games support it even if you need to run a patch; it's open source after all. I still feel like RT is an infant technology that isn't ready for the mainstream consumer, because I'm told you basically need a 4080 or higher to comfortably utilize it fully. It seems to be forced onto developers, who in turn overuse the tech so that people can "see" the difference, and then all the games that use it look overly shiny and reflective/wet, or suffer from light glare that doesn't actually add to the game. I can run it fine in Cyberpunk, but I turn it off. It will be a great tech in one or two more generations.
@@michaelmabe8214 Yeah, barely anyone is buying those higher-end cards; even in the Steam hardware survey the top entries are always in the mid to low price range.
On most PC channels, AMD is the GPU that gets used in builds. Maybe a more even use of brands in those builds would be a lot better. I've used AMD for as long as they've existed. The new cards and drivers are great.
I work at an ISP, we unfortunately buy a lot of Dell computers and laptops. The sole reason we do this is for the reliable availability of Windows PE drivers. As much as we hate Dell, they haven't ever failed us in having PE drivers for all models that we purchase. It makes imaging a new device or re-imaging an existing device easier.
Yeah we used Dell in a large office and their corporate support was real good. It makes it easier to support when most of the computers are the same and behave alike.
@@Derkiboi Support is decent but there's still plenty of issues. Proprietary e-waste is the worst part though. There is honestly a reasonable amount of good too
@@complete-mayhem-x64 Oh yeah, and it's fair to say there's a difference between 'ethical' and 'good'. It's useful for the bottom line and Dell provides that. The support outweighs the proprietary parts, and they probably don't care about e-waste, unfortunately
AMD and Intel need to massively undercut Nvidia. These generation on generation 25% price hikes can only last so long. The competition needs to grow some masculine balls, and go for it!
Intel is already selling at a fraction of Nvidia's price when counted per transistor, because of much lower performance per die area. AMD's and Intel's main problem is that Nvidia is technically competent and doesn't slow down.
The consumers need to give the competition a chance; gamers are married to Nvidia like iPhone fans are to Apple. AMD tried to compete on price-performance and it did not work, so now they are only following Nvidia and letting Nvidia choose the price. At this point Nvidia has a real monopoly and only lets AMD exist so it doesn't look like one.
NVIDIA's long running market share has created an ecosystem of relative stability that AMD doesn't fully enjoy. I went AMD on my previous PC and Laptop and suffered many of the classic driver and applications issues. Back to Intel and NVIDIA for the most recent desktop and those issues are gone. I felt the urge to prop AMD up to help create balance, but that's also expensive...
Curiously, I got the worst problems with my RX 580; for like a year I installed and uninstalled drivers. I feel like AMD has bigger driver issues, but it's not just them. It seems the whole ecosystem is unstable, not just AMD. I had problems with Nvidia on my laptop's 970M with an Intel CPU, but overall less instability. There are too many new technologies being thrown at us; developers can't keep up, and normal people even less. Linux for me is so unstable now that I can't even find the time to bother making it work anymore. But Windows instability since Windows 10's forced updates is a nightmare, unless maybe you are in IT dealing with that every day, in which case it becomes easier. When you are on a time constraint, you can't afford to set up 5 different Windows installs with 5 different bootloaders. At some point last year I had like 3 disk drives that I put in a corner hoping I would solve the issue later. I still haven't had time to look at my laptop drive that won't boot anymore, and I don't want to just clean install. So to me the whole ecosystem has been chaotic since Windows went rolling-release. The IT world has grown a lot more complex than 10 years ago, and it wasn't that easy to begin with. To partially resolve my problem, I now use AMD's professional driver on my RX 580, and I've had fewer problems since. That driver is not meant for games, but it works better; at least it doesn't crash my computer.
My price threshold for a GPU is 500€ max, whether it's new or used. Same as I don't use more than 150€ for a new smartphone. That's how I value those particular items. So in my consumer eyes, it's pretty shitty that the prices keep swelling and the performance stays the same (or rather you get less for the same/more money).
What's UP with the pricing of phones? I mean, GPUs have tripled in price, which sucks, but my old phone I bought for $150 was a premium feature phone back when - nowadays, the premium models can cost ten times as much. How can they charge $1500 for a *PHONE*, and who will *PAY* that?
@@Panscrank999 The GT710 can play FEAR at high framerates. I remember when FEAR was a cutting edge game that hardly anyone could run. Therefore, the GT710 is more than enough GPU for anyone's needs.
Switched to a 5600X and 6800XT during my last build (right before 7000-series was announced), also switched to Linux since the AMD drivers work pretty much flawlessly there. Never been happier with a decision and build, I will for sure keep buying AMD after this switch and hopefully see them keep on releasing good parts for good value! Also very sad we lost EVGA, in an alternate universe I would purchase EVGA AMD GPUs any day of the week.
Linux is something we may be looking at more seriously with Windows making some of its recent changes! The driver stack there is totally unknown to me. Great to know that it seems somewhat stable. Any experience with NVIDIA drivers in Linux?
EVGA did not have the expertise to work on anything but nVidia GPUs, from what I understood. AMD themselves do have Sapphire, PowerColor, and XFX, which are arguably the same thing to AMD as EVGA was to nVidia: enthusiast brand partners who rose out of enthusiasts tinkering with ATi GPUs early on and became professional partners later. They have decades of experience; it's hard to switch overnight to an entirely unfamiliar GPU and architecture.
@@GamersNexus I ran a 1660 Ti in Linux before switching to AMD with the 6000 series. Currently on a 6700 non-XT (they were cheaper than the 6600 XT at the time). The 1660 Ti ran fine, no real issues, as I use a stable distribution (Mint), not something cutting-edge like Arch (I bleed, so I don't like cutting edges). The biggest problem was all those nifty settings available in Windows? Yeah, no. The Linux application/GUI has been more or less the same since I first installed it, back when I got a 210 in 2009 for an HDMI out. It's beyond primitive, and there is no way for Linux coders to fix it, because nVidia releases their Linux drivers only as a binary, whereas AMD driver support is integrated into the kernel. In simple terms, for nVidia, I had to install drivers (start menu -> additional drivers -> select + install), whereas for AMD, I didn't have to do anything, it "just worked". Nowadays, Xorg is being replaced by Wayland in many distributions, and nVidia isn't exactly fixing all the problems this causes, now that Linux is moving away from the X11 window system protocol from 1987. Glitches happen, and the only thing Linux developers (much less gamers) can do is file bug reports with nVidia.
@@GamersNexus Not the OP here, but in my limited experience with nvidia cards on Linux, you either get the card working with 1) the default Nouveau drivers (open source, sort of reverse-engineered drivers with some closed source components), 2) the nvidia proprietary closed source drivers (which only target specific kernels and distributions), or 3) you're in for a trip. Drivers 1) and 2) usually lag behind the Windows drivers in both features and bug fixes. If you want to try Linux on one of your PCs, use a LiveCD/USB first to test support without installing anything. I suggest Ubuntu, Mint, or any other Ubuntu derivative because they will install the proprietary graphics drivers, if needed.
That's not surprising. Jensen has been taking Nvidia to the moon for decades now, with gaming, blockchain, and now AI stuff. They all use the graphics card heavily, and Nvidia is pretty much a monopoly in this domain.
@@dead-claudia Nvidia was sitting idle for a long time, but the moment AMD started being a threat they immediately got to work, as opposed to Intel, which just laughed at "what poor little AMD can do"
28:06 everyone points towards AMD’s driver issues but how about Nvidia’s high-end GPU failures every generation? (The top end card always has an issue, hardware failure or “unoptimized” drivers at first.)
Because it doesn't affect most of the users; for the regular buyer it's obviously more important that the drivers work fine than whatever problem a GPU they won't buy has.
Never had a problem with an NVIDIA card since the 970. Told my friend to buy a 7700; his screen turns off mid-game sometimes. Rarely, but it does. My iGPU in a laptop (some sort of Ryzen 8XXX; it's stashed at work) is bugged and is at 100% utilization all the time, even if I have no programs running. Sometimes it even makes my cursor stutter. Even after a clean Windows install. So here's that.
@@utarefson9 I've had years on AMD, no driver issues for me. Though I did have 2 friends on Sea of Thieves with Nvidia cards who kept crashing out of the game. It's all BS marketing from the king of marketing.
Going to add that I'm another AMD user for the past ~3 years now and haven't seen these driver issues people are talking about. Actually had more crashes from my 1070. I will admit that I don't buy bleeding edge, though, and my other computer with a 2070S has been relatively fine. My point is that I can't tell a difference.
The NVIDIA vs AMD market share segment is so important to keep in mind. I tell people all the time who have this naive and blind perception of how capitalism works, where they think that superior products sell and inferior ones don't, and that sales numbers directly and accurately reflect consumer preference because "vote with your wallet". No. If you are simply more abundant in the market, or if the market provides virtually nothing except one type of product, of course that product will inevitably be consumed because there's basically nothing else TO consume. It's inescapable. Do this for long enough and consumers lose sight of the mere concept that there ARE other kinds of products that might actually be better. Even if better alternatives exist, they can't compete on a cost basis, because less abundance means fewer economies of scale, and therefore it's more niche and probably more expensive as a result. On top of this, even if a product is objectively superior, there's simply a comfort factor. Everyone else uses the abundant thing simply BECAUSE it is abundant and understood. There's a reason Google and iPhones dominate their markets. They're not objectively the best services or products for their market, but why even try alternatives when the big names work well enough and people will genuinely look at you funny simply for going against the flow?
Yeah, and the fact this happened with cars for example is really great because we have cities built around that and rubber is great for your lungs I hear
Comic book fans: "First time?" This is what is known as the "flood the shelves" strategy, and it's what Marvel Comics has been doing for decades. They have printed so much low-quality stuff they knew nobody wanted that it pushed many books from indie/other publishers out, by virtue of Marvel being the biggest and most popular brand.
As a non-movie guy, do you think that's why they pushed out so many Marvel movies that even a die-hard moviegoer would have a hard time watching them all? I've heard the same criticism of those: that they are junk and just taking up space.
Some of us actually do put our money where our mouth is and buy AMD cards after swearing off Nvidia. That's why I have a 7900xtx. I mean I probably should have just bought the 4090, but I said I wouldn't and I didn't.
I didn't swear off Nvidia, because I rarely ever used them. Most of my cards have been AMD going back to the 4650 1GB, which was my first real card. I dabbled in Nvidia during the GTX 460 era and it was good, but the low performance jumps and price hikes put me off after that. I tried Nvidia again more recently, but their settings software was so bad compared to Adrenalin, and I also encountered driver bugs, so I was like "I guess Nvidia isn't more stable" and went back to AMD, since price per frame is what I care about and it's easier to find deals on AMD cards.
At the end of 2023 I bought a Radeon RX 7900 XTX from XFX to support the underdog and because I was tired of Nvidia never giving you enough VRAM (I was often going over in games with my measly 8 GB on the RTX 3070 so it caused crashing or stuttering)... Now I have 24 GB lol and I crank everything to max. I've had a GREAT experience with my Radeon card - smooth FPS and performance in most games so I don't regret it. Plus Amazon / XFX gave me a payment plan so I paid it in 5 equal payments with zero interest... Meanwhile, all Nvidia cards on Amazon (in my country) were WAY more expensive - like the RTX 4080 (smth like $300 more). And to be honest I was surprised to find that I don't miss DLSS or high end raytracing performance as much as I thought I would. New Radeon cards are actually pretty good at raytracing anyway. So now I bought a 2nd Radeon GPU for my other PC - the RX 6650XT (also got an awesome deal) to replace my Nvidia GTX 1080. Cheers Ps. I'm pretty sure the name Nvidia comes from the expression "To be GREEN with envy" (of someone)
I feel like Vega and Radeon VII should have been grouped together, as both had Vega GPUs on the GCN 5.0 and 5.1 architectures. On the other hand, Radeon 500 had Polaris GPUs on the GCN 4.0 architecture, and Radeon 5000 had Navi GPUs on the RDNA 1.0 architecture.
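The regrouping proposed above, written out as a minimal sketch - the architecture labels come from the comment itself, not from the video's charts, so treat them as the commenter's claims:

```python
# Architecture labels as stated in the comment above (not verified
# against the video's own chart groupings).
from collections import defaultdict

family_to_arch = {
    "Radeon 500 (Polaris)": "GCN 4.0",
    "Radeon Vega (Vega)":   "GCN 5.0",
    "Radeon VII (Vega)":    "GCN 5.1",
    "Radeon 5000 (Navi)":   "RDNA 1.0",
}

# Group by major architecture line: the two Vega entries collapse together.
groups = defaultdict(list)
for family, arch in family_to_arch.items():
    key = "GCN 5.x (Vega)" if arch.startswith("GCN 5") else arch
    groups[key].append(family)

print(dict(groups))
```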
Tbh, great video - love the channel and all the performance-based testing; that really is how I choose my hardware buys! This market data video idea is now one of my favorites out of your library, though. I've always suspected the exact things you found but could never figure out the best approach to look into it, so I gotta say: I agree there's no perfect way to do this, but you found about the best damn approach, so kudos.
I've purchased Nvidia GPUs BECAUSE of CUDA. OpenCL, being what it is, just doesn't have the support (nor the performance) that CUDA-enabled applications have. I remember back in like 2009, when CUDA was just starting to become a thing, that there was already a lot of talk about porting applications that traditionally ran on CPUs over to Nvidia GPUs. I think that AMD really missed the boat on that one, and so far, they still haven't been able to catch up. So it's going to take them a REALLY long time JUST to play the catch-up game, let alone beat Nvidia, for GPU-accelerated workloads that AREN'T AI-specific. The software support for OpenCL just isn't there. And THAT, as a result, has a HUGE influence on what I end up buying.
I agree, and I can add that it's rather common for the gaming market to underestimate the power of emergent technologies. I also worked on research using GPU acceleration in those years, and I remember the challenges of doing scientific simulations using shaders and texture tricks. It was already powerful, and CUDA took us to another level. But to my "hardcore gamer" friends it was pointless, because it wouldn't bring more performance to games. More recently, the same consumers were against AI and raytracing, for example, also seen as pointless. Gamers don't want innovation, just more performance in existing games, and AMD delivers this. They have amazing products, but with a short-term strategy of lowering the price of existing tech o_õ
@@SauloMansur Agreed. AMD needs to up their OpenCL game BY A SIGNIFICANT margin, and they have a VERY steep hill to climb: not only will they have to beat Nvidia's CUDA performance, but also demonstrate that their stack is as capable (or more), and as stable and robust, and then some. I actually use my 3090s less for gaming now and more for locally hosted AI, since deployment via CUDA is better understood (from an implementation and deployment perspective), and I use an RTX A2000 6 GB card for GPU-accelerated CFD. There are a LOT of applications that don't support lower-end AMD GPUs (I think the lowest-end AMD part usually listed on the various software ISVs' QVLs is the Instinct MI250). So whilst I CAN game with a 3090 (of course), that's not ALL I can use it for. I can't necessarily say the same for, say, the AMD 7900 XTX.
@@ewenchan1239 Funny, I'm endlessly frustrated with my Nvidia card, because they offer F-all support for ARM architectures. In general, I've felt more and more reason to get an AMD card since switching to Linux, but I don't really want to spend that money here and now.
@@contentsdiffer5958 Three things: 1) I think that it *really* depends on what it is that you're doing or trying to do. 2) You would think that Nvidia would have relatively decent support for arm64, given that they ship systems with ARM processors in them, but again, it depends on what you're doing, specifically. 3) If you're able to switch to AMD, that suggests that whatever you're doing isn't tied to CUDA and CAN run on an AMD card, whereas a lot of what I do either can't, or will run on the AMD Instinct MI250 or MI350 ONLY. Compare and contrast that with the more diverse options available from the Nvidia camp. (Don't get me wrong, I am not a huge fan of Nvidia either, but in terms of getting what I want/need done, Nvidia has more options available to me than AMD that have been validated and certified by the ISV.)
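A minimal sketch of the backend-selection pattern behind the lock-in this thread describes, assuming PyTorch as the example framework (my illustration, not anything from the thread):

```python
# Minimal sketch of how many GPU-accelerated apps pick a compute backend.
# PyTorch here is an assumed example; the pattern is what matters.
import torch

def pick_device() -> torch.device:
    # CUDA is the path virtually every framework checks first,
    # which is exactly the ecosystem advantage described above.
    if torch.cuda.is_available():
        return torch.device("cuda")
    # No CUDA -> fall back to CPU. A dedicated OpenCL/ROCm branch is
    # often simply absent, so a fast AMD card ends up on the slow path.
    return torch.device("cpu")

x = torch.randn(1024, 1024, device=pick_device())
print((x @ x).device)  # "cuda:0" on an Nvidia card, "cpu" otherwise
```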
Nvidia using less VRAM lets them produce more cards, BUT it also pushes sales to AMD. This is going to be interesting, because it'll probably lead to cost cuts and make Nvidia options more competitive. I bought a 4070 Super for $600 and returned it, because my 1080 has over 50% of its rasterized performance, while the RX 6800 is $359.99 and has 80% of a 4070 Super's rasterized performance. Their pricing makes 0 sense.
@@Derkiboi The 6700 XT consumes about the same power, but the 6800 is 25% faster and has 4 GB more VRAM, while costing only a little more than the RX 6700 XT for the entire generation. The 6700 XT fell in price only after the 7000 series came out, while the RX 6800 didn't get enough production to drop as much; it's still the same price/performance as the RX 6700 XT currently, but during their generation the RX 6800 was the better price/performance.
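Spelling out the perf-per-dollar arithmetic from this thread, as a minimal sketch using the commenter's quoted prices and rough raster estimates (not measured benchmarks):

```python
# Perf-per-dollar check using the numbers quoted in this thread.
# Relative raster performance is normalized to the 4070 Super = 1.0;
# these are the commenter's rough figures, not benchmark data.
cards = {
    "RTX 4070 Super": (600.00, 1.00),
    "RX 6800":        (359.99, 0.80),  # "80% of a 4070 Super"
}

for name, (price, rel_perf) in cards.items():
    # Dollars per unit of relative performance: lower is better.
    print(f"{name}: ${price / rel_perf:.0f} per performance unit")

# -> RTX 4070 Super: ~$600/unit vs RX 6800: ~$450/unit, i.e. roughly
#    33% more money per unit of raster performance for the Nvidia card.
```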
Yes, but you don't get the nice UI like in Windows. AMD doesn't have any first-party UI applications on Linux. They push the drivers into the kernel before the cards come out and patch things there as necessary. If you want the cool stuff like screen recording or one-button overclocking, you either hope someone made something and published it online, or you do without. There is usually some third-party application, but it would be great to get feature parity. Features like RT and FSR do show up as options in game settings, though, if the game has them. I don't know firsthand if Nvidia has UI applications for their cards on Linux, since I don't have an Nvidia card; I don't recall ever seeing anything from them either, though. You install Nvidia's drivers separately (only AMD and Intel put their drivers into Linux directly).
Linux prefers open source drivers, and AMD's drivers are much more open. Years ago Nvidia was still better on Linux, but Valve put a ton of work (hired driver programmers) into making AMD's drivers good for the Steam Deck. Recently, though, Nvidia has started opening up their drivers on Linux more too.
I didn't have any problems installing Linux drivers for an Nvidia card. I was expecting all kinds of problems from what people say online about it, but it was fairly straightforward - no more complicated than it is on Windows, really. Performance isn't something I can attest to, since I don't have an AMD card or a Linux system with ReBAR for my Intel card, but I did try gaming on it and it worked, so make of that what you will.
Interesting video. Back in the 90s I was Intel/Nvidia biased, and back then a mid-spec PC would go up to $3-5K. It wasn't until 2004, in high school, that my good friend Mourice showed me the value of AMD chips: his custom AMD Athlon dual-core PC, under $1000, decimated any game he threw at it with an AGP Radeon card. Back then I had an old Pentium III with a PCI GeForce MX 4000, and it was a night-and-day difference. Since then, all my PC builds have had an AMD CPU/GPU and come in below $1200. That all changed after the crypto madness, when all mid-range value GPUs went unobtainium. Heck, I still remember when I set up my first CrossFire pair of Radeon 290s in the low $200s each - those were the good days. My last 3 cards were the Vega 64 for $500 bucks, the Radeon 5700 XT for $450, and lastly my Radeon 6800 XT for $520. So do I need to spend over $2K on a GPU for raytracing at the cost of FPS, or laggy fake frame generators?
I bought my 6800 XT because of price and price-to-performance. Simple as that. It cost $200 less than the comparable Nvidia card. Simple fact is, I'd be buying Nvidia if they did that. So I do think Nvidia needs to be humbled, as it were. I think consumers need to stand up and push back on premium pricing for mid-tier and entry-level cards. If AMD isn't going to step up, and we know Intel is incapable of stepping up at this time, it's we as consumers who have to push back.
7:00 "Simple repitition is enough..." Head-on, apply directly to the forehead. Head-on, apply directly to the forehead. Head-on, apply directly to the forehead. Head-on, apply directly to the forehead.
Based on *Gamers Nexus, Hardware Unboxed and Digital Foundry's* data, there was a 0% chance I was going to buy AMD. I pretty accurately predicted the 7000 series' current market share based on that data. I mean, if you have to point to raster performance to highlight AMD's value, I think you've found the problem... *Consumers have spoken.*
I literally saw a store employee sell a GTX 1650 to a girl over an RX 6600 despite them costing the same. I didn't want to see her get scammed, so I told her the RX 6600 is roughly twice as fast (literally spent 10 minutes showing her why it's better). But she still bought the GTX 1650.
I only bought EVGA GPUs, my last being their 1080 Ti FTW3. The 20 series was too expensive for the performance uplift, and the 30 series was made of "unobtanium". Then the bombshell: EVGA giving Nvidia the middle finger. The 40 series was still priced higher than I was willing to pay, which left AMD. I bought a 6900 XT and it was a nice upgrade over my 1080 Ti. When the 7000 series went on sale (briefly) I bought a PowerColor HellHound 7900 XTX for $900. Again a nice uplift, though smaller over the 6900 XT than the 6900 XT was over the 1080 Ti. I recently bought a HellHound 7900 GRE for a friend (no sales tax in my state) and used it for a week. The GRE has a smaller cooler, so it tends to ramp up and down, but the fans themselves are fairly quiet. The XTX's extra $400 is a fair chunk for a bit more performance and a noticeably quieter card. While I would "trust" an Nvidia card to perform well, I am unwilling to pay the Nvidia "name tax". Yes, the 4080's $1000 MSRP puts pressure on the XTX, but $2000+ for the 4090 is ABSURD. My grandkids have my 1080 Ti and I gifted a cousin the 6900 XT, and both are quite pleased. Obviously, consumers do NOT do enough research, if any, before buying "stuff" - which is why your channel and a few others are needed.