Nvidia has proprietary CUDA (hardware/software), and all the major ML frameworks like TensorFlow, PyTorch, etc. are built for it. So even if the 7900 XTX performed like a 4090 Ti, the data center/ML/AI people couldn't use AMD cards. PS: AMD has ROCm as a CUDA alternative, but it has very poor support compared to CUDA.
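To make that concrete, here's a minimal sketch (assuming a standard PyTorch install; the toy model and sizes are made up for illustration) of how CUDA-centric the default GPU path is. Even the device string is named after Nvidia's stack:

```python
import torch

# PyTorch's canonical GPU path is the CUDA one; AMD's ROCm support
# piggybacks on this same "cuda" namespace (via HIP) rather than
# having first-class naming of its own.
device = "cuda" if torch.cuda.is_available() else "cpu"

model = torch.nn.Linear(128, 10).to(device)   # toy model, just for illustration
batch = torch.randn(32, 128, device=device)
print(model(batch).shape)  # torch.Size([32, 10])
```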
This is 100% it. AMD simply cannot compete for current AI/ML workloads. Everything is built for Nvidia because they were the only viable choice at the time those things were developed.
I used NVIDIA GPUs up until 2020, then switched over to AMD for the first time (6900 XT) and earlier this year to a 7900 XTX. I have had zero (literally 0) crashes since I've been on AMD GPUs. I also used an Intel CPU up until I upgraded to the 7900 XTX, so I have a full AMD system now. It's crazy how much more stable AMD systems are, knowing I had about one crash per month with my NVIDIA GPU. Also, overclocking is much easier with AMD's built-in software :D I hope it stays stable lol
Dude, c'mon. You can support AMD all you want, and kudos to you for fighting against NVIDIA's shitty business practices, but to pretend that on average AMD is more stable than NVIDIA is fucking insane. Get a grip...
@@PPedroFernandes I can promise you that I did NOT go for AMD before, due to the reputation of their worse software (and also less powerful GPUs), but holy shit, maybe I got lucky with these upgrades, but my system has not crashed a single time & runs smooth AF (got a Ryzen 7800X3D). I don't have any brand preference, I only go for what is best, and AMD is overall best atm. Who knows, it might be a full Intel-based system in 5 years?
Anyone notice that part of Apple's Game Porting Toolkit 2.0 is the introduction of ray tracing and AVX2 support? That's what you need to port games from NVIDIA hardware to Apple's M series.
Linus's idea of renting out nVidia's hardware: nVidia already does that for the highest-end enterprise hardware. The other way to interpret what he said, with the hardware staying on nVidia's side and the data streamed back and forth, is similar to AWS. But Amazon is a huge name, and nVidia would most likely rather sell to Amazon than compete with Amazon, since they saw what happened to Intel's x86 when Intel publicly pondered doing exactly that: Amazon moved away from buying anything x86. So keep the big business partner who both sells to you and uses your products. Since nVidia seems to understand long-term profit models rather than short-term profit, they don't want to compete with what is most likely one of the biggest buyers of their product, Amazon, closely followed by OpenAI and Microsoft, all three using nVidia GPUs. nVidia gets the profit of just providing the GPU chip without the major cost of running it 24/7/365.

Meanwhile, until the data center market goes back to normal, they're already promoting their GPU cloud services to gamers at a reasonable price of 10 or 20 USD a month, both tiers with limitations. Both are worse than running the same rig locally but much cheaper, unless you count the internet plan needed to get a stable 1080p@60fps for days on end. It still might be cheaper for the consumer than buying a 4080 at current retail prices.

The other market nVidia is trying to get into is consoles, and depending on who you believe, the 10th-gen consoles will have their chips. AMD and nVidia almost always compete for that, so I am unsure, but both want to be in the next gen. Another is cars. nVidia is already in some cars but not all, mostly due to the amount of power their chip needs, so most EVs do not use them. If nVidia makes an even lower-power chip without noticeably worse quality, they might get into the "media center" part of EVs. Same with the screen between the two front seats: it also needs graphics processing, though not a lot; really, an optimized RAMDAC is most likely enough, since 256 colors is fine for a display that only shows at-a-glance information for the driver and shouldn't be a distraction, so most of a modern GPU can just be cut out, which lowers power draw. Another is tablets, where nVidia competes with Qualcomm: they mostly win on tablets designed for drawing, while on tablets made to last a long time, the lower-power Qualcomm chips seem to do better. And so on with whatever market might be the next to take off like a rocket; nVidia has their finger in many pies.
The fact that Luke immediately asks "And you have no more driver issues or anything?" tells you all you need to know about why NVIDIA has this big a market share despite AMD having good products out atm. It's just more stable and less annoying to use. It's the safe bet for a consumer, even if it's expensive. I put NVIDIA cards in every system a friend asks me to build for him, just because it's less likely that I'll have to constantly troubleshoot something for the rest of the system's lifetime. And I'm sure I'm not the only IT friend who does this.
For a consumer, telling Nvidia GPUs apart is effortless compared to AMD's lineup. If there's nothing else against AMD, the naming is, and I think it hurts the brand. The brand itself feels distant to me and shows no effort to come closer, and Intel is falling into the same trap. Simplicity is not unimportant; the same issue persists with PS5 vs. Xbox-whatever. Where Nvidia might lose is price, and I'm hoping for that.
Compared to 2 years ago, the tech is 3-4x better, and there are thousands of papers released each day pushing it forward. You may not like it, but it's improving almost daily.
@@FusionC6 Sure, but what is going to be done with it? What more is it going to do? Even when asked, OpenAI execs, Google execs, and MSFT execs can't seem to say what they'll be selling you as a product, only that 'it's improving'. When asked about hallucinations, Sundar Pichai couldn't say anything other than 'I think that makes it better, that the AI is being creative.'
@@WhiteG60 What's it going to do? Literally everything. It'll take several years, if not a decade or more, but very nearly every job that humans currently do will be taken over by some sort of AI-powered machine. As for the hallucinations, the dude is right: that's actually a good thing. It means the AI is capable of cobbling together its own ideas from existing information. That's literally what sentience is. It's still light years away from anything resembling human-level sentience, but progress is progress. For applications where absolute accuracy is important, like, say, an AI doctor or lawyer, there will be models that make no mistakes and have no hallucinations. We're not there yet, but it will happen eventually. For everything else, it really doesn't matter. If the AI burger machine at a future McDonald's adds pickles to an order that requested no pickles once in a blue moon, big whoop. It's still doing better than the average McDonald's employee currently.
I went with the 40 series over AMD because I find AMD's naming schemes super confusing, and because AMD doesn't support DLSS. The unfortunate fact is that DLSS is still better than FSR, and more games support it.
We all know it's not: both those who missed the investment boat and those who didn't. It's not an ecosystem maker, but since ecosystems need Nvidia, it's definitely worth a great part of those trillions. It will take something truly special to hold the whole boat together.
The 4090 and 4080 are just objectively better than their counterparts, so most people don't mind shelling out a couple hundred extra dollars. If you're in the market for high end, price really doesn't matter, and there's really no reason to buy AMD.
@@infinite683 If you want a 4080, just get a 7900 XTX for a lot less; nobody actually uses ray tracing. Saying money doesn't matter is a ridiculous argument.
OK, maybe on this video you'll actually see this comment: you should send Luke to season 2 of Inside from the @Sidemen, if he's down and if they allow it.
@@SimonBauer7 Their software stack in comparison is dog shit. Their use case outside gaming is also dog shit in comparison. If there weren't so few options, you'd be singing a different tune.
That 88% is, in a sense, the "average consumer": the people who only hear about Nvidia because of AI and its long-standing lead over the market. Plus, with 200 variants of the 4000 series, they're also flooding the market. And eyes are everything.
I also think a major part of it is laptops. I'm sorry, but there are, like, 4 AMD-GPU-based laptops available, and a lot of people just don't play on desktop, for one reason or another.
No it's not at all lol, NVIDIA makes so many other products, including processors in military equipment, laptops, and even consoles. I bet GPUs aren't even close to their main revenue stream.
The only thing is that nobody is making money from AI except NVIDIA, so it's going to pop for sure. I'm not saying AI is useless or anything, but everyone and their grandmother is buying up hardware to do this, and only like 1% of them are going to succeed. When the others get bored of burning money, there is going to be a massive overabundance of hardware, so yes, in the mid term NVIDIA's stock price is going to crash hardcore. Long term, I see this as a bubble, but that doesn't mean I think their stock will just equalize around where it was before the AI boom: after the surplus hardware dries up and NVIDIA launches a next-generation part that massively increases AI performance, they'll still have a product they can sell for a lot of money, so long term they will be better off for being ahead in AI. Very long term, though, I don't know that NVIDIA stays in the lead.
For TSMC, I guess it's an issue of the whole "(when) will China invade Taiwan" debacle, but as for ASML, I really don't know why it's such an underdog 😅
Yep, ASML's growth vs. Nvidia's has been driving me crazy, since I keep thinking that ASML making the tools Nvidia relies on is the same idea as Nvidia being relied on for GPUs and AI.
Yeah, it's very interesting considering ASML has been the most important company in the world since its launch. Without ASML there is basically no digital world, since the making of chips starts with ASML's EUV machines. Although the relationship between TSMC and ASML is so strong that one could argue they share the same number 1 spot as the most important companies in the world.
NVidia is overvalued. They are in a perfect moment: AI suddenly exploded, and having the CUDA monopoly, they were the only ones offering reliability, a familiar framework, and performance to developers. The rest of the industry was caught off guard, but it's only a matter of time before they catch up. And I am saying this as an NVidia user (I need it to do machine learning research).
Nvidia doesn't even have competition. No one is "catching up". Given how long AMD has been in graphics, all those who have left the market, and Intel coming in with products so cheap, Nvidia still has damn near 90% of the market. You're just typing nonsense to put Nvidia down with no merit.
You overestimate the importance of the GPU market for graphics/gaming. AMD offers better cards than NVidia in terms of performance per dollar. I am no gamer, but I honestly wonder why anyone would prefer NVidia over AMD for gaming... I might be missing something. Over the past years, the reason NVidia was ahead in the market has had less and less to do with gaming/graphics and more to do with CUDA being used for ML/AI. Now ML/AI has exploded, and the vast majority of ML/AI software is built around CUDA. The math is simple.
Also, this isn't to say that NVidia doesn't deserve to be where they are, or that they won't hold on to a huge share of the market. But with time, other companies will be able to compete for a fairly large share.
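On the "built around CUDA" point, here's a minimal sketch of how deep the CUDA naming goes in a typical stack (assuming a PyTorch install; it also runs harmlessly on a machine with no GPU):

```python
import torch

# Even the performance-critical layers are reached through CUDA-named
# APIs: cuDNN for the kernels, torch.cuda for device queries.
print("cuDNN available:", torch.backends.cudnn.is_available())
for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    print(f"GPU {i}: {props.name}, {props.total_memory // 2**20} MiB")
```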
@@hyperplano Fine then, go look at the Steam hardware survey. Nvidia is just better: they have a better software stack, even down to little things like Nvidia Broadcast vs AMD's voice tools. Frame-generation tech easily goes to Nvidia, with DLSS over FSR. And even if that flips, AMD's FSR is open source, so an Nvidia card can use it, but an AMD card can't use DLSS. Stability and graphics issues have been more prevalent with AMD; even with SAM, which requires both an AMD CPU and GPU, I've had crashes. Nvidia is simply better in every way but raw power per dollar.
Nvidia's software for devs, ML, etc. is what's killing AMD. ML engineers are expensive; buying them hardware that doesn't make them waste their time (and your money) is a no-brainer. I run a 6800 XT because it made the most sense when I bought it, but for professional work, particularly in ML, there is unfortunately only one option.
If your workload demands advanced usage, Intel and Nvidia are the way to go. MKL, CUDA, cuDNN, oneAPI… are things AMD just can't compete with. ROCm barely supports Windows, and it is difficult to get working on Linux. On top of that, the ROCm build for Ubuntu 22.04 isn't the same as for 20.04, and the dependencies are hard to fix. Even getting it installed doesn't guarantee it runs.
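As a minimal sketch of what that fragility looks like from the user's side (assuming a PyTorch install; ROCm builds reuse the torch.cuda namespace via HIP, so the same code path either works or doesn't depending on your card/driver/distro combo):

```python
import torch

# ROCm builds of PyTorch expose AMD GPUs through the torch.cuda API (HIP),
# so torch.version is the way to tell which backend you actually got.
def describe_backend() -> str:
    if torch.version.cuda is not None:
        return f"CUDA build {torch.version.cuda}"
    if getattr(torch.version, "hip", None) is not None:
        return f"ROCm/HIP build {torch.version.hip}"
    return "CPU-only build"

print(describe_backend())
print("GPU usable:", torch.cuda.is_available())
if torch.cuda.is_available():
    # Same call on both stacks; whether it actually runs on a given
    # AMD card/driver/Ubuntu release is exactly the fragile part.
    x = torch.randn(1024, 1024, device="cuda")
    print("matmul OK:", (x @ x).shape)
```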
I've had mostly Nvidia GPUs in our household for many years now; they served and serve us well. But my AMD Radeon 7900 XT is the best GPU I have ever owned. I have had very few issues; it overclocks and undervolts very well, and does well in both games and benchmarks. Ray tracing is its only weakness, but it has greatly improved since launch, to the point where I have turned it back on in CP2077. We also have both Intel and AMD CPUs; no fanboys in our household, all welcome, consoles too.
I'm not expecting Nvidia to hold this position, because they make one thing, and one thing only. The moment ANYBODY makes something slightly better, it all crashes down. Meanwhile, if someone makes a better phone, Apple still has the Mac, iPad, and whatnot. They also have stronger brand loyalty than Nvidia.
Lmfao, what a load of nonsense. Like Linus, I also have an XTX, and ray tracing isn't even really a thing on their flagship. It's just marketing talk to say iGPUs can do it.
@@brandonhoover2120 Not true. It's that things are tailored to nVidia. In games that aren't written with RTX/nVidia in mind, AMD ray tracing is just fine. It runs great in the RE4 Remake.
@@WhiteG60 Lmfao, no man. Ray tracing isn't "written" for Nvidia. Nvidia just went heavier into the technology than AMD and does it better across the board. Go ask literally anyone with an ounce of knowledge what you should get if you want ray tracing: it's Nvidia. Drop the nonsense bias.
@@brandonhoover2120 Except go look at Hardware Unboxed's 4070/7800 XT comparison video. In the US, the 7800 XT is $200 cheaper, is faster in rasterization, and splits RT almost right down the middle. The one game that HEAVILY favors nVidia in RT is Cyberpunk, which gets used in every review, so everyone just assumes all games are that way. At 1440p, the Radeon is only 2% slower on average across 13 games in RT, and 7% faster in rasterization. So again, let's not look at the results of one or two games that get used all the time in reviews and assume that trend carries through every game out there.
@@Goldomnivore A life not lived is a life scorned, never to be known and never to be enjoyed. A fervor of untold pleasures and happiness, nought and never to be. Why? The gnashing teeth of the past encroach on your future.