@@churblefurbles It's more than that, Nvidia was always anti-consumer. Luckily AMD is more than enough for my tasks, and most GPU sales happen at entry level / mid range anyway. The 3070 was a meme, the GTX 1630 is weaker than a 1050 Ti and costs the same as an RX 6600, the RTX 3050 is the same price as the RX 7600 where I live, and so on. The RTX 4060 / 4060 Ti are another joke...
Then AMD comes out with a new Ryzen G next year. Who needs a graphics card when my CPU has built-in graphics as fast as a 3070 Ti? 🤣 I'm buying it next year.
AMD is jacking up prices too, and Intel never lowered theirs. I think there's a silent agreement between them: they just wait until one of them raises prices and the others follow. Just like the iPhone. We will never have good bang-for-buck PCs again. Time to welcome the ultra-enthusiast era and go console + cheap laptop if you want to game and work on the cheap.
I have been a corporate slave for over 30 years. What I have observed about management is that they are not guided by what is right, but rather by what they can get away with. I have also observed that the higher up a manager is on the corporate ladder, the more amoral that person is. In other words, doing the right thing is what peons do.
Yeah, basically if you want to climb the social status mountain, the easier way is to be pretty immoral even if you don't like it. The system is just built that way, and anyone who manages to climb it without being immoral is the exception, not the norm.
@@sebastianandres6802 In my opinion, it is amoral, not immoral. Amoral means people don't give a damn about what is right or wrong, whereas immoral means they are doing what is wrong. I think this is the reason why sociopaths do so well in upper management.
@@amarupsyn I am not sure if I agree. I think people have to be sociopaths to gravitate towards power. I suppose their sociopathy and power reinforce one another.
@@watcherworld5873 Sociopaths certainly gravitate towards power, but people end up with power motivated by other reasons as well. In the current corporate world it's not immoral to do these things, and it's not even amoral. It's MORAL according to the culture. The moral virtue is making as much money as you can, and failing to be honest, manipulating, etc. are all honorable and respectable things to do when trying to achieve this. When you're leading a public company you can justify everything because you have responsibility to make as much as possible. Most modern societies lack morality because we have convoluted hierarchies of justification. Morality varies between cultures. What is moral in Saudi Arabia is often not moral in Europe.
Constantly saying that you are right is not a nice trait; a lot of tech YouTubers have this problem. Same as MILD. Just let your actions show it, and don't keep reminding us how good you are at telling the truth.
I'm sorry, but I fail to see how this is even a realistic expectation. The audience is not all at the same level. YouTubers are forced into this messaging as far as I can see. The difference is, you have to be able to tell when they are exaggerating the claims. If the claims are reasonable, then I think we just have to accept that.
Not trying to jump on the bandwagon, so to speak, but dang, Coreteks really doesn't seem to like or respect the guys from Digital Foundry either. What's up with dat?
No need for that, the stock market is about to collapse because of all this fake fiat money. Nvidia is not a real raw material producer. They don't even own their factories. Once the dollar goes into hyperinflation, this tech bubble will be a long-forgotten memory.
@@churblefurbles Don't pretend that people's "votes" have any real weight when the truth is that the US, and pretty much the West, is three oligarchies in a democracy trench coat.
GPGPU can be used in so many ways that I don't see why they need to specifically claim a GPU is for one thing or another. The GPUs themselves are "gaming" GPUs but can be used for many things. My only issue is the way they were selling in bulk to fly-by-night operations that were siphoning stock to miners.
Way to throw shade, Coreteks lol. But I agree with you that the AI gold rush period is coming to an end, cuz I've been saying the same thing for the better part of 8 months. The true paradigm shift will be when AI and quantum computing merge. That's when we'll actually have true AI, and that's when everyone should be worried.
They're getting to the point where they could build their own fabs. If things keep going the way they are, there won't be enough fabs to keep up with demand anyway.
Look, it doesn't matter how much money a company has. If we the customers stop buying, the giant can be 'kneecapped' and brought back to reality. I can't possibly justify the cost of a GPU anymore... It's consoles all the way now.
I would expect that TSMC is going to raise wafer prices on NVidia soon, if not already. TSMC will demand more of the sundae, as Samsung probably cannot fab those giant pieces of 3nm silicon.
You didn't even mention the GeForce Partner Program. Another great benefit for its suppliers and customers! What a great company. I feel so lucky to live on the same planet.
Honestly, for years I've refused to buy Nvidia GPUs... they're gonna have to put a really badass deal on the shelves for me to be willing to buy from a company like this.
I won't buy their products; the only crappier thing I ever bought was a used GT 7600 for 20 quid over a decade ago. These price trends just mean I stay two years behind the curve... A used RX 6650 XT or 6700 10GB/XT will be looking good when everybody is broke mid-February. It will literally come down to midnight bidding and what price I can win at. ;#}
Nvidia may have committed fraud. They partnered with a hedge fund they have had shady dealings with in the past to sell AI accelerators to a company called CoreWeave, and that company failed. Nvidia got those GPUs back and resold them.
It will be interesting to see how well the new Microsoft Azure AI chip does. I also wonder whether AMD and Intel will soon deliver proper AI processors for data centers.
The MI300 is a monster. If AMD can leverage its design advantages (chiplets and 3D-stacked cache memory) successfully, it could even one-up Nvidia in the future. I said possibly! Not that I would bet on it!
How expensive would it be for a consumer to buy a 4090 in China? My buddy has been asking me to send him one, but we're afraid that customs there will confiscate it and it'll end up in some stupid server.
Am I the only one who noticed that the premise of this video, and the numbers he keeps referencing for Q3 2023, are actually the projected revenue for Q3 2024...?
No, the AI 'bubble' will not collapse in 2024. LLM development is accelerating. 2023's big revelation is that there is no practical upper limit to scaling for training. Many corporations have recently adjusted their plans for AI accordingly. They are ramping just as fast as they can. This trend has long legs.
Q3 was an Nvidia mobile H attach quarter, and so will be Q4. Laptops surpassed cards on the shelf at 58% laptop four weeks back. However, when you roll in Ada SI prebuilts, the worldwide channel-available laptop share is a traditional 33%. Cards vs. SI prebuilts is interesting: for example, the 4090 represents 5% of cards on the shelf across the full line, but when you add 4090s in PC prebuilts, the 4090 = 23% of all Ada available, so it might be back to getting a 4090 in a PC prebuilt. mb
If I saw blow-through (passively cooled) 4090s, I'd be even more worried, as this is the standard design for servers and datacenters. Oh wait, there are a bunch of them now. Damn.
@@Kurukx Normally, I'd say you're right. However, the new H100 has so few ROPs that a 1650 would probably beat it at rendering. Yes, it has the FP32 (CUDA) horsepower to stomp modern desktop GPUs, but it can't turn those CUDA calculations into rendered pixels without ROPs.
@@Kurukx I should also note that, unlike Ampere and Ada, the A100 and H100 do not have datapaths that pull double duty between FP32 and INT32. FP32 work only runs on the FP32 units; it can't also run on the INT path, so the effective 'CUDA core' count is going to be lower. On Ampere and Ada this dual-issue can give up to 2x performance, but as little as a 0x boost if the shared path is busy with INT work. Generally, though, INT isn't used that much in most games.
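To put rough numbers on that dual-issue point, here's a minimal back-of-the-envelope sketch in Python. It assumes one FP32-only datapath plus one shared FP32/INT32 datapath per SM partition (the Ampere/Ada arrangement); the function name, the pipe counts normalized to 1+1, and the 30% INT figure are illustrative assumptions, not vendor numbers:

# Back-of-envelope model of Ampere/Ada dual-issue vs dedicated pipes.
# Assumption: one FP32-only pipe plus one pipe that issues FP32 OR INT32.
def effective_fp32(dedicated_pipes, shared_pipes, int_fraction):
    # int_fraction: share of issue slots the workload spends on INT32 ops,
    # which the shared pipe must service instead of FP32.
    return dedicated_pipes + shared_pipes * (1.0 - int_fraction)

print(effective_fp32(1, 1, 0.0))  # 2.0 -> pure FP32: the 'up to 2x' case
print(effective_fp32(1, 1, 1.0))  # 1.0 -> INT-saturated: the '0x boost' case
print(effective_fp32(1, 1, 0.3))  # 1.7 -> a plausible game instruction mix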
Blame AMD for lacking competitiveness, not just in games but across the entire spectrum of uses a dGPU has. AMD might not even want to out-compete Nvidia, as both CEOs are members of the same family. Bad optics either way. Intel may be the only one able to throw a wrench in it, especially since they have their own fabs and much deeper pockets to fund R&D with, not that they are necessarily using that for GPUs yet.
It's the result of GPGPU design anyway. The complaint about how they classified their sales is goofy, kind of like Intel only considering Xeon chips to be "business use" or w/e when most of their CPUs have the instructions and ability to do the same work anyway.
@@sinephase Aye, but I suppose it would be really hard to get an accurate estimate unless Nvidia starts estimating gaming sales using the Steam hardware survey or something like that.
@@genstian My issue is with how he framed the issue as if they're doing something nefarious or "against gamers". Their whole GPGPU strategy started with PhysX IIRC and all of them have been capable of non-gaming workloads ever since. Now I guess they call it CUDA but it's a continuation of the same thing. This is all a matter of supply and demand, and I don't fault them for selling to those willing to pay more or for guaranteed large orders of their expensive chips.
@@sinephase The 4090 is even just a cut-down version of AD102, the same chip used in server GPUs like the L40, RTX 5000, and RTX 6000. But yes, Nvidia has known ever since PhysX and CUDA v1 that compute is the future. Nvidia thinks frame-generation tech will replace rasterization entirely, and they might be right.
I am disappointed to see you feeding into this wealth-ranking nonsense. There is no way Jensen is the 27th richest person, or even close. He's #27 out of the subset of people whose wealth is made up almost entirely of equities in public companies, shares with a known market value that they hold personally in their own name. Those shares might be more liquid than some other assets, but they are not cash. There are no doubt more than 27 members of various royal families whose wealth is far greater, to give one of many possible examples.
10:35 Nvidia tech that "no-one wants". You had me until there. I have a 120Hz LG OLED that is of course 2160p. I want to play all the latest games with all the latest graphical technologies like ray tracing, at 4K, at 120 frames per second. Right now, an RTX 4090 is the only GPU on the planet that can do that. It needs to use every trick up its sleeve to pull it off, from DLSS upscaling to frame generation. With Alan Wake 2, I can max out all the settings and all the ray tracing, set DLSS to performance mode, enable frame generation, and I'm constantly locked above 120 frames per second. And the overall visual experience is unbelievable: it looks identical to native 4K to my eye, and the responsiveness is outstanding. Even Radeon's top-of-the-line GPU can't come close to doing that. With a 7900 XTX, the best you could do is set the rendering resolution to 720p and use FSR2, and the resulting image would be a blurry, flickering, garbage-pile mess. Nvidia has the only technologies that make what I'm trying to do even possible.
You use a $1600 GPU ($2k now) to play at upscaled 1080p and compare it to a $999 GPU ($930 now)? Imagine having top-of-the-line specs and still being forced to use upscaling. Not saying the 4090 is a bad card, of course, but 4080 vs 7900 XTX would be a better comparison on price/perf (and the 7900 XTX is actually beating the 4080 there).
People keep buying Apple no matter how much it gouges its customers, and today it is the company it is because of that. (I was surprised to find that in phones it's 80%-plus market share in America, and more than 90% among the younger generation below 20, so blame the Americans for Apple, and probably for Nvidia too!)
@@jahramika Yes, I forgot to mention that the market share I'm talking about is for phones. They are far behind in computers and laptops, but those are overpriced machines too, no doubt.
10:27 "But hey, it's not like the media is constantly promoting NVidia tech" MY SIDES. Resubbed...as long as ad videos like that UE 5.3 abomination don't make a return
Nvidia treats the data center market totally opposite from how it treats the gaming market. It gives data center customers a range of configurations and price points, as well as immediate supply. With the gaming market, it practices scarcity and large price gaps between tiers.
Hardware is about to plateau, so prices for PC parts will crash hard and Nvidia can't stop it. Moore's law is dead, so they can't really go smaller; they have to use improved transistors and do transistor stacking. ARM said they plan to use technologies like GAAFET and transistor stacking (CFET). Going below 1nm is an unachievable pipe dream, and ARM knows it.
If people were strong enough to not buy their stuff, even for just one generation, it would be good for everyone, starting right from the next generation of products. But yeah...
I am not sure the AI bubble is going to burst so much as shift, from GPUs to more dedicated neural, analog, and organic processors, areas where other companies are surpassing Nvidia, including the likes of Intel and IBM. If a big shift happens, Nvidia will implode in value, unable to beat last year's profit, quarterly or annual, and a big enough decline would trigger a sell-off.
If you'd seen how coders do their work, you wouldn't be surprised that they reuse others' code. They get paid for the code they "produce", so they're incentivized to do it as lazily as possible.
They are pushed to deliver whatever works best at a given time, and rarely or never refactor previous code until the tech debt grows so big that anything they touch is going to break or underperform something else. The era of shipping stuff when it's good ended decades ago; now you just need to ship faster.
Lol, that projector is trash. The Hisense U8K 100-inch TV is $3k on sale at Best Buy. I love projectors, but you're gonna need to dig deep to explain why that one's better. The Sony A95L at 83 inches is also $5000, and I'd much rather have that, with its far superior processing. There are just better products on the market right now than that projector; if it were that good, other people would be talking about it in the home theater arena, and they aren't. Next year, the Chinese have 110-inch panels with low input lag and a half-resolution double-refresh-rate feature. I can go on... Projectors don't really do anything until you hit insane price levels now. None of this will carry any weight with your viewers: if you're preaching about products being too expensive, then that projector is way too expensive.
There's no real value point for that projector, dude... don't mess up your channel just for one uber-expensive toy... You've built up a lot of kudos by calling out overpriced tech; it just feels disingenuous... it's like watching one of Linus's "look how good my house is" videos.
When AMD has something as good as CUDA and all the software around it, it will be considered a serious competitor. When you buy a graphics card, it's often not just for gaming; it's a more general device used for different tasks. So why not say it: AMD and its weak relationship with software may be the thing that made Nvidia this big.
@@wakazimaru And which GPUs even had access to ROCm? How can it take off if it's locked away from the vast majority of consumers? From the reports so far, despite sounding nice, it has yet to prove it's an acceptable alternative. You don't buy GPUs for what you think they may be able to do in some imaginary future, but rather for what they can do now.
A while ago I decided I'd never buy an nV$d$a GPU again. They are at the center of so-called AI, which is just advanced automation to replace human labor, cherished and funded by traitors and fools. A cornerstone of the global agenda.
Your logic could be applied to the creation of automobiles as well, which replaced the horse and buggy. That resulted in a ton of lost labor, but I bet you still happily use automobiles. The simple fact is, new technology, regardless of whether it's abused or not, is inherently disruptive. It's always been that way, and it always will be.
@@deuswulf6193 Respectfully disagree. What's been going on is a textbook example of 'this time it's different'. I mean, just look at how many jobs/sectors they are targeting.
Gamers helped build Nvidia, and how did Nvidia reward them? Oh, they didn't. Buy Nvidia if it's what you need, but don't fanboy/simp for them; they have no loyalty to the ones who made them what they are.
If only AMD had offered something that was good value while they had the chance, like they did with Ryzen against Intel... sigh. But now they are too far behind in features and even plain old raster performance.
@@NatrajChaturvedi they've been behind for the last 20 years, ever since they had to file in court against nvidia and intel for fixing the markets. you think they've made enough money to compete against intel and nvidia? get real, everyone bitches that amd isn't doing enough, but 80% of gamers onmly want amd to do good enough to drive down nvidia prices so they can go buy nvidia. trust me when i say that i've watched as amd fights a two front war, then get the blame (sometimes deserved) for almost everything. nvidia has had the best people for the better part of a decade. they can afford to pay them to keep them from going to work for amd. you've never seen how bad it was looking for an amd laptop and not finding them anywhere in north america . just because intel and nvidia could pay companies to keep them out of the market in most countries. they did the same thing with oem desktops for years and now they don't have the money to throw at a problem right now. they have to keep their eye to the future and hope that they can plan well enough to keep their doors open.
@@craiglortie8483 My last two laptops over the past 15 years had AMD components (CPU and/or GPU). I picked them up in the US at a brick-and-mortar store. It is not at all, even remotely, hard to find AMD-powered PCs anywhere in the US, which is most of North America already.
Crypto is dead. Mining with gaming cards is a waste of time. Bypassing the SEC ban by sending 4090s off and repackaging them will result in the biggest fine in their history.
They were selling off expensive partner inventory just like Nvidia. They'd rather make margins by falling in line with Nvidia than dump cards and still lose when Nvidia fought back.
Well, when you realize AMD's CEO is a literal family member of Nvidia's CEO, it certainly gives them an incentive to "not want" to compete in a way that significantly hurts Nvidia. AMD seems to be fine with taking the back seat.
So is Nvidia undermining US national security, and if so, what are the consequences? Do they have obligations not to undermine national security, or does US law effectively tell companies, "go for it, just make coin for shareholders!"?
Ehhhhh, love the video and agreed with so much of it. But I don't like the small jab at DF at the very end. They cover the tech for what it is, more so than Nvidia talking points. I do find that DF can be a little too easy on Nvidia, though. Rich makes good reviews, but he seems to play it safe in his critiques rather than take a more aggressive stance. It's my only problem with the channel. I'm certainly much more of a cynic than he is, perhaps, but I think he's just more conflict-averse than other tech channels. It could be a problem with access media, however, rather than that.
The way they absolutely have to spit out the term "RT" or "RTX" or "path tracing" at least once on even the most unrelated of things has been pissing me off too (and I own Ampere/Turing myself). They are so very clearly Nvidia shills. When so many examples of great-looking, great-playing games exist from the likes of Nintendo and studios owned by Sony, DF and others really should stop hyping "RT" as the be-all and end-all of computer game graphics.
@@NatrajChaturvedi Honestly, when I think of "great-looking games", Nintendo is the last to come to mind lol 😅. Nintendo does what they can with the hardware, but there's really only so much you can do when the hardware is terrible. (The Switch was using outdated hardware when it released, and the problem is made worse now by it being very old at this point too.) I own Ampere as well, and currently I'd rather turn ray tracing off (I'd rather target 4K with high FPS instead), but even though I usually turn RT off, I understand why DF goes on about it so much. Ray tracing and path tracing are pretty much the future of graphics. You can get pretty good results even without them, but all of the tricks used instead of RT or path tracing have some flaw or multiple flaws (e.g. screen-space reflections only showing reflections of objects on screen).
Yeah, it's almost like Coreteks is a little butthurt about something. I still watch DF videos often. They are more like an introduction to tech for people who are not experts to begin with; they explain things in a way common people can understand: people who are into gaming but are not experts in rendering, raster, RT, and path tracing, and people playing on consoles thinking about getting into PC gaming. They have their place on YT, and I think they've earned it. And if there were better competition in the GPU market, I don't think they would be as one-sided. AMD clearly has no interest in trying to compete at the high end of the GPU market; they are content to always be second best, or the cheaper option. And I am a big fan of AMD. I used AMD GPUs for many years; in fact, my first Nvidia GPU was a 2070.
How to cover tech and not invest in the easiest layup ever... Man... Every gpu release Nvidia stock go up... At a certain point you'd think you put two and two together. Nvidia gpu expensive. Nvidia stock give me profit from Nvidia gpu. Then me able afford everything.... Nvidia always win... So me win too... lots of you have been gaming for 20 plus years, at a certain point you have to give up fighting Nvidia... Like bro... You don't have to fight these companies... They can work for you...
10:30 What the hell, man. You make a great video and then do cringe crap like this at the very end for no reason. As if DF isn't always critical of Nvidia's products and pricing.
8:30 Investors don't care... We can bring our own lawsuit. If the SEC wants to go out of its way over statements MOST investors don't care about... well... no one will care. Most people invested in Nvidia do not care about the 4090, A100, etc. All they care about is that Jensen sells chips... You don't even have the reasoning right: it's about being obscure because of sanctions, not about trying to mislead investors who DON'T CARE.
I blame AMD for Nvidia's monopoly. They aren't trying to compete in a meaningful way. For example, the 7900 XTX should be an $800 card, and that should set the tone for the rest of their GPUs. Pricing slightly competitive cards at a slight discount isn't disrupting anything. It won't do any good to cry foul at Nvidia's profits either. It doesn't matter why or to whom they sell their cards; it just matters that their competitors can't drum up the same demand. AMD has been declining in revenue quarter after quarter, year over year. They're barely trying.
Lol, AMD isn't a charity, and GPUs aren't driving their sales; CPUs are. The 7900 XTX is priced appropriately; the 7900 XT, on the other hand, should be $800.
Well, you have some sort of point, but it ends up being invalid in a couple of ways. AMD's GPU division needs profit margins to fund further research, hence the aim for RDNA5 and the changes to it. Investment is needed to drive this forward, and the lack of it has clearly affected AMD graphics overall; FSR3 is a good example. People think a quick execution and checkmate move is on the table, but it's not. AMD's Ryzen had a lazy and badly managed Intel to compete with.
If AMD lowers prices, then Nvidia responds by lowering too, and neither of them wants a race to the bottom, especially AMD, which doesn't seem to have a good enough revenue stream to support it. And if Nvidia raises prices, AMD follows, because investors get more money. So they walk hand in hand.
@@samuelec AMD's stream of money is largely due to their enterprise market, but they will be facing stiff competition going forward, so who knows how long that revenue stream will keep them going. It's no secret that AMD's leadership does not put its full financial weight behind the desktop GPU branch of the company.
You are right that it is AMD's fault for the most part, but not due to pricing, though they do take their cues from Nvidia. It's largely due to the fact that AMD just doesn't compete as strongly in the areas that matter. For example, a 3060 is still faster than a 7900 XTX in Blender renders. Why? Because Nvidia has done a much better job on the software development side and has invested far more into offering better software solutions. AMD, on the other hand, constantly looks for corners to cut, or outright ignores some segments in which dGPUs are utilized. There were a bunch of Unreal Engine developers who bought the 7900 XTX thinking it would be a great option for their line of work, only to encounter issues, graphical errors, crashes, and features just outright not working. When they started sending emails to AMD to help identify and patch the problems, AMD just ignored them. You don't do that to developers, especially those who will actually work with you to fix the problem. So they got rid of the AMD cards instead and went right back to Nvidia. Huge missed opportunity there, and AMD just does not care. That said, it is worth noting that AMD's CEO and Nvidia's CEO are actually family members. Not good optics, as it leaves open the question of price and product fixing.
The fact that nVidia is a terrible company is nothing new. I first encountered just how bad they actually are when I worked for Tiger Direct back in 2007. I had a Palit GeForce 8500 GT 1GB at the time and was looking at upgrading to a GTX 260 by BFG or Sparkle. When I saw what nVidia (and Intel, for that matter) had been doing, I immediately switched my new build's configuration from an Intel Core 2 Quad Q9400 with a GeForce GTX 260 to an AMD Phenom II X4 940 with an XFX Radeon HD 4870. I have never looked back and never will. This is what I've owned since then:
2 × XFX HD 4870
1 × XFX HD 5450 (gift for my father's HTPC)
1 × XFX HD 6450 (gift for my mother's HTPC)
2 × Gigabyte HD 7970
2 × Sapphire R9 Fury
1 × XFX RX 5700 XT
1 × Powercolor RX 6500 XT (upgrade for my mother's HTPC)
1 × ASRock RX 6600 (upgrade for my father's HTPC)
1 × ATi RX 6800 XT
1 × ASRock RX 7900 XTX
These aren't all in perfect chronological order; the cards I bought for my parents the second time around were purchased after my RX 6800 XT. I've been happily gaming away this whole time with 100% Radeon cards, wondering where people get the idea that Radeons suck. I'm not proud that I use Radeons, but I AM proud that I DON'T use GeForce cards, because it means I'm not supporting that Green Goblin named Jensen.