@@pf100andahalf So what if it went down $50? It will add an extra $50 to your power bill lmao, and that's every month. You kids don't use your brains properly.
@@astreakaito5625 What I want first is price reductions. I would much rather have a used $400-450 3080 than a new $600 4070. Better performance for $150 less, and about to be $200 less. So far I haven't needed frame generation since I play at 1440p, and the 3080 is a beast of a card.
Guys, remember that Nvidia initially launched the 4070 Ti as 4080 12GB? So they were originally planning to release this 4070 as 4070 Ti and charge us even more
The 4070 is really a 4060 ti. The 4070 ti is really a 4070. They just wanted to move their product stack up a belt loop and hope no one noticed. Didnt work for the 4080 "12 gb" and they had to pivot, but they still got what they wanted with the remainder of the stack.
AMD will set things straight and release a 7700 XT with the performance of a 69xx; we already have the $800 7900 XT that's a whopping 10% faster than the 6950 XT.
When running everything in RDR2 on max settings it looks amazing, and it doesn't go over 7GB of VRAM ^^ And some games that look inferior in every way are hogging VRAM like crazy. Lazy and incompetent developers when it comes to optimization.
@@andersbjorkman8666 You don't even have to compare trash ports to RDR2. Just try comparing them to "The Vanishing of Ethan Carter"; it'll probably look better than the latest Resident Evil but will run on a GTX 650 just fine. Modern PC ports are a joke, and the guys who port them are clowns.
@@HeLithium Cards around 160W and lower tend to have single-fan models: 1060, 1070, 2060, 3060. As the shrink will be another power-efficiency jump from current Lovelace, the 5070 will almost certainly have tiny single-fan cards for ITX builds.
@@Some-person-dot-dot-dot By "deprive" I meant that we won't get GPU that was supposed to be 4070 Ti originally. I agree that Nv messed up. My point was that the 4070 isn't the original 4070 Ti.
I am glad I got my hands on a brand new PNY 3080 12GB for 7490 SEK at black friday last year, it's about equivalent to $540 US. Was a big upgrade from 1070 and seeing the price/performance from the 40-series I do not regret it one bit. EDIT: Since a lot of people comment on it: The price comparison is with VAT, import fees and other addons for the swedish price. Generally you can take the USD price*14 today to get a rough estimate in the swedish retail prices.
Having owned a 40 series for a few months, I'd take a 4070 over a 3080 12GB every single time. Very similar base performance, to the point where you wouldn't notice in reality, but the option of FG plus using way less power makes it a no-brainer if they are both available at the same time.
I miss the days when a 70 range card delivered top-tier 80Ti or Titan level performance, for a mid-range price. Now, we get barely 80 range performance, for a top-shelf price.
@@poison7512 Yes, and it's glossed over. The 8800 GTX and that architecture were Nvidia's big break. I remember people running SLI 8800 GTXs to run Crysis (1). But it wasn't until after Fermi that Nvidia really started making big leaps. Maxwell and Pascal were the culmination of many years of R&D. From 2012-2015 Nvidia didn't really have, in hindsight, good product lineups in terms of price to performance.

The issue is that Nvidia has gone for margin. It's likely Nvidia is preparing their business for integration into things like national security. SpaceX is another tech company doing the same. Microsoft too. AMD seems to be more business as usual in the graphics segment, with their long-term contracts with customers like Sony and Microsoft, and all the APU devices that have AMD inside. While their footprint is smaller than Nvidia's, it just seems like Nvidia doesn't really need consumer confidence in their historical revenue driver. They're acting that way and investors are aware. But Nvidia will flip right back if mining takes off again.

Long term, they don't want to make a 1070 or 1080 Ti type of product available anymore. They're crunching their product mix in a way that resembles smartphones: the graphics card as a 2-year "can you afford it" check. Not so nice, but Nvidia has business opportunities that make them not need us as much. Perhaps there is some technology coming (chiplets? SoCs? The client device becoming a streaming platform for gaming?). I don't know what they're working on, but I don't think Intel would've done Arc if Nvidia had continued to make everything a Pascal and spaced out the time between generations. That would be friendly to consumers but bad for Nvidia. It's just cold business, man.
@@Noob._gamer We all know that consoles are generally cheaper, but they're not so great for some games, like RTS or competitive FPS... Also, you can't really set up a high-res music player on one to your liking, and lots more...
Try undervolting; Ampere undervolts so well tbh. My 3090 performs the same as a 3090 Ti in games while consuming 320W at most with RT and everything, usually around 300W or a little less without (stock it was 390W with lower clocks due to power limits). Doing an OC and undervolt at the same time is really nice; my performance seems way more stable as well. I could probably get the same performance as stock while consuming 280W or a little under, but I want that extra 5-8% increase in perf.
It's a typical Nvidia grift. They specced the bus for the 3080 way higher than the memory could actually utilize, for better paper numbers that only give you a higher electric bill in practice. The next gen, they cut down the bus and drop the tier to show a paper efficiency improvement on what is effectively the same product. I've lost count of how many times I've seen Nvidia pull that play.
@@onik7000 At the average electric rate in the US of about $0.20/kWh, the 140-watt difference is costing $0.03/h. 8 hours of gaming a day puts the additional cost at $7.20 a month. Let's just say you were running at full tilt round the clock: that additional 140W would cost you an extra $260 a year. Now take that $260, divide by 24, then multiply by however many hours of gaming you do a day; that will give you your additional cost.

Mind you, that's just the difference between the two. It's not including the $0.04/h you're spending on the first 200W, nor the extra $0.015/h you're paying because both of these GPUs actually operate 50W over TGP. In total, that would leave a 3080 gaming around the clock costing about $740 a year to operate.

Now, the actual point of my comment was to say that this is how Nvidia operates. They over-spec one generation beyond what the hardware can handle, then when they down-spec the next gen, they use the "efficiency increase" as an excuse to keep the price high, knowing full well that the energy saving won't even be noticeable to the average consumer. That's what makes it a grift.
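For what it's worth, the arithmetic in that comment roughly checks out. A quick sketch (the $0.20/kWh rate and 140 W delta are the thread's assumed figures, not measured values):

```python
# Quick sanity check of the electricity math above.
def cost_per_hour(extra_watts, usd_per_kwh=0.20):
    """Dollars per hour for an extra load of `extra_watts`."""
    return extra_watts / 1000 * usd_per_kwh

print(round(cost_per_hour(140), 3))            # 0.028 -> the ~$0.03/h cited
print(round(cost_per_hour(140) * 8 * 30, 2))   # ~$6.72/month at 8 h/day
print(round(cost_per_hour(140) * 24 * 365))    # ~$245/year running 24/7
```

The comment rounds $0.028/h up to $0.03/h, which is where its slightly higher $7.20/month and $260/year figures come from.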
Because the new-gen G6X at 256-bit offers comparable bandwidth to 1st-gen G6X at 320-bit, using a third of the power, plus 10 times more on-die cache? Nephews are hilarious.
This is more confirmation that the RTX 4070 should have been a 4060 Ti at most, delivering 10% worse performance than an RTX 3080 12GB in most scenarios.
that was NEVER going to happen....think about it....we had the 4080 12GB before it became the 4070ti so this card was ALWAYS slotted for the 4070 position no matter whether it was spec'd more like a 4060ti or not.
It was a 4060 Ti, and the 4070 Ti is the real 4070, but they couldn't justify the price rise! Imagine buying the 4070 for $800 😂 That's what's actually happening, but everybody thinks they are buying the Ti version.
@@andreabriganti5113 Oh, I agree it should be $499 or below, but like it or not, this is definitely what we got as a 4070 this time around. Nvidia got WAYYYY greedy!
It's actually kind of crazy how many anti-consumer moves Nvidia has pulled with the 40 series cards. I'm gonna hold off until the RTX 50 series and hope things get better especially with VRAM. also recently learned they're going to release a 4060 Ti 16GB, What a mess lol.
Same. I was using a 1660 Ti and traded it plus some money for a 3060 Ti, and I'll use it until, idk, the 5070 Ti comes out, because this generation was dogshit. Without the new Nvidia "frame generation" tech, literally nothing changed... but they increased the price, said Moore's Law is dead, and also, as you said, VRAM? Bus width? Hello Nvidia, are we stuck in 2018?
The most noticeable difference between the 30 and 40 series is the power draw: you can get the same fps performance with less power usage, which is pretty good to consider for your monthly electric bill.
Thats the main reason I went for a 4070ti. It will pull a little over half the power of the 3080 and temps are a lot better. That was the main selling point overall for me
The performance difference is huge compared to the 30 series, but you are comparing a 70-class card to last generation's 80-class. When you compare the 4080 to the 3080, it's a big jump in performance, and the 4070 Ti equals last gen's most powerful GPU, the 3090. If you compare the 4090 to the 3090, it is also a big jump in performance. AMD's 7900 XTX also performs similar to the 4080, but the prices are still higher for that performance. If these came in at the same prices as the 30 series, they would sell like cupcakes.
I think this looks great. I paid $900 for my ROG Strix 3080 10GB, and that was $100 less than MSRP. Yes, that's the most expensive model, but wasn't the cheapest 3080 $700? I really don't get people's issue with the 4070. I'm getting great performance with it at 1440p, and it sits at around 180 watts, is cool as a cucumber, and is extremely quiet.
Saving now for Blackwell next year. Getting a card with at least 16GB of VRAM on it; not making the same mistake I did with my 3070. Hopefully by then, competition from AMD and Intel brings Nvidia back to reality on pricing.
The 3070 was always suspect right from the start. Current consoles can spare up to 10GB (out of 16GB) of shared memory for VRAM after the game data and OS. Around 10GB is basically the upper target for game developers for 4K and below; excluding the odd fringe case, 10 and 12GB will be fine for the next 3-5 years. It's not going to be the same situation as 8GB.
@@Battleneter You're confusing things by counting console VRAM and PC VRAM the same. Remember, consoles have only one purpose, to play games, and developers will definitely optimise their games to run perfectly on console. A PC, on the other hand, does more than just gaming, so it works differently: 10GB of VRAM on console can't be considered the same as 10GB of VRAM on PC. A PC might need more VRAM to perform the same as a console. That is also why game developers aren't bothered by VRAM consumption on PCs, and PC ports are launching without proper optimisation.

And judging by the current trend of AAA releases, we can get games that eat more than 12-16GB of VRAM on PC in THIS generation alone, and that consumption will only increase with newer generations of consoles. So yeah, if you have the budget, go for higher VRAM and native performance rather than software like DLSS and FSR. That choice will last you longer. So 10GB of console VRAM doesn't equal 10GB of PC VRAM.
@@YashPatel97335 The situation's so bad that Jedi: Survivor has a native sub-720p resolution on PS5. I wouldn't be surprised if current-gen consoles (PS5 and XSX) end up using the equivalent of low settings before the generation ends.
@@demetter7936 You can argue old consoles also have value, but again consoles are just toys. It's like comparing a screw driver (consoles) to the price of a Swiss army knife (PC), sure both will play games but the PC does a crap ton more.
Now we know for sure that frame gen will never come to the 30 series. It's literally the only thing selling the 4070 now. I imagine it will be the same for every lower-tier card still to come.
I recently picked up a used RX 6800 non-xt, which after some mild overclocking is performing in 3080 territory (ray tracing off, of course). It only cost me around 300 USD.
Tbh the only thing I see as a big difference is the power draw. DLSS 3 isn't a big thing for me as I never turn it on; it just doesn't feel good, personally it feels janky as hell. So I only ever use DLSS for its image reconstruction, not the frame gen.
Wow. You're the only person I've seen remark on the jankiness of DLSS 3. A lot of people think it's going to save them from 40fps stutterfest experiences in the AAA slop released now, when it actually exacerbates the input delay compared to the smoothed frame rate they see.
@@Greez1337 Most of the people using it are 4090 owners playing at 4K, or 4070 to 4080 owners playing mostly 1080p and some 1440p, with the base fps being at least around 90fps. It's not surprising they don't notice the jank from it; I bet they would if the base fps was 60 or lower tho. Honestly it just reminds me of how motion blur makes it look like more frames xD
It's interesting to see NVidia do this. But going from 1080 -> 2070 things were on par, but maybe a 3%-5% edge in favor of the 2070, then a 3060 was on par w/ the 2070 with about a 3%-5% boost in favor of the 3060. And now, you would expect a 3080->4070 to be on par with a slight advantage of 3%-5%, but it's actually at a disadvantage of 3%-5%. Very sad.
My boy just got a 4070 build from a sponsor and was like "I have the best PC in the group now." I have a 3080 12GB and said, nah, you're just up here with the big dogs.
To answer your request about wanted content: I would appreciate more educational testing setups that investigate behaviour like in this video with the bus width/bandwidth, and awareness of how easily you can leave performance on the table or even screw up your experience in the Nvidia (AMD) control panel... basically an evergreen for a changing audience. E.g. I would appreciate videos explaining single settings in depth and testing their performance/experience implications in several different game engines, alongside the usual comparative testing. That could add to your benchmarking imo. THX💙
Because the 70 series is really the 60 series, and so on. If you look at it like that, you understand the performance. They pushed the slider over by changing the names on everything: the 90 is the 80, the 80 is the 70, the 70 is the 60, and the 60 is the 50. Prices went up for what would have been the real product stack, and then they slid the names over, making profits go up many, many times. Selling a 60-series card that should have been 250 dollars as a 70 series for 600 is an INSANE markup in price to performance.
It's the nerfed CUDA cores on the 4070 that cause non-RT rasterized performance to drop significantly, while they peddle frame generation that uses the SAME tensor cores as previous generations but softlock it to the 4000 series in drivers. While also giving you a bit more RT cores to polish the turd of the rest of the GPU.
nah, it's just the bus width. 4070 RT cores are better than the 3080's, so even with LESS RT cores, it performs better, they don't matter in raster though.
@@MaxIronsThird Sorry for the misunderstanding. When I mentioned they "nerfed" the CUDA cores, I meant they reduced the CUDA core count: the RTX 4070 only has 5888 CUDA cores, while the 3080 12GB has 8960. Yes, I know the efficiency of the 4070 is MUCH better, and even though the difference in CUDA cores is around 35%, it manages to perform only 8-18% worse in raster graphics while also consuming much less wattage. However, this drop in raster performance, especially at the price of the card, and given that Nvidia has historically released new 70-class cards that match an 80-class card of the last gen, makes this newer card a much worse deal when raw performance is considered.

However, if you need lower power consumption and energy in your area is rather costly, then the 4070 would make more sense. And frame generation is still a pretty decent technology, even though frame interpolation has been out for years; this tech is basically just frame interpolation with some AI smoothing to prevent the image from looking unnatural. Which, again, uses the SAME EXACT tensor cores that every Nvidia card has been using for DLSS since the 2000 series.
Grabbed a barely used 3080 12GB last year and have not regretted it at all. The recent drama around cards being limited to 8GB is of no concern, and the 40 series has nothing compelling by comparison. Not that Nvidia couldn't produce something compelling, but they chose not to and here we are.
Imagine 4 years later the RTX 4070 Ti and 3080 Ti and below can't run AAA games because of low VRAM. I would pick an RTX 3090 any day over the 4070 Ti, or even an RX 6950 XT.
@@WASD-MVME Yeah, I think the perf difference at, let's say, 4K between the 3080 and 3080 Ti is at least 15%, while each step up after that is 5%, depending on the model.
@@InnerFury666 The 3090 and 3090 Ti are within a few percent of each other depending on the 3090 model; mine runs about on par with one, for example. Here is a video: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-NZACJFQlJK8.html. You can achieve the same result with an undervolt (depending on the lottery), as out of all of Ampere the 3090 is the most power limited. If the clocks are on par, the 3090 and 3090 Ti are within 2-3% of each other, but FE vs FE, due to the higher power limit on the 3090 Ti, it's about 10%, yeah.
Somehow my 4070 uses even fewer watts than my 3060 Ti, and when you undervolt it a tiny bit, the card just becomes the best choice for older power supplies or power bills.
@@jesusbarrera6916 I mean, ofc you can, but say 100 bucks for a PSU plus what a 3080 will cost you in power over the years, and it stacks up. It only makes sense at a very big discount in my eyes.
You should test the undervolting capability of your 3080 12GB. Usually it runs fine at 800-850mV with a 1750-1920MHz GPU clock. At 850mV, 1920MHz, it is pretty much at stock frequency with a lot lower power consumption, at 240-280 watts. But then, the 4070 undervolts really well too; I've seen it undervolted to 140-150 watts while maintaining pretty much the same performance. That's GTX 1070 level!
Indeed, I have a ROG Strix RTX 3080 12GB (OC version) and run it at 1900MHz with 900mV; it runs cooler and stays between 250-280 watts! That's almost 100 watts less power consumption without losing much performance. The default clock of this card is 1860MHz. Also, this card has the same chip as the 3080 Ti/3090, and the same cooler block as the 3090 (for the ASUS ROG Strix version). They had to do this of course :-D
I undervolted my 4090 to 825mV @ 2400Mhz. Power consumption dropped to 220-250W for all my games. That's a pretty big drop from 400W stock, so I'm very happy. Temps-wise, GPU core stays around 40 degrees and vram / hotspot gets to 50 degrees. Never even touches 60. The best part is performance loss was very small, less than 10% which is not even noticeable for me. :)
Love my 3080 12GB, Suprim X model. Very good silicon in mine. Been running it since September 2022 at 1980MHz @ 0.875v, +1200 on memory. Pushed to its limits at 430W, it can hold 2295MHz in Unigine Heaven at 4K.
Nice undervolt for 1980MHz. I stuck to stock voltage, capped it at stock, and was able to overclock to a stable 2115MHz on my RTX 3070 MSI Gaming X Trio. Sadly Nvidia capped the wattage at 255W; I can't pass 260W whatever I do. Unigine Heaven is pretty bad nowadays; I'd recommend Fire Strike at least, or even Time Spy. I can do a stable 2200MHz or so on Unigine Heaven too, but it doesn't stress the GPU. Lucky you that your GPU can pull 430W; mine is stuck at 255W and below.
I have a 3080 12GB that takes an undervolt at 0.835V, 1875MHz, plus 500 on the memory... it uses about 280W and is really stable... it's a Gigabyte OC card.
@xXOverEyeXx I have 4 profiles in MSI Afterburner. If I run the efficient mode (for my kids), 0.825v @ 1860MHz with stock memory, I can keep it around 200W in most games. Not far off the efficiency of a 4070, with about the same performance. Mind you, you can also undervolt a 4070.
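The undervolt figures in this thread line up with the usual rule of thumb that dynamic power scales roughly with frequency × voltage². A rough sketch of that sanity check (the stock operating point, board power, and the 75% core-power split are all assumptions, not measurements):

```python
# Ballpark estimate of undervolt savings via P_dynamic ~ f * V^2.
# Assumed stock point for a 3080 12GB: ~1.08 V @ ~1905 MHz, ~350 W board power.
def scaled_power(p_stock_w, v_stock, f_stock, v_uv, f_uv, core_fraction=0.75):
    core = p_stock_w * core_fraction   # portion assumed to scale with f*V^2
    rest = p_stock_w - core            # memory, fans, VRM: assume unchanged
    return rest + core * (f_uv / f_stock) * (v_uv / v_stock) ** 2

# The 0.875 V @ 1980 MHz undervolt mentioned above:
print(round(scaled_power(350, 1.08, 1905, 0.875, 1980)))  # ~267 W
```

That lands right in the 250-280 W range these comments report, which is why the f·V² rule is handy for sanity-checking undervolt claims.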
So now we're just paying for software upgrades and reduced power draw, which is something you EXPECT from a new gen of GPUs (it's like saying the new gen will be faster). Brilliant. Keep defending it, guys. You're soon going to see performance uplifts become a thing of the past.
I mean, this is not supposed to replace a 3080 12GB owner's card; it's still an uplift for the buyers they are targeting, which is the 3070 and below. There's a reason the 4090 has a huge gap this gen: every single review talked crap about the 3090 being a scam for being 15% faster than a 3080 at 2x the price. Maybe that's why Nvidia nerfed the lower tiers this gen, so reviews stop complaining. But guess what, the 7900 XTX was barely an uplift for the same price, so who's to blame here?
@@HosakaBlood Nvidia and AMD are just colluding at this point. The xx60 class generally has performance close to the last-gen xx80 class GPU, and this "RTX 4070" is that: basically what the 4060 Ti should be at minimum. So what Nvidia is doing is selling us, at best, a 4060 Ti for $600. It's a pathetic performance jump; it's barely 15% faster than the RTX 3070 Ti. And the VRAM has been long overdue. We should be getting 16GB of VRAM at this price at minimum, and realistically 4070 Ti performance at minimum for what is supposed to be a 4070.
Daniel, as usual, I completely agree with your assessment! I'll wait to see what kind of price/performance AMD gives us at 7700 & 7800 levels. I'd probably be most inclined to get a 7900 XT if it were at or under $700...
@@nipa5961 AMD's drivers are not better than Nvidia's for anything. For a few things, they are about as good, but if you use your card for anything besides standard raster gaming, you will start to appreciate just how far behind AMD's software is compared to Nvidia's. In pure raster, they've improved, but they still have issues like massive idle power draw or dysfunctional multidisplay support. To take just one example of the many software issues AMD still has, just talk to all the people who bought a 7900XTX to use for VR and then returned it because it either A) refused to recognize their headset, B) crashed upon launching any VR platform, or C) performed worse than a 3070.
@@nr1771 I've had very different experiences, it seems. AMD drivers were much more stable for me than Nvidia's. Also, a friend just "upgraded" to a 3060 and now has massive problems with his monitors not waking up properly, and is forced to blindly restart his machine a few times a day. Just one example. Nvidia also has no equivalent feature to Radeon Chill. Speaking of power consumption, RDNA was way more power efficient than Ampere. Strange how everyone seemed not to care last gen but brings it up since last fall. XD They both have their pros and cons, but in the end they are very equal. So again, sadly AMD is the only option right now, since all equivalent Nvidia cards are much more expensive and lack VRAM.
@@nipa5961 I'm glad you've had a good experience with AMD's drivers, but it's definitely not shared by a lot of people out there. All you have to do is look at any AMD forum and you'll see a lot of people talking about driver issues still. And I'm not saying this to fanboy Nvidia. I hate the way they've priced this generation of cards. I'm just saying that if you care about anything other than standard raster gaming, AMD still has a lot of issues (if all you care about is standard raster gaming, AMD is the better value and you should buy AMD).
For some reason 4070 footage looks laggy and riddled with stutters even though the frame times are very similar. Something wrong with the recording of 4070 results?
@@BlackJesus8463 1440p and 4K differ very little, and a monitor/TV upgrade costs a lot. Not to mention the 4070 can run most games at 4K 60fps; for 4K 120-144Hz gaming you will need a 4090, which is super expensive, and the 7900 XTX can match that too.
@@mr.cookie8265 They'd better be. They are on a significantly more efficient transistor node. The fact that this gen's 4070 can't consistently beat last gen's 3080 (12GB, but all 3080s should have been that for the price) in raw performance is really sad. And to top it off, Nshitia wants 100 bucks more for this card than last gen's 3070. It sucks. If this card matched the 3080 Ti like the 3070 matched the 2080 Ti, this wouldn't even be a discussion.
Like anything used, it's always a risky gamble. Most of those used 3080s listed on eBay will no doubt be ex-mining cards or will have had a fair amount of wear and tear. Having the safeguard and peace of mind of a 3-year warranty is in itself worth paying an extra $100 at their price range. Secondhand will always be cheaper, with the potential of getting a better bargain; however, there are always risks and cons involved. You could find yourself 2 months later with a dead GPU, no way to get a refund, and $500 out of pocket.
@@victorxify1 The 4000 series cards are also a lot cheaper than the 3000 series cards (at least in Europe): the 4070 costs about 600€, the 3080 10GB about 770€ and the 3080 12GB about 2270€.
Nobody talks about the huge efficiency upgrade: the same performance as a card that once sold for close to $1500, at a new, cheaper price of $600. Not only is the power cost lower, which pays for itself over time compared to an RTX 3080, but the lower temps on the newer architecture mean the 4070 will likely have more endurance than the power-hungry 3080! Just a thought I know is not popular, because all people worry about is raw performance, but I value efficiency and endurance more than raw performance!
DLSS and Frame generation is going to be the saving grace of the 40 series. It's just insane tech, and theoretically only going to get better and wider supported.
It's software. It's software they could push to previous generations to make them better if they wanted to. But they won't, and they lock it to their latest cards for profit. Unlike AMD, which lets even their oldest cards support their newest technology.
@@Vespyr_ Let's see how good FSR 3 will be. Currently FSR 2 is clearly behind DLSS 2, so I wouldn't expect wonders. I like that AMD releases it open source, but if it works better on Nvidia, you just can't help it.
@@froznfire9531 When a company patronizes a market in this manner, it is specifically targeting established customers of previous generations of their brand. New customers are unaffected by this exclusivity until the next model releases. Nothing stops them from back-porting this technology; they won't even do it by one cycle, just a few years apart. There is more to a company than performance. They did this with G-Sync too, until sales forced them to concede.
For everyone interested in why the bandwidth got smaller: VRAM is added to graphics cards as chips, and every GDDR6/GDDR6X/GDDR7 chip has 32 data pins to connect to, which is called a 32-bit-wide bus. A memory bus can be shared between 2 memory chips, but I don't think there are recent examples of such shared-bus systems (the 660 Ti would be an example). Usually you can simply multiply 32 bits times the number of chips, and you will get the memory bus width.
I have found the method used to give more VRAM without making the bus bigger: it's called clamshell mode. There is an x16 mode (two 16-bit channels) and an x8 mode (two 8-bit channels). The x8 mode is the clamshell mode: two chips share one 32-bit slot, so you can fit double the VRAM without using a bigger memory controller. You can still use the VRAM chips in parallel, but you only get half the bandwidth per chip.
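Putting these two comments together, the arithmetic can be sketched like this (the per-pin data rates are assumed typical G6X figures, so check a spec sheet for your exact card):

```python
# Bus width and bandwidth arithmetic for GDDR6/GDDR6X, per the comments above.
def bus_width_bits(num_chips, clamshell=False):
    # In clamshell (x8) mode two chips share one 32-bit slot, so doubling
    # the chip count doubles capacity without widening the bus.
    chips_per_slot = 2 if clamshell else 1
    return (num_chips // chips_per_slot) * 32

def bandwidth_gbs(bus_bits, gbps_per_pin):
    # GB/s = bus width (bits) * per-pin data rate (Gbit/s) / 8 bits per byte
    return bus_bits * gbps_per_pin / 8

print(bus_width_bits(12))                    # 384 -> the 3080 12GB's bus
print(bandwidth_gbs(384, 19))                # 912.0 GB/s at 19 Gbps G6X
print(bandwidth_gbs(bus_width_bits(6), 21))  # 504.0 GB/s -> 4070 (192-bit, 21 Gbps)
print(bus_width_bits(16, clamshell=True))    # 256 -> double the VRAM, same bus
```

This is why a narrower bus caps bandwidth no matter how much VRAM is soldered on: capacity scales with chip count, bandwidth with bus width times clock.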
I'm here in Brazil considering buying a 4070. Here the prices are very high due to the conversion of the dollar into local currency and the inhumane taxes. Congratulations on the particular analysis. One more subscriber.
I genuinely cannot understand people who call the 4070's 4K performance "not worthwhile at 4K". Like, this thing does better than a PS5 and Xbox Series X; do you absolutely need 120fps? And even then, most games will get over 60fps at 4K or 1800p... like...
@@emanuelacosta2541 There is a Chinese tech guy who undervolted a 4070 to 100W and took all the FANS OFF, and the highest temp was like 81C, while still maintaining 85%+ performance.
Nvidia thought the big L2 cache would be way more performant than it is, or they just want the 4070 to be a 1440p card and not a 4K card. Even with better RT cores and more Optical Flow accelerators, the 4070 isn't able to match the 3080 12GB in RT mode. That's ridiculous.
Have you considered doing any monitor reviews? I know you have a lot on your plate already, but it would be cool for someone to pick out some value monitor picks!
This is probably why Nvidia is giving the middle finger to gamers now lol. A LOT of comments last generation about the power draw, then they release a new generation with the same performance but half the power and the top comment is "last generation's performance at today's prices. Good job Nvidia" lol... I think some gamers just like to complain.
The RTX 4070 delivers nearly the same performance as the RTX 3080 while consuming 120 watts less power... not "half the power"!! But the 4070 is ready for "planned obsolescence" because of its VRAM!
Don't be disingenuous with the numbers... we can literally see them in the video. I haven't bought either generation because of Nvidia's shenanigans, so I'm not sure where you get that I'm "ignoring all the shit nvidia does". I'm also mature enough to give them credit for fixing one of the biggest complaints I saw against the 30 series.
The fact that they are banking on tricks to sell cards is simply silly. They artificially stop RTX 30 cards from using these tricks, because otherwise they would lose all credibility at that point.
I tested my 10GB 3080 vs a 12GB 3080 Ti; RAM-wise, the 2GB didn't make a difference at all. For reference, RE4 Remake can be played with the 10GB card all maxed using the 2GB texture option, while the 12GB card only lets you use the 3GB option (crashes at 4GB). That being said, it isn't relevant 🤷🏻‍♂️
If you want to know the real prices/values of the 4000 series Nvidia GPUs, just scale each down by one model:
RTX 4090: $1600 → $1200
RTX 4080: $1200 → $800
RTX 4070 Ti: $850 → $600
RTX 4070: $600 → $400
etc. Notice how the prices make perfect sense when you scale them down one model: the pattern is that Nvidia just scaled prices up by +1 model.
This guy is going places. The gimped VRAM bus of the 40 series will go down in history as the worst change Nvidia ever made. We need more RAM AND the same number of RAM chips. The 384-bit bus will make the 3080 far better at demanding 4K raster.
The thing is, you are right that the memory bus makes it faster at 4K. However, it's drawing 70% more power for 10-12% more performance, and as DLSS 3 becomes more prolific, this relatively small gap in performance won't matter as much.
@@GewelReal Yeah, 12GB isn't really enough anyway, but at least the 3080 12GB has a firm lead at 4K. I get the draw of Nvidia, but I'm going AMD next. Enough VRAM without selling body parts sounds good to me.
the 4070 ti and below does suck at 4k. But I wouldn't recommend these for 4K gaming even if the specs were the same as the 30 series. 4080/7900xtx and up would probably be the best buy for 4k. That's by design of course.
As I said before, Nvidia just threw all their R&D into the 4090 and then said screw it to the other classes of cards. This 4070 should have been a 4060 Ti at best. Improvements to the tensor cores and RT cores are one thing, but that improvement means jack if the memory bandwidth can't keep up.
Hello Daniel. Is it possible to make any benchmarks with AI tools? Like stable diffusion. 3080 vs 4070. I'm kinda new to AI, but it seems like nvidia is the way to go.
@@ladrok97 Saying you can't do AI on AMD is just a lie. Sure, it isn't as easy as CUDA, but there are DirectML and Vulkan implementations for a lot of projects. AI is also much more than "image upscaling".
@@Suilujz No, it's not worth getting an AMD card for AI. Too much work to get it running, slower speeds than Nvidia cards, and most tutorials on YouTube use software that supports Nvidia. It's a shame though, because AMD cards have more VRAM.
@@Suilujz Maybe it's a lie, but yesterday I wanted to test other upscalers to get 4x from 480x360, and the majority are blocked behind CUDA; brute-forcing it with my 6600 XT is pointless. I plan to upgrade to a 7900 XT (or maybe wait for the 8800), and maybe then brute-forcing that limit will work. But if someone wants to use AI, it's far easier going with Nvidia than AMD, and I suspect the "maybe 20% of it works on Vulkan" situation applies to most AI use cases.
@@ladrok97 I got my 6950 XT today; I just upscaled a 512x512 image by 4x in two seconds with R-ESRGAN 4x+. Don't know which one you're trying to use, but there are perfectly working ones out there.
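For anyone lost in the thread above, "4x upscaling" just means each axis gets 4 times the pixels. Here's a minimal pure-Python nearest-neighbour sketch of that dimension math (no GPU or AI involved; learned upscalers like R-ESRGAN 4x+ synthesize detail instead of repeating pixels):

```python
def upscale_nearest(pixels, factor=4):
    """Nearest-neighbour upscale: repeat each pixel `factor` times
    horizontally and each row `factor` times vertically."""
    out = []
    for row in pixels:
        wide = [p for p in row for _ in range(factor)]  # stretch the row
        out.extend([wide] * factor)                     # repeat the row
    return out

# A 2x2 "image" becomes 8x8; a 480x360 frame would become 1920x1440
# the same way, which is why 4x from 480x360 lands on 1440p-ish output.
img = [[1, 2],
       [3, 4]]
big = upscale_nearest(img)
print(len(big), len(big[0]))  # 8 8
```

The AI models argued about in this thread do the same size transformation but hallucinate plausible detail for the new pixels, which is where the GPU (and the CUDA/Vulkan/DirectML question) comes in.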
The 3080 12GB doesn't just have more memory and a wider bus to support that memory, it also has more CUDA cores, more Tensor cores, and a higher TDP. The 3080 12GB is just another in the long line of Nvidia examples where they named two things the same even though most of the specs differ. It's the exact same thing they tried to pull with the 4080 12GB vs 4080 16GB, except there they backed off and eventually called it the 4070 Ti. And it's not the same as the 4060 Ti 8GB vs 4060 Ti 16GB, where memory is the single difference between them, which is somewhat excusable.
Except the problem is it's basically impossible to find any 3080 12GB in existence. When you look up "3080 12gb" you get nothing but 3080 Ti results... and that's still quite a bit more money than the regular 3080.
I managed to cop a 4070 a decent deal below MSRP (for my country; EU, so all prices are fucked, basically) on launch day. Combined with the lower energy usage and the fact I don't use 4K, I think it was an okay option. I do agree with your conclusion though; not very happy with Nvidia, but for what it's worth the 4070 happened to align perfectly where I live.
To be fair to the 4070 released at MSRP, those 3080s were 2-3 grand if you could find them. Pretty sure the MSRP for the 12GB was like 1200 bucks anyway.
In the EU, during the shortages, I frequently saw the 3090 for a lower price than the 3080 (Ti). Ridiculous, and too bad I didn't screenshot some: 3090 at 1800 EUR, 3080 Ti at 2000+ EUR. I don't get how the same store could sell a weaker GPU for more money with a straight face.
@@zdspider6778 Sure, but in the same store? Like ComputerUniverse from Germany, which is quite a reputable company with decent pricing for the EU. That's why I said it's too bad I didn't screenshot those listings.
In general, comparing prices with last-gen products that came out in mid 2020 and into 2021 isn't really fair, because MSRP was a fantasy, and around 2021 the chip shortage started kicking in, so it probably cost way more to R&D the 40 series or the 7000 series. A lot of things went wrong to get us here.
Hi Daniel! I just wanted to say thank you for all the videos. 6 months ago I built my PC based on your tips and am still happy with the result now. I'm commenting now, though, because I wonder how much of your audience, like myself, only watches you for a month or so, gets the information they need and leaves straight after. That's why I figured an appreciation post is in order. Have a good one!
I don't care. To me it's probably the best feature to ever happen in the history of GPUs. I've been dreaming of the day it would finally be a thing for more than a decade. And I'm excited for it interpolating several generated frames per real frame in the future. Thanks to it, I might experience ultra-high frame rates in my lifetime.
Too bad the 3080 wouldn't have worked with my current PSU, so I went with the 4070 instead. I paid MSRP. Over my older GPU I only use about 30 watts more, but get over double the performance and VRAM.
Great, I was wondering why no one did a reliable comparison of those two till now. That makes my personal chart complete and reconfirms it for 4K with an AMD CPU:
850€ 4070 Ti new: 125%
650€ 6950 XT new: 125% (450€ used)
600€ 6900 XT new: 120% (450€ used)
600€ 4070 new: 100%
550€ 3080 Ti used: 118%
500€ 3080 12GB used: 112%
450€ 3080 10GB used: 104%
So if you don't care about ray tracing and have no power/heat limit, get a used 6950 XT (ideally with an AMD CPU for SAM). If you want ray tracing at 4K 60fps, get a used 3080 Ti: Nvidia does ray tracing better, the higher VRAM over the 4070 gives a further advantage with the high-detail textures of 4K/RT, and it's still better than team red even before DLSS. A more universal card.
I’m happy with my 4070… I upgraded from a 1070 from 2016 and it was the best value for me in Canada in 2023. It was the only sub $1000 CAD video card of the current generation.
That's a better reason than mine. I upgraded from an RTX 2060 to an RX 6750 XT for $410. I had no problem with it besides AMD's terrible encoder, and the Adrenalin software was sometimes buggy. Then I "upgraded" to the RTX 4070 within 3 months. I wasted over $1000 on GPUs in 3 months. I could have just bought a 4080/7900 XTX/used 3090 Ti instead of all of this. I guess it's the mental barrier of spending a thousand dollars at one time.
I don't know if frame gen is such a great feature if it adds ~20ms of lag. I'm not that fast; when you ask me something I sometimes answer after a few seconds, most likely with a "what!?", and yet I feel a massive improvement playing Forza on my gaming monitor with ~3ms versus my TV with ~40ms. It's nice that frame gen is added, but I wonder how much of a single-player title a game has to be to actually be enjoyable with the lag. Maybe Nvidia just wants to silently force-feed new gamers input latency so it can later move them smoothly onto its streaming platform ;)
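The ~20ms figure mentioned above isn't arbitrary: an interpolator can't display a generated frame until the *next* real frame exists, so the floor on added delay is roughly one real-frame time. A simplified sketch (real DLSS 3 latency also depends on the render queue and Reflex, which this ignores):

```python
def min_added_latency_ms(real_fps):
    """Lower bound on latency added by frame interpolation:
    the generator must hold each real frame until the following
    real frame is rendered, i.e. one real-frame time."""
    return 1000 / real_fps

print(min_added_latency_ms(50))   # 20.0 ms extra at 50 real fps
print(min_added_latency_ms(100))  # 10.0 ms extra at 100 real fps
```

This is also why frame gen feels best when the base frame rate is already high: the penalty shrinks as real fps rises.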
Most of the people complaining about the 4070 have a 20 or 30 series card. I upgraded to a 4070 from a 1080. It's $100 less than a 3080, and I don't like buying used cards. If I had a card that was a 3070 in performance or higher, I'd have waited another 3 generations to upgrade. If I had to upgrade my GPU every generation, I'd sell my entire PC and just buy a console. I wanted the 3080 when it came out, but now I saved $100 and have a much more energy efficient card that will hopefully last me longer than my 1080.
I'll admit I caved in and bought the RTX 4070, but before everyone jumps on me, let me clarify why. I had about its price in terms of budget, and I was well aware of the other GPU options. However, going older gen or AMD meant I would need to upgrade my PSU, and I didn't have the budget for that; without compromising on performance, the 4070 runs perfectly on my 550W PSU. I also appreciate that Nvidia's cards and software are better supported for the creative work I do for uni (Blender, 3D, Photoshop), while also giving me 1440p gaming at decent frame rates. With the added benefit of DLSS 3 frame gen, as a user I noticed smoother gameplay without noticeable artifacting, which is just great for the experience. I understand the card gets stick for its price, and it isn't justifiable for some upgrades, but going from an iGPU to real workloads on this GPU was immense. I think the VRAM could be an issue; however, Nvidia has just shown new texture compression that is more detailed and has smaller file sizes, so I reckon that will help all of their GPUs and is why they have refrained from upgrading VRAM by large amounts. I may be a singular case where the 4070 made sense as an upgrade, but I think you would agree it was the best option for me. Great to see Daniel continue to dive into how it compares, and thanks to those who responded to my last comment on one of these videos with upgrade suggestions.
The 4070 is actually a great option as a GPU upgrade from two generations prior. The problem is the price, which is still very much inflated at $600. Others are right that it should be around $400, but since wealth is relative, that number doesn't seem as high to some as it does to others.
Why admit you caved in? Needing it for Blender and Photoshop were valid reasons. Everything else you said, though, makes me believe you're full of it or won't be very good in your major. "With the added benefit of DLSS 3 I noticed smoother gameplay without noticeable artifacting"? You're just regurgitating words without knowing what you're saying. Just how much do you think texture compression will help an overpriced, gimped GPU with low RAM and bus width? Nvidia already has texture compression tech way ahead of AMD, and using AI to "compress" isn't really compression; it's using AI to add extra details that weren't there. I'm not flogging you for choosing a 4070 because of your PSU, but that was just another reason you added that doesn't make a lot of sense if you needed it for studies. Then again, add up all the other reasons aside from studies and something isn't right here.
@@Dave-kh6tx All I meant by "caved in" was that I made the jump to get it, finally deciding on what I was getting; does that make it clearer? I thought needing to spend money on a higher-wattage PSU for less efficient cards was a fair variable to factor into my options (I've saved for a year and was trying to balance value and performance for what I'd use it for). I'm not a computer science student or a super experienced builder, but I have followed along with the latest news etc., which doesn't have anything to do with my degree, so I don't think that comment about my competency in my degree was necessary. However, given my lack of experience compared to some, I will apologise if I've used a term or fact wrongly. I just wanted to share my experience and why I decided to go with the 4070. I do appreciate the points you made about price and compression, but I think my views as a user are justified when playing games and experiencing DLSS 3. Just my opinion, and you have the right to yours :)
The 4070 right now is massively cheaper than the 3080, even the 10GB version. Add the fact that you get DLSS 3 and massively lower power consumption, and it's a no-brainer to choose the 4070. I'm in the same boat: if I consider upgrading now with my old 550W PSU, I will end up paying $150 more for the 3080 at the very least, and I can't even find a reasonably priced 3080 12GB anymore. The way the 3000-series cards were priced made them such poor value that the new 4070 ended up looking good despite also being poor value. Talking about performance per dollar based on MSRP is just not useful at this point.
The other advantage: availability. Because not even scalpers want them. Let that sink in. Up until now, they were buying them from retail and selling them at insane marked up prices. Now there's no demand for them. People caught on that these are actually 60-class cards in disguise, sold at 80-class prices. Nvidia done fucked up with this generation.
@@NamTran-xc2ip Frame generation isn't the magic bullet Nvidia wants you to think it is. It doesn't work with VSync. You need a G-Sync monitor (which adds ~$200 to the base price of a monitor), otherwise you suffer terrible screen tearing. It causes ghosting and all sorts of artifacts. It introduces input lag (because you're always 2 frames behind), which sucks ass for competitive games. And it doesn't work with all games; the developers have to add it as a feature, because it needs motion vectors and such, which isn't something it does automatically.
@@zdspider6778 Competitive shooters don't need frame generation anyway. To get screen tearing you need to exceed the refresh rate, and if you do, you don't need frame generation. Ghosting, artifacts... I'm not pixel-peeping to spot those. I'd rather have 100fps with these "artifacts" than 50. The 40 series is insanely efficient. Whatever you say, man.
So glad at least YOU are telling AND showing the truth about Frame Gen: it's motion smoothness, and it will NEVER have the latency of REAL FRAMES at the indicated FPS counter.
Another positive of the lower energy consumption is SFF PCs. I have a sub-10L case where I did a straight swap between a 3080 and a 4070, and the reduction in heat and noise is remarkable: from 82C to 68C in 3DMark Time Spy 4K benchmark runs. It's an edge-case scenario, but it's certainly worth considering if power and heat are factors.
I'd sell my 3080 10gb for a 4070 any day. An overclock closes the gap at 4k, there's 2gb extra VRAM and my room will stop being a sauna. Plus frame generation is actually pretty cool - I use it a lot on my 4080. Power consumption is why I sold my 3080 ti and never kept any 3090 ti. BTW I own a computer shop, which is why I get to play with a lot of GPUs.
DLSS 3 vs Frame Generation: Nvidia is branding Frame Generation as DLSS 3 while also renaming DLSS 2.x.x to 3.x.x. So, technically speaking, DLSS 3 works on the 3000 series, but Frame Generation does not.
I appreciate this video. I have an EVGA 3080 12GB FTW Ultra Hybrid and wasn't sure if it was going to be worth moving up. Even though this isn't a comparison between the 4070 Super and my card, it gives me something comparable to work with. I think I will wait till the 50xx series comes out, as you have shown that the big jump would be the 4090, and if I am going to get a 4090 I will get a hybrid... and, well, wait for the price to come down on that.