4070: same or better performance while drawing more than 100 watts less power... The RTX 4070 pays for itself within a year: a lower electric bill and a cooler-running PC.
The $399 1070 performed like a 980 Ti, and even the 3070 performed like a 2080 Ti. The RTX 4070 feels like a 60 Ti tier card at $600. Performance per dollar has moved really slowly over the years.
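The perf-per-dollar complaint above can be made concrete with a quick sketch. The performance index values below are rough illustrative assumptions (each xx70 card set against the previous-gen flagship it roughly matched), not benchmark data:

```python
# Rough performance-per-dollar sketch across xx70-class launches.
# Performance index values are illustrative assumptions only:
# 1070 ~ 980 Ti level (baseline 100), 3070 assumed ~2x the 1070,
# 4070 assumed ~30% over the 3070 (~3080 level).
cards = {
    "GTX 1070 (2016)": {"price": 399, "perf": 100},
    "RTX 3070 (2020)": {"price": 499, "perf": 200},
    "RTX 4070 (2023)": {"price": 599, "perf": 260},
}

for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['price']:.3f} perf/$")
```

Under these assumptions perf/$ jumped ~60% from the 1070 to the 3070, but only ~8% from the 3070 to the 4070, which is the "moved really slowly" point in numbers.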
It's because AMD isn't competitive with Nvidia. Nvidia will try to milk out as much money as it can, and AMD will follow along because it benefits them too. Nvidia simply dictates the consumer GPU market.
@@lifedroid But not at $600; maybe $500 or $550 would be interesting, especially as an upgrade. Otherwise everything about the 4070 is decent; it's just a bit of a shame it can't even beat the 3080.
An RTX 3060 Ti outperformed a 2080 Super two years ago at just $399. And here we are now: an RTX 4070 matching, and sometimes underperforming, an RTX 3080 at $599. It only pulls drastically ahead with the help of Frame Generation, which most games don't even have. What a time to be alive.
Plus frame generation is useless in multiplayer. One more thing: I think NVIDIA is paying lots of cash $$$$$ to certain companies or people to support them in the RU-vid comments section. Anyone trying to defend NVIDIA is either paid or a complete disgrace. A decent gaming PC costs $3000 in Australia; that's most people's average monthly income before tax. It used to be half a salary. In third-world countries it's no longer affordable for most people. With this so-called inflation we are just making rich people richer.
You're wrong. The price is now significantly lower than the 3080's was during the crypto boom. Performance with DLSS 3 is about 50% higher. Power consumption is about 120W lower, and the card runs cooler and needs less cooling, which makes it suitable for compact builds.
Knowing the COVID/crypto era, you would have paid $1000-1200 for this card in the past. Now that we've skipped the idiotic 3000-series pricing, this sounds reasonable, especially for people upgrading from GTX cards.
@@NostalgicMem0ries Oh god! This card, performing like a 3080 at less than 200W, would have been the king of crypto. Nvidia did their R&D on these cards during the crypto boom, thinking power-efficient cards would earn them a lot of profit from miners, but the boom didn't last!
@@pinakmiku4999 Yes, frankly, most gaming cards, even if they draw 400 watts, only run at peak for small amounts of time, maybe 3-4 hours a day even for hardcore gamers. So for gamers the wattage issue is mainly a PSU concern. For miners it's a different story.
If you're buying your first card, or you have a 1070/1080 or 2070/2080, then the 4070 is a great card. If you have a 3070 or 3080, it's probably best to wait for the 5000 series. My son has the Asus 4070 TUF OC and my daughter has a Palit 3070 GamingPro; both play 1440p on Ultra/High, no worries.
@@KelvinKMS It is not running the game faster. It is generating fake frames that have nothing to do with the game running faster. It makes things look smoother, but in reality they're not: latency is higher, and you can't interact with the fake frames, since they are made by the GPU, not the game engine.
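The latency point above can be sketched with a simplified model. The numbers are illustrative, and this deliberately ignores Reflex and pipeline details; it only shows why displayed FPS and responsiveness diverge when every other frame is interpolated:

```python
# Simplified model: frame generation doubles displayed FPS but the
# engine still only samples input on real frames, and interpolating
# toward the *next* real frame holds it back by roughly one frame.
# All numbers are illustrative assumptions.

def frame_time_ms(fps):
    """Duration of a single frame in milliseconds."""
    return 1000 / fps

base_fps = 60                       # what the engine actually renders
displayed_fps = base_fps * 2        # one generated frame per real frame
input_rate = base_fps               # input still tied to real frames
added_delay_ms = frame_time_ms(base_fps)  # ~1 real frame of extra queue

print(displayed_fps, input_rate, round(added_delay_ms, 1))
```

So the counter says 120 FPS, but the game still reacts at a 60 FPS cadence, plus roughly a frame of extra delay, which is the commenter's point.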
Fewer cores and less memory bandwidth... for someone who needs the GPU for productivity, the 3080 might still be the better choice. Also, the 3080 was advertised as a 4K card, while the 4070 is being advertised as a 1440p card, and it is slightly behind the 3080 at 4K in that regard.
Hopefully the 4060 Ti comes with 12GB of VRAM. They should give the RTX 4050 8GB, the 4060 10GB, and the 4060 Ti 12GB. That would be somewhat better, but they're probably not going to do that, and on top of it the 4060 will likely be $400 and the 4050 $300. I can see it already.
@@BladeCrew You are too optimistic. I have heard the 4060 will have 8GB of VRAM and the 4050 6GB. If that happens, these cards will be DOA. As it's Ngreedia, anything can happen.
Which is also quite disturbing: the tech market should ideally be climbing toward normalizing 4K, but instead it seems entry level will be eternally stuck at 1080p (due to horrible VRAM), while 4K-capable cards stay enthusiast-level forever and ever.
@@PrashantMishra-kh1xt This is why 1080p is still the most popular resolution. 1440p 144Hz monitors are approaching price parity with 1080p 144Hz monitors. If budget GPUs had shipped with 12GB of VRAM even last gen, we'd already be seeing 1080p decline in favor of 1440p.
@@anuzahyder7185 ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-Z-DccvWdTqE.html It took me a bit to find a comparison that wasn't against the 980 Ti, but you can see the 1070 always gaining, I admit, a minimum of 15% (not 20), up to a straight 100% in some games like Forza. In general the difference was way more noticeable than what we got here.
@@2067792332977602 Here you are comparing new games years later. I had a Gigabyte 980 Ti Gaming, and when it went bad they gave me a 1070; the 1070 was around 5% slower. The 3070 was as fast as the 2080 Ti, but keep in mind the 2080 Ti wasn't even that fast; the 3070 was around 30% faster than the 1080 Ti. Same goes for the 4070: it's around 30% faster than the 2080 Ti. Rather than hating on Nvidia, get your analysis straight.
@@anuzahyder7185 Those are more recent cards. The 20 series was pretty much the skip generation, where most people who already had a 10-series card stuck with it because the gains were so small. The 30 series was only a relatively small step over the 20 series, and the same goes for the 40 series over the 30 series. But the prices did not stay linear at all. I'm talking about the generations when a new 70 was clearly superior to the previous 80, and you bring up the ones where it wasn't, like your reading comprehension went out the window. And that personal anecdote of yours would have great value, except that if you look up any benchmark comparison online you can clearly see the 1070 exceeding the 980 by a wide margin in games, except LoL and Rocket League. How about next time you come with facts rather than "trust me bro, it happened to me".
2.5 years later and you can get 3080 performance with 2GB extra VRAM and 100W less power consumption for £50 less. Thanks, nVidia, for saving me money! The 3080 will be another 5+ year card, just like the 1070 it replaced :)
It's 2.5 years later and the PS5 and Xbox Series X are still the same price... electronics prices don't go down if there is enough demand for them. The only tech that still reliably drops by significant amounts these days, if you're OK with waiting, is TVs.
The 4070 is not a bad card either. In my country special pricing on the 4070 is quite common, so I can get it for around $520. I purchased one, and most high-end games run well. And considering the heat and power the 3080 puts out in the summer, the 4070 is a much better choice.
Did you calculate your energy consumption over a year? I don't think so. The savings are laughable. I got my 3080 for 300 CHF, so 550 for a 4070 seems very high.
@@DJCREEDHDSQ Did you calculate how much hotter the 3080 runs than the 4070, which then (if you have air conditioning) affects how much cooling you use, which runs up your bill even more? That's what OP was implying when discussing heat. I don't think so.
I failed to find a 3080 at MSRP during its two years on the shelf. It was always around €800, even after GPU prices fell at the end of last year before the 40 series launched, and even after that. Only now in spring did it drop, but that's because people were expecting the 4070 to be better, not the same.
NVIDIA has been making their cards weaker every generation. I was expecting the 4070 to be a little worse than the 3080 Ti, but definitely better than the 3080. Still, the 4070 will probably surpass the 3080 in time with driver updates.
It is bad. Historically we would get much better performance for less money. Remember that the 3070 matched the 2080 Ti; now the 70-class card only matches the 80 non-Ti variant.
@@KelvinKMS DLSS is a feature that depends entirely on a game's development team, and only a handful of games support it. Just admit it: you were sold a lie.
@@KelvinKMS Don't use DLSS 3.0 as a crutch like Nvidia does to blow the statistics out of proportion. DLSS 3 performance is not the same as having those actual frames, and while it is very impressive, it is not enough to sell the 4070, especially considering that AMD will make FSR 3 open source with frame generation too, meaning 3080s will soon have that feature as well.
12GB is enough for 1440p or below. Just because a few unoptimised turds like The Last of Us cropped up recently doesn't mean you should rush out and upgrade. Don't buy unoptimised games.
In Spider-Man, the 4070 12GB loads 1.5GB more textures into its VRAM buffer and gets 3 FPS more than the 3080 10GB. In The Last of Us, the 4070 12GB loads 2GB more textures into its VRAM buffer but gets 1 FPS less than the 3080 10GB. So don't let channels like Hardware Unboxed fool you: if these new games crash, it's not due to VRAM overflow but due to them being unoptimised and not yet properly patched.
The new RTX 4070 is very efficient, consuming less than 200W. If that doesn't matter to you, you can find the RTX 3080 cheaper used, and AMD's 6950 XT consistently smokes the RTX 4070 in most games at around the same price. You may also want the RTX 4070 for DLSS 3 with frame generation, but only a handful of modern games support it.
Only a handful of games support DLSS 3 now, but practically every new game will. People said the same thing about the original DLSS, and now almost all newer games have it.
It costs me about $60 a month to run my 3080 here in Denmark, so power is definitely something people should weigh almost as much as performance.
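The running-cost argument in this thread is easy to sketch as back-of-the-envelope arithmetic. All inputs below are illustrative assumptions (a Danish-level electricity price, 4 gaming hours a day, typical board powers for each card), not measured values:

```python
# Back-of-the-envelope GPU running-cost sketch.
# All numbers are illustrative assumptions, not measurements.

def annual_cost(watts, hours_per_day, price_per_kwh):
    """Yearly electricity cost of a component drawing `watts` under load."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

PRICE = 0.45   # $/kWh, assumed Danish-level price
HOURS = 4      # assumed gaming hours per day

cost_3080 = annual_cost(320, HOURS, PRICE)  # ~320W typical 3080 board power
cost_4070 = annual_cost(200, HOURS, PRICE)  # ~200W typical 4070 board power

print(f"3080: ${cost_3080:.0f}/yr, 4070: ${cost_4070:.0f}/yr, "
      f"saving ${cost_3080 - cost_4070:.0f}/yr")
```

Under these assumptions the gap is on the order of $80 a year, which is why the savings look huge at Danish prices and laughable at cheap-electricity prices: the answer scales linearly with both price per kWh and hours of use.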
Have you tried undervolting it? I can help you if you want. I reduced the power draw of my 3070 by 50W without losing any performance, and temperatures are 8°C lower as well.
I've undervolted my Gigabyte Extreme 3080 Ti to 80 percent power draw. It runs cooler and performs better. Undervolting is the new overclocking lol. Crazy times.
The 4070 released at a weird time for me, since I built a new PC last week with a 3080 in it that cost £749. Seeing that the 4070 I want is going for £679, I've decided to return the 3080 and get a 4070 instead. Almost identical performance, newer, DLSS 3 + frame gen, much better power efficiency, and £70 cheaper makes it seem like a no-brainer for me.
I get you, man. I rebuilt my PC 6 months ago and bought an Asus RTX 3080 for €870, and now I see the RTX 4070 being sold for €660. It pisses me off how they play with the prices. I'm happy that my card can last me a few years at least...
That's totally irrelevant... the better question is whether this is done in silent mode, where you can't hear your PC, or with the fans sounding like a jet taking off.
@@divacroft1034 Irrelevant to kids who live with mommy, maybe. I always find it hilarious that people cry about cards being too expensive while saying "much lower power consumption means nothing" in the same breath. It's called an electricity bill, mate.
Seems like the RTX 4070 performs similarly to an RTX 3080 while running 800-900MHz faster, drawing 100-120W less power, carrying 2GB more VRAM, and costing $100 less.
If you already have a 3080, upgrading isn't worth it since the difference is marginal, but if you're buying a new card, definitely get the 4070 over the 3080. On power alone the 4070 is already the big winner.
I would sell the 3080 and get a 7900 XTX, which destroys the 3080 Ti, even the 3090 Ti and 4070 Ti, and is as good as a 4080 or better in newer games like MW2. So ask yourself why buy a 4070 when you can buy a 7900 XTX that performs better than a 4080 and is still a beast even with RT. Even a 6950 XT destroys the 3080/4070 and is cheaper, I think; it's slightly slower than a 4070 Ti but faster than a 3090 Ti. Go for the 6800 XT or 6950 XT: they have 16GB of VRAM compared to the 4070's 12GB. Don't go for the 4070; AMD performs better in frames per dollar. The 4080 is a beast but too expensive. Personally, I would wait for the next generation, where I expect the 5070 Ti to beat the 4090 and sit slightly below a 4090 Ti. My guesses: 5080 at $1299; 5070 Ti at $999 = 4090; 5070 at $699 = 4080; 5060 Ti at $599 = 4070 Ti = 3090 Ti. That's $699-$999 for a next-gen GPU against current-generation cards costing $1199-$1599 (a $500-$600 difference). Go for the 4070 Ti/4080 in this year's Black Friday sales, and next Black Friday go for the 5070 or 5070 Ti if you haven't bought already.
@@marrow94 People are so dumb: they buy an 8-12GB VRAM GPU, but as soon as they play the newest titles their VRAM gets capped, since those games require over 12GB, and the ratio of ultra graphics settings to VRAM you get is low. This is a scheme from Nvidia that people don't realize: they do it on purpose so consumers buy their higher-end cards with more VRAM, while AMD gives you 16GB, which is more future-proof. For 1080p, 8-12GB is fine, but for 1440p or higher, like 4K, you will need 16GB, and games are getting more graphically advanced with newer engines. 8GB is very low, 10GB is low, and from 12GB you are good, but some new games already go over 12GB, and in the future there will be more of this.
Or get a 6950 XT for significantly better performance and 4GB more VRAM for only 50 bucks more... or a 6800 XT for the same reason but 50+ bucks less...
What I like about the 4070 is the lower temps and power draw. I'm hoping to go for the Asus dual-fan option, as I like that it doesn't take up much space, and I don't want to get a big new case for my PC.
Fun fact: the 4070 draws about 80-115 watts less (closer to 100 watts), with the same performance, the same price, and runs about 5 degrees cooler. VRAM usage is about 500-750 megabytes higher (though this might be an error). My take: keep the 3080 if you have it, or sell it and buy a 4070 for the better power draw and lower temps. It's not only about performance; you need to look at the whole picture before calling it bad. The 4070 would be better if it were $50 cheaper, but it's reasonably good if you want a 3080 from a newer generation with better power draw and somewhat lower temps. (Imagine a $650 4070, that'd be good, ain't it?)
Great card: 3080 performance with 100-120W less power and more VRAM, which is great, but the pricing is just wrong. This card should be £100 cheaper at the very least. I've got a feeling many people aren't paying attention to the power draw lol.
Most of the people who don't pay attention to their power draw probably have their parents paying the bills. Even though I could afford a 4090, I wouldn't buy one, because those electricity bills soon add up.
No mention of enabling DLSS 3 frame generation? That would help increase frame rates on the 4070 in Cyberpunk and some other games, which is the whole reason to buy a 4000-series card.
Seriously? Such a joke for the 40 series. When the 1060 came out in 2016, it kicked the 980's ass. Now the 4070 just barely catches up with the 3080, a 3-year-old video card... 3 years... 3 YEARS...
Exactly. Everyone here is just looking at performance, but reducing the power draw by 100W for the same performance is actually an insane achievement.
I upgraded from a 1070 to a 4070 recently. I know it's not a perfect card and that Nvidia's decisions can be criticised, but man, I really do feel the difference in performance :D
The problem with the 4070 is that its 192-bit bus and 504GB/s of bandwidth are much lower than the 3080's 320-bit bus and 760GB/s, although the 4070 has a 36MB L2 cache versus the 3080's 5MB. A big cache doesn't fully make up for bandwidth: the 750 Ti, with its 2MB cache and 86GB/s of bandwidth, couldn't match the GTX 760's 192GB/s and 512KB cache, and when the 750 Ti first came out in 2014 it couldn't even handle the GTX 660 with its 384KB cache. Only later, after 2020, when Nvidia gave up driver optimization for Kepler-based cards, did the 750 Ti surpass the 760.
This isn't a problem: most of that bandwidth went unused with the decline of fixed-function MSAA modes, and the larger L2 cache on Lovelace mitigates the lost bandwidth anyway.
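The raw bandwidth figures in the exchange above follow directly from bus width times effective memory data rate. A quick sketch, assuming the commonly quoted effective rates of 21 Gbps GDDR6X on the 4070 and 19 Gbps on the 3080:

```python
def mem_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bytes moved per transfer
    (bus width / 8) times effective transfers per second (Gbps)."""
    return bus_width_bits / 8 * data_rate_gbps

# RTX 4070: 192-bit bus, assumed 21 Gbps GDDR6X
print(mem_bandwidth_gbs(192, 21))  # 504.0 GB/s
# RTX 3080: 320-bit bus, assumed 19 Gbps GDDR6X
print(mem_bandwidth_gbs(320, 19))  # 760.0 GB/s
```

Both results match the 504GB/s and 760GB/s figures cited in the thread, so the disagreement is not about the numbers but about how much the larger L2 cache compensates for the narrower bus.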
On all these games, running an Asus ROG Crosshair X670E Hero + Ryzen 9 7950X OC + RTX 4070 OC, I cap at 144 FPS in both 2K and 4K with ultra and max settings. In intense moments of gameplay, where a lot was going on, it rarely dropped below 130.
People are like, "But DLSS and Frame Generation." No. Just no. Don't start justifying rip-off cards because some tech to cheat in extra frames is present. Normal rasterized performance has to increase decently as well for a card to be worth an upgrade.
I miss the days when a 70-series GPU cost only $350-400 MSRP (GTX 1070). This is hellishly expensive. I've seen this trend since COVID: Nvidia is hiking prices like hell, and $700 is nowhere near a consumer-friendly price. And when they release the 50/60-series GPUs, most of them will go to scalpers, and the end customer suffers. 😢
It started in 2018 with the 20 series. I'll never forget when the GTX 1070's MSRP was $399 and you could regularly get one for $350. It also performed as well as the GTX 980 Ti, the best of the prior generation... I miss it.
The more I see the 4070, the more it makes sense. Yes, it lacks true raster progress, but it's very efficient, can be found in very small form factors, and has good new tech (hoping FG gets better). Sadly for AMD, it also trumps their cards in CUDA productivity workloads. The 4070 looks pretty fine, except for the classic RTX tax!
@@ledexyt4917 I went from a GTX 960 4GB to an RTX 4070 12GB, and the difference is night and day. Prices have also dropped from $600 to around $500. I'm very satisfied with my purchase.
Why? Why buy cards that can only play a fraction of games well or fully utilize a game's features? Its RT solution is mediocre, its rendering in 3D productivity tools is mediocre, and its upscaling is only so-so.
Idk man, it's good that it's using 100 watts less, but I think giving it an extra 50 watts would have made it much better and worthy of being a 70-series card.
The tendency was for the XX60 to outperform the previous generation's XX80, as with the 1060, which matched or beat the 980, and the RTX 2060 against the 1080. This time that trend is not met, and for that price you're better off acquiring a used RTX 3080.
Hey, guys! Have you forgotten that the "RTX 4070 Ti 12GB" was originally going to be called the "RTX 4080 12GB"? By that logic, the so-called "RTX 4070" in this video should really be called an "RTX 4070 Ti 12GB", and the "true" RTX 4070 still hasn't launched yet... lol
I can't believe the same company that released the RTX 3070 for $500, matching the RTX 2080 Ti (the strongest card of 2019) with better ray tracing and DLSS, released the RTX 4070 for $600, much, much worse than the strongest card of 2021, the RTX 3090 Ti! It can't even beat the RTX 3080! I can't believe what you did, Nvidia!!!
Should've gone for a 6800 XT / 6900 XT / 6950 XT. All are new, around the same price or less, and perform equal or better depending on which you get, with 4GB extra VRAM. Plus FSR 3 is coming out and will be supported by them.
I run my 3080 undervolted, usually around 850mV and a ~1850MHz clock, with memory boosted by 1000. That reduces thermal load and fan noise, and in some games gives a slight FPS increase. That said, the stock thermals of the 4070 are impressive, and I'd love to see what that card could do undervolted. Am I going to swap my 3080 for a 4070? No; a minor increase, not worth it right now. Maybe a 4070 Ti if it goes on a really good deal. But yeah, if you're building now, or buying a prebuilt, it's best to go for the 4070.
Hogwarts Legacy already eats over 12.5GB of VRAM at 4K, while the 4070 just released with 12GB, not to mention the unreleased 4060 coming with a mere 8GB. Already outdated for certain games.
@@fwef7445 By no means a "moot point". Some people prefer to run at 4K, and after spending $600+ on a GPU they should be able to without having to trade graphics settings against stuttering, when last-gen 6800/6800 XT cards from two years ago already came equipped with 16GB of VRAM. Nvidia are Greedibia at this point, and I have a 3080, so I'm by no means biased. They always cheap out on VRAM: the 3.5GB GTX 970, the 3GB GTX 780 Ti, and now the 8GB RTX 4060, a 2023 GPU with the same amount of VRAM as the RX 480 from the 2016 era. Yeah, great consumer-friendly practice. Their competition always provides more, but they get away with it. When new Unreal Engine 5 titles come out in the near future, a 12GB VRAM buffer will be nearly full while 2-3-year-older cards from the competition won't even break a sweat. But hey, if it makes you happy, then by all means.
For everyone saying the 4070 is not worth it: here in Europe it's impossible to find a 3080 for $700 today; it never went under about $800. That makes the Founders Edition 4070, which is always available at $599, a really good choice: it's new-gen, its performance can still improve with drivers, and it has new technologies like DLSS 3.0. So yes, 4070 >> 3080: better price, overall better performance, and less power usage.
Yup. But now I'm thinking: if 12GB is enough for 2024-2025 games, maybe get the 16GB 4060 Ti instead, which comes in July. Save some money ($100) and some watts (~40); performance will be 3070-level instead of 3080-level, but you'd have frame generation to give you FPS where you need it, and since frame generation requires extra VRAM of its own, FG will be much more useful on a 16GB 4060 Ti than on a 4070.
Y'all know something crazy? IPC from the 4790K (22nm, 2014) all the way to the 13900K (Intel 7, 2022) has almost doubled, but not quite, depending on the benchmark. Crazy how difficult it is to increase IPC even when you can pack in five times the transistors.
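Taking the comment's rough "almost doubled over eight years" claim at face value, the implied compound annual IPC growth is easy to work out (the doubling itself is the commenter's assumption, not a measured figure):

```python
# Implied compound annual IPC growth if IPC doubled from the
# 4790K (2014) to the 13900K (2022). The 2x figure is the
# commenter's rough claim, used here only as an assumption.
years = 2022 - 2014
annual_growth = 2 ** (1 / years) - 1   # solve (1+g)^years = 2
print(f"~{annual_growth:.1%} IPC gain per year")
```

So even a clean doubling over that span works out to only about 9% per year, which is the "crazy how difficult" point in numbers.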
I like that you use Nvidia DLSS in Quality mode, but if you're going to use it because it gives higher image quality than native resolution plus higher performance, then you should enable it in all games; I noticed that in the eighth game you didn't activate it and left it at native 2K. I was also pleased to learn that I can play at 4K with this card using DLSS 3, sometimes in Performance mode, or even Ultra Performance.
It's a great card; too bad it's only going to last 2.5 years before the same thing that happened to the RTX 3070 happens to it, since Unreal Engine 5 games are going to require 16GB of VRAM.
@@rodolfotroncoso3850 Xbox and PS5 will use 12GB, all their available memory, something they didn't do in the past because the PS4 and Xbox One dragged the current generation down. Note that PC and console are different: the PC will need 16GB versus the console's 12GB, because on PC you can play at higher settings to take advantage of the video card.
It should outrun the 3080; it doesn't always, and when it does it's likely cheating. The 3080 is the better of the two, and anyone who thinks the 4070 is worth it isn't too smart! The previous-gen 3070 was 20-30% faster than its previous gen's 2080 Super; now it's -5 to a few percent faster for $600. A pure rip-off!
The RTX 4070 at $599 is pretty weird; let me explain. A used RTX 3080 costs $400-500, consumes ~300W, and is as fast as the RTX 4070. The RX 6950 XT, on the other hand, costs $50 more than the RTX 4070 but is significantly faster at raster (though slower in RT) and has 16GB of VRAM. Then the RTX 4070 has 12GB of VRAM, better RT than the RX 6950 XT, consumes ~200W (compared to 280-300W), and has frame generation and better software support. At this point I don't know what's better or worse.
The 6950 XT is only $9 more right now, and in some countries it's less than the 4070. Nvidia stopped making 30-series cards a while ago so people would buy the 40 series, which is why 3080s aren't $499 anymore like a couple of months ago. Also, if they stop making the 4070 FE like they did the 4070 Ti, in less than a week the 4070 will be $650 versus the ASRock 6950 XT at $609, which beats it in everything except efficiency.
100 watts of difference between the two cards in the 4070's favor, with very similar FPS performance, and in certain games the 4070 ahead. My opinion: get a second-hand 3080 for €400 or go straight for this 4070.
And people say the 4070 is a bad GPU... lol. 100W lower power usage on average, the same FPS as the higher-tier 3080, 12GB of VRAM, $100 cheaper, and frame gen on top... The 3000 series was a disaster, and the 4000 series finally makes sense; just wait a month or two till prices normalize closer to MSRP. This GPU can hit 4K 60 FPS in most games, and with the new upscaling tech it will last many, many years, even if you eventually need to drop to 1440p 70-90 FPS. It's an amazing GPU.