
3080 12GB vs 4070: The Ultimate Comparison 

Daniel Owen
Subscribe · 195K subscribers
244K views

Published: Aug 27, 2024

Comments: 1.2K
@abdullahjester9798 · 1 year ago
Last generation's performance at today's prices. Good job, Nvidia.
@lukasr1166 · 1 year ago
Where I live, the 3080 12GB barely exists and is far more expensive than a 4070 in the few places where you can buy it. So the 4070 isn't all bad.
@puffyips · 1 year ago
@lukasr1166 Still not worth it.
@lukasr1166 · 1 year ago
@puffyips It's certainly a better deal than the overpriced 3000 series; 3070s, 3080s, and 3080 12GBs all still cost way too much.
@raulitrump460 · 1 year ago
@lukasr1166 The 4060/4060 Ti should get 3080 performance, not the 4070.
@Gimpy17 · 1 year ago
@Lukas R Buy AMD.
@pf100andahalf · 1 year ago
The best thing about the 4070 is that it's driving down prices of used 3080s.
@Kage0No0Tenshi · 1 year ago
Yeah, that's why I like the 4070 🤣
@pdmerritt · 1 year ago
Is it? I sure can't tell.
@kaimojepaslt · 1 year ago
@pf100andahalf So what if it went down $50? It will add an extra $50 to your power bill, lmao, and that's every month. You kids don't use your brains properly.
@astreakaito5625 · 1 year ago
Barely, and I don't think it's worth it, because frame gen is a big deal; you definitely want that option just in case.
@pf100andahalf · 1 year ago
@astreakaito5625 What I want first is price reductions. I would much rather have a used $400-450 3080 than a new $600 4070: better performance for $150 less, and soon to be $200 less. So far I haven't needed frame generation, since I play at 1440p and the 3080 is a beast of a card.
@sbtg2688 · 1 year ago
Guys, remember how Nvidia initially launched the 4070 Ti as the 4080 12GB? They were originally planning to release this 4070 as a 4070 Ti and charge us even more.
@BlackJesus8463 · 1 year ago
Would've been the same result... nobody buying. Jensen has lost touch with reality.
@shanez1215 · 1 year ago
That's probably where that $750 rumor came from, lol.
@casweeden · 1 year ago
The 4070 is really a 4060 Ti. The 4070 Ti is really a 4070. They just wanted to move their product stack up a belt loop and hope no one noticed. It didn't work for the 4080 "12GB" and they had to pivot, but they still got what they wanted with the remainder of the stack.
@Tripnotik25 · 1 year ago
AMD will set things straight and release a 7700 XT with the performance of a 69xx; we already have the $800 7900 XT that's a whopping 10% faster than the 6950 XT.
@kornelobajdin5889 · 1 year ago
@Tripnotik25 Four more gigs, 10% more performance, and 100-150 bucks more expensive. Well, pick which one is better.
@commandertoothpick8284 · 1 year ago
It ain't just about length. Width matters too.
@weirddylan7721 · 1 year ago
You said it.
@puffyips · 1 year ago
More so depending on the shorty.
@arc00ta · 1 year ago
GIRTH is WORTH
@megadeth8592 · 1 year ago
;)
@iseeu-fp9po · 1 year ago
. . . ;
@mattheww729 · 1 year ago
I miss the days when VRAM usage correlated with how good the game looked, instead of how lacklustre the code is.
@andersbjorkman8666 · 1 year ago
When running everything in RDR2 on max settings, it looks amazing and doesn't go over 7GB of VRAM ^^ And some games that look inferior in every way hog VRAM like crazy. Lazy and incompetent developers when it comes to optimization.
@mattheww729 · 1 year ago
@andersbjorkman8666 I agree with you there. RE4 doesn't look substantially better to me than games I ran on my 2GB 960, yet it uses 10GB.
@__-fi6xg · 1 year ago
What do you expect? Nvidia funded Cyberpunk...
@NyangisKhan · 1 year ago
@andersbjorkman8666 You don't even have to compare trash ports to RDR2. Just try comparing them to "The Vanishing of Ethan Carter": it'll probably look better than the latest Resident Evil but will run on a GTX 650 just fine. Modern PC ports are a joke, and the guys who port them are clowns.
@countdespin66 · 1 year ago
@mattheww729 RE4 has very high-res textures all over the place on the two highest texture detail settings.
@michaellvoltare · 1 year ago
That 4070 power draw is crazy efficient. But no way I'm giving my money to Nvidia.
@bookbinder66 · 1 year ago
As a 1060 owner, I am.
@Tripnotik25 · 1 year ago
Even crazier is that the supposed "Blackwell" Nvidia 5xxx gets another shrink in 2024; the 5090 is to be a 300W TDP card (vs. the 450W 4090), and the 5070 will be a 120W card.
@Cristianxf1 · 1 year ago
Ngreedia
@HeLithium · 1 year ago
@Tripnotik25 Possibly a single-fan 5070?
@Tripnotik25 · 1 year ago
@HeLithium Cards around 160W and lower tend to have single-fan models: 1060, 1070, 2060, 3060. As the shrink will be another power-efficiency jump over current Lovelace, the 5070 will 99% have tiny single-fan cards for ITX builds.
@rarigate · 1 year ago
Remember, this was supposed to be the 4070 Ti.
@IsraelSocial · 1 year ago
This is a 4060 Ti.
@BlackJesus8463 · 1 year ago
@IsraelSocial Lolz, they're using Ti as a whole-ass performance tier instead of added value.
@math3capanema · 1 year ago
@IsraelSocial No, it's supposed to be a 4070 Ti, and the 4070 Ti is supposed to be the 4080 12GB.
@Some-person-dot-dot-dot · 1 year ago
@alexsmirnoveu Deprived? They didn't "deprive" us of anything, lmao. They messed up big time this generation, though.
@alexsmirnoveu · 1 year ago
@Some-person-dot-dot-dot By "deprive" I meant that we won't get the GPU that was originally supposed to be the 4070 Ti. I agree that Nvidia messed up. My point was that the 4070 isn't the original 4070 Ti.
@nicoshah6288 · 1 year ago
This is such a cool and informative video. I have a 3080 12GB and there aren't many quality videos like this. Keep doing what you are doing 😊
@merlingt1 · 1 year ago
Thank you for doing this comparison. I have a 3080 12GB and all reviewers seem to pretend it doesn't exist.
@davepianist84 · 1 year ago
Tbh, not many people got it.
@lukasr1166 · 1 year ago
Because it was overpriced, still is overpriced, and I doubt it had much supply.
@vallin411 · 1 year ago
I am glad I got my hands on a brand-new PNY 3080 12GB for 7490 SEK at Black Friday last year; that's about equivalent to $540 US. It was a big upgrade from a 1070, and seeing the price/performance of the 40 series, I do not regret it one bit. EDIT: Since a lot of people comment on it: the price comparison is with VAT, import fees, and other add-ons included in the Swedish price. Generally you can take the USD price times 14 today to get a rough estimate of Swedish retail prices.
@farbenseher2238 · 1 year ago
You need a beefy cooler though. 375W of power consumption is no joke.
@mickaelsflow6774 · 1 year ago
That's about the price of a 4070 at Webhallen though, so same same. I'm curious: how does the PNY perform for you? Build quality and noise, mostly.
@oxfordsparky · 1 year ago
Having owned a 40-series card for a few months, I'd take a 4070 over a 3080 12GB every single time. Very similar base performance, to the point where you wouldn't notice in reality, but the option of FG plus using way less power makes it a no-brainer if they are both available at the same time.
@Mattiedamacdaddy · 1 year ago
Feels bad; my dad bought a Zotac 3080 for $1200 last year, and they've dropped to like $500 used. Oof.
@kaimojepaslt · 1 year ago
@farbenseher2238 He enjoys a beefy power bill too, lmao.
@bliglum · 1 year ago
I miss the days when a 70-range card delivered top-tier 80 Ti or Titan level performance for a mid-range price. Now we get barely 80-range performance for a top-shelf price.
@poison7512 · 1 year ago
To be fair that didn't really happen until Maxwell.
@Noob._gamer · 1 year ago
Go get a PlayStation 5 instead of this box with crap stuff inside and RGB lights, which cannot get 30 fps at 4K and costs prices like $1000s.
@rogoznicafc9672 · 1 year ago
@Noob._gamer No, because I pirate everything, so it's still cheaper xD (allegedly, tho)
@Chironex_Fleckeri · 1 year ago
@poison7512 Yes, and it's glossed over. The 8800 GTX and that architecture were Nvidia's big break; I remember people running SLI 8800 GTXs to run Crysis (1). But it wasn't until after Fermi that Nvidia really started making big leaps. Maxwell and Pascal were the culmination of many years of R&D; from 2012-2015, Nvidia didn't really have, in hindsight, good product lineups in terms of price to performance.

The issue is that Nvidia has gone for margin. It's likely Nvidia is preparing their business for integration into things like national security; SpaceX is another tech company doing the same. Microsoft too. AMD seems to be more business as usual in the graphics segment, with their long-term contracts with customers like Sony, Microsoft, and all the APU devices that have AMD inside. While their footprint is smaller than Nvidia's, it just seems like Nvidia doesn't really need consumer confidence in their historical revenue driver. They're acting that way, and investors are aware. But Nvidia will flip right back if mining takes off again.

Long term, they don't want to make a 1070 or 1080 Ti type of product available anymore. They're crunching their product mix in a way that resembles smartphones: the graphics card as a two-year "can you afford it" check. Not so nice, but Nvidia has business opportunities that make them not need us as much. Perhaps there is some technology coming (chiplets? SoCs? The client device becoming a streaming platform for gaming?). I don't know what they're working on, but I don't think Intel would've done Arc if Nvidia had continued making everything a Pascal and spacing out time between generations. That would be friendly to consumers but bad for Nvidia. It's just cold business, man.
@Peppeppukii · 1 year ago
@Noob._gamer We all know that consoles are generally cheaper, but they're not so great in some games, like RTS or competitive FPS... Also, you can't really set up a high-res music player on one to your liking, and lots more...
@LeonardPutra · 1 year ago
Recently I got a secondhand 3080 12GB with 2.5 years of warranty left for ~$400. One of the best purchases for my PC.
@SebasztianCH · 1 year ago
Damn, where did you buy it, bro? The green store?
@oxfordsparky · 1 year ago
Unless you have the original receipt, you don't have any warranty at all.
@CoCo.-_- · 1 year ago
Try undervolting; Ampere undervolts so well, tbh. My 3090 performs the same as a 3090 Ti in games while consuming 320W at most with RT and such, usually around 300W or a little less without it (stock, it was 390W with lower clocks due to power limits). Doing an OC and undervolt at the same time is really nice; my performance seems way more stable as well. I could probably match stock performance while consuming 280W or a little under, but I want that extra 5-8% increase in perf.
@LeonardPutra · 1 year ago
@CoCo.-_- Yep, mine got really nice silicon: 800mV at 1800MHz, or "eco" mode at 750mV 1590MHz. 220 watts in 1440p Hogwarts Legacy with a nice framerate.
@LeonardPutra · 1 year ago
@SebasztianCH Yes, the green store. 3-year NJT warranty.
@pascaldifolco4611 · 1 year ago
The 4070 is 95% of the 3080 but with 40% LESS power consumption (200W vs. 340W), which is quite insane.
@danmckinney3589 · 1 year ago
It's a typical Nvidia grift. They specced the bus for the 3080 way higher than the memory could actually utilize, for better paper numbers that only give you a higher electric bill in practice. The next gen, they cut down the bus and drop the tier to show a paper efficiency improvement on what is effectively the same product. I've lost count of how many times I've seen Nvidia pull that play.
@onik7000 · 1 year ago
@danmckinney3589 200W to 340W is like $7-7.50 per month if you use it 24/7.
@hariyanuar8222 · 1 year ago
@onik7000 Is this a real calculation?
@noThing-wd6py · 1 year ago
@onik7000 That 140W difference, at 24/7 for 30 days, is ~€45 per month in Germany. We pay €0.45/kWh, but that ranges from €0.30/kWh to €0.60/kWh.
@danmckinney3589 · 1 year ago
@onik7000 At the average electric rate in the US of about $0.20/kWh, the 140-watt difference costs $0.03/h. Eight hours of gaming a day puts the additional cost at $7.20 a month. Let's say you were running at full tilt around the clock: that additional 140W would cost you an extra $260 a year. Now take that $260, divide by 24, then multiply by however many hours of gaming you do a day; that will give you your additional cost. Mind you, that's just the difference between the two. It's not including the $0.04/h you're spending on the first 200W, nor the extra $0.015/h you're paying because both of these GPUs actually operate 50W over TGP. In total, that would leave a 3080 gaming around the clock costing about $740 a year to operate.

Now, the actual point of my comment was to say that this is how Nvidia operates: they over-spec one generation beyond what the hardware can handle, then they down-spec the next gen and use the "efficiency increase" as an excuse to keep the price high, knowing full well that the energy saving won't even be noticeable to the average consumer. That's what makes it a grift.
@TRONiX404 · 1 year ago
They went from a 384-bit memory bus to 256-bit on the 4080, looking like another intentional bottleneck.
@BleedForTheWorld · 1 year ago
Cutting costs = more profit.
@xxovereyexx5019 · 1 year ago
Sometimes GPU architecture is much more important than bus width, etc.
@Zettoman · 1 year ago
They have to leave room for a Super/Ti version.
@niebuhr6197 · 1 year ago
Because the new-gen G6X at 256-bit offers comparable bandwidth to 1st-gen G6X at 320-bit, using a third of the power, plus 10 times more on-die cache. Nephews are hilarious.
@ians_big_fat_cock5913 · 1 year ago
Bus width doesn't matter as much as total bandwidth. The design of the memory itself will matter more in many cases.
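That claim is easy to check, since peak bandwidth is just the bus width times the per-pin data rate. A quick sketch using the commonly cited reference specs; what the raw numbers leave out is Ada's much larger L2 cache, which reduces how often the GPU has to touch VRAM at all:

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
# Bus widths and data rates below are the commonly cited reference specs.
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(320, 19.0))  # 3080 10GB, 1st-gen GDDR6X: 760.0 GB/s
print(bandwidth_gb_s(256, 22.4))  # 4080, newer GDDR6X:        716.8 GB/s
print(bandwidth_gb_s(384, 19.0))  # 3080 12GB:                 912.0 GB/s
print(bandwidth_gb_s(192, 21.0))  # 4070:                      504.0 GB/s
```

So the 4080's narrower bus really does land within a few percent of the 3080 10GB's raw bandwidth, while the 4070's 192-bit bus is a much bigger cut relative to the 3080 12GB.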
@91plm · 1 year ago
This is more confirmation that the RTX 4070 should have been a 4060 Ti at most, delivering 10% worse performance than an RTX 3080 12GB in most scenarios.
@raulitrump460 · 1 year ago
A 4060, because the 3060 Ti was faster than the 2080 Super.
@pdmerritt · 1 year ago
That was NEVER going to happen... think about it... we had the 4080 12GB before it became the 4070 Ti, so this card was ALWAYS slotted for the 4070 position, no matter whether it was specced more like a 4060 Ti or not.
@andreabriganti5113 · 1 year ago
With those specs and that performance, this is actually the 4060 Ti and should be priced below $499.
@IsraelSocial · 1 year ago
It was a 4060 Ti, and the 4070 Ti is the real 4070, but they couldn't justify the price rise! Imagine buying the 4070 for $800 😂 That's what is actually happening, but everybody thinks they are buying the Ti version.
@pdmerritt · 1 year ago
@andreabriganti5113 Oh, I agree it should be $499 or below, but like it or not, this is definitely what we got as a 4070 this time around. Nvidia got WAYYYY greedy!
@mathesar · 1 year ago
It's actually kind of crazy how many anti-consumer moves Nvidia has pulled with the 40-series cards. I'm gonna hold off until the RTX 50 series and hope things get better, especially with VRAM. Also, I recently learned they're going to release a 4060 Ti 16GB. What a mess, lol.
@Hexenkind1 · 1 year ago
I would not be surprised if they also release another 4080 with more VRAM again.
@retrofizz727 · 1 year ago
Where did you see this information?
@Elleozito · 1 year ago
Same. I was using a 1660 Ti and traded my 1660 Ti plus some money for a 3060 Ti. I'll use it until, idk, the 5070 Ti comes out, cuz this generation was dogshit. Without the new Nvidia "frame generation" tech, literally nothing changed... but they increased the price, said Moore's law is dead, and also, as you said: VRAM? Bus width? Like, hello Nvidia? Are we stuck in 2018?
@memoli801 · 1 year ago
You know there are other companies out there?
@retrofizz727 · 1 year ago
@memoli801 No.
@Birawster · 1 year ago
The most noticeable difference between the 30 and 40 series is the power draw; you can have the same fps performance with less power usage, which is pretty good considering your monthly electric bill.
@Bdot888 · 1 year ago
That's the main reason I went for a 4070 Ti. It pulls a little over half the power of the 3080, and temps are a lot better. That was the main selling point overall for me.
@Noob._gamer · 1 year ago
The performance difference is huge compared to the 30 series. You are comparing a 70-class card to an 80-class card of the last generation. When you compare the 4080 to the 3080, it's a big jump in performance, and the 4070 Ti equals last gen's most powerful GPU, the 3090. If you compare the 4090 to the 3090, it is also a big jump. AMD's 7900 XTX also performs similarly to the 4080, but prices are higher for that performance; if it came at the same prices as the 30 series, it would sell like cupcakes.
@lethanhtung4972 · 1 year ago
@Noob._gamer Please remember that the 3070 = 2080 Ti; the 4070 should be at least on the 3090's level, not the 3080's.
@Noob._gamer · 1 year ago
@lethanhtung4972 I agree.
@robertcarhiboux3164 · 1 year ago
Now we wait for a 150-watt card at a price of $150-200 that can max out any title at 1080p, just like it used to be.
@soppingclam · 1 year ago
The 12GB 3080 was a significant upgrade from the 10GB version. Lots of differences.
@rohanb2711 · 1 year ago
It's gonna age way, way better than the 10GB model.
@Kage0No0Tenshi · 1 year ago
It's exactly the same; the only diff is the VRAM, and maybe it's pushed higher on stock clocks.
@WayStedYou · 1 year ago
@Kage0No0Tenshi Extra bus width, 2GB extra VRAM, more cores and SMs.
@IsraelSocial · 1 year ago
5% for 17% more money? No thanks.
@nicane-9966 · 1 year ago
@Jesus is Lord Wrong. It has the same wide bus as the 3090; that's why it literally performs toe to toe with the 3080 Ti...
@maag78 · 1 year ago
I think this looks great. I paid $900 for my ROG Strix 3080 10GB, and that was $100 less than MSRP. Yes, that's the most expensive model, but wasn't the cheapest 3080 $700? I really don't get people's issue with the 4070. I'm getting great performance with it at 1440p, it sits at around 180 watts, and it's cool as a cucumber and extremely quiet.
@pt6998 · 1 year ago
Saving now for Blackwell next year. Getting a card with at least 16GB of VRAM on it; not making the same mistake I did with my 3070. Hopefully by then, competition from AMD and Intel brings Nvidia back to reality on pricing.
@Battleneter · 1 year ago
The 3070 was always suspect right from the start; current consoles can spare up to 10GB (out of 16GB) of shared memory for VRAM after the game data and OS. Around 10GB is basically the upper target for game developers for 4K and below. Excluding the odd fringe case, 10 and 12GB will be fine for the next 3-5 years; it's not going to be the same situation as 8GB.
@YashPatel97335 · 1 year ago
@Battleneter You're confusing things by counting VRAM the same way on consoles and PCs. Remember, a console has only one purpose, playing games, and developers will definitely optimize their game so it runs perfectly on console. A PC, on the other hand, does more than just gaming, so it works differently; 10GB of VRAM on a console can't be considered the same as 10GB of VRAM on a PC. A PC might need more VRAM to perform the same as a console, and that is why game developers are not bothered by VRAM consumption on PCs, and PC ports are launching without proper optimization. Judging by the current trend of AAA releases, we could get games that eat more than 12-16GB of VRAM on PC in THIS generation alone, and that consumption will increase with newer generations of consoles. So yeah, if you have the budget, go for higher VRAM and native performance rather than software like DLSS and FSR; that choice will last you longer. So 10GB of console VRAM doesn't equal 10GB of PC VRAM.
@MegaLoquendo2000 · 1 year ago
@YashPatel97335 The situation's so bad that Jedi Survivor has a native sub-720p resolution on PS5. I wouldn't be surprised if current-gen consoles (PS5 and XSX) end up using the equivalent of low settings before the generation ends.
@demetter7936 · 1 year ago
Same as me. I'm putting aside £3000 so I can completely upgrade my PC next year; I'll sell my current parts to get back roughly half the cost.
@Battleneter · 1 year ago
@demetter7936 You can argue old consoles also have value, but again, consoles are just toys. It's like comparing a screwdriver (consoles) to a Swiss army knife (PC): sure, both will play games, but the PC does a crap ton more.
@blackcaesar8387 · 1 year ago
Now we know for sure that frame gen will never come to the 30 series. It's literally the only thing selling the 4070 now. I imagine that will be the same for every lower-tier card still to come.
@butcherrengar3222 · 11 months ago
Completely underrated; your breakdowns and even the tests you run are perfect. Great job.
@YetMoreCupsOfTea · 1 year ago
I recently picked up a used RX 6800 non-XT, which after some mild overclocking is performing in 3080 territory (ray tracing off, of course). It only cost me around 300 USD.
@Plasmacat91 · 1 year ago
Great content, brother. You have quickly become one of the most influential voices in the community.
@karehaqt · 1 year ago
Tbh, the only thing I see as a big difference is the power draw. DLSS 3 isn't a big thing for me, as I never turn it on; it just doesn't feel good. Personally, it feels janky as hell, so I only ever use DLSS for its image reconstruction, not the frame gen.
@Greez1337 · 1 year ago
Wow, you're the only person I've seen remark on the jankiness of DLSS 3. A lot of people think it's gonna save them from 40fps stutterfest experiences in the AAA slop released now, instead of exacerbating the input delay relative to the smoothed frame rate they see.
@CoCo.-_- · 1 year ago
@Greez1337 Most of the people using it are 4090 owners playing at 4K, or 4070 to 4080 owners playing mostly 1080p and some 1440p, with the base fps being at least around 90. It's not surprising they don't notice the jank from it; I bet they would if the base fps was 60 or lower, though. Honestly, it just reminds me of how motion blur makes it look like more frames xD
@karehaqt · 1 year ago
@@Greez1337 For me it feels the same as turning on motion interpolation on a modern TV.
@sethdunn96 · 1 year ago
It's interesting to see Nvidia do this. Going from a 1080 to a 2070, things were on par, with maybe a 3-5% edge in favor of the 2070; then the 3060 was on par with the 2070, with about a 3-5% boost in favor of the 3060. And now you would expect the 4070 to be on par with the 3080, with a slight 3-5% advantage, but it's actually at a 3-5% disadvantage. Very sad.
@necro4468 · 1 year ago
My boy just got a 4070 build from a sponsor and was like, "I have the best PC in the group now." I have a 3080 12GB and said, nah, you're just up here with the big dogs.
@rakon8496 · 1 year ago
To answer your request about wanted content: I would appreciate more educational testing setups that investigate behavior like the bus width/bandwidth in this video, and awareness of how easily you can leave performance on the table or even screw up your experience in the Nvidia (AMD) control panel... basically an evergreen topic for a changing audience. E.g., I would appreciate videos explaining single settings in depth and testing their performance/experience implications in several different game engines, alongside the usual comparative testing. That could add to your benchmarking, imo. Thx 💙
@rohanchooramun7288 · 1 year ago
What's the point of having supposedly better and more powerful RT cores when you then decrease the number of RT cores on the RTX 4070?
@Impossibly-Possible · 5 months ago
Because the 70 series is really the 60 series, and so on. If you look at it like that, you understand the performance. They pushed the slider over by changing the names on everything: the 90 is the 80, the 80 is the 70, the 70 is the 60, and the 60 is the 50. Prices went up for what would have been the real product stack, and then they slid the names over, making profits go up many, many times. Selling a 60-series card that should have been $250 as a 70-series card for $600 is an INSANE markup in price to performance.
@hakdergaming · 1 year ago
It's the nerfed CUDA cores on the 4070 that cause non-RT rasterized performance to drop significantly, while they peddle frame generation that uses the SAME tensor cores as previous generations but softlock it to the 4000 series in drivers, while also giving you slightly better RT cores to polish the turd of the rest of the GPU.
@MaxIronsThird · 1 year ago
Nah, it's just the bus width. The 4070's RT cores are better than the 3080's, so even with FEWER RT cores it performs better; they don't matter in raster, though.
@hakdergaming · 1 year ago
@MaxIronsThird Sorry for the misunderstanding; when I mentioned they "nerfed" the CUDA cores, I meant they reduced the CUDA core count, with the RTX 4070 only having 5888 CUDA cores and the 3080 12GB having 8960. Yes, I know the efficiency of the 4070 is MUCH better, and even though the difference in CUDA cores is around 35%, it manages to perform only 8-18% worse in raster graphics while also consuming much less wattage. However, this drop in raster performance, especially for the price of the card, and the fact that Nvidia has historically released new 70-class cards that match an 80-class card of the last gen, makes this newer card a much worse deal when raw performance is considered.

However, if you need lower power consumption and energy in your area is rather costly, then the 4070 would make more sense. And frame generation is still a pretty decent technology, even though frame interpolation has been out for years; this tech is basically frame interpolation with some AI smoothing to prevent the image from looking unnatural. Which, again, uses the SAME EXACT tensor cores that every Nvidia card has been using for DLSS since the 2000 series.
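A rough sanity check of the core-count argument above, folding in the reference boost clocks (2475 MHz for the 4070, 1710 MHz for the 3080 12GB): raw shader throughput scales roughly with cores × clock, and the remaining deficit at 4K mostly tracks the bandwidth gap. A minimal sketch:

```python
# Sanity check: raw shader throughput scales roughly with cores * clock.
# Core counts are from the comment above; boost clocks are the reference specs.
cores_4070, mhz_4070 = 5888, 2475
cores_3080, mhz_3080 = 8960, 1710   # 3080 12GB

core_deficit = 1 - cores_4070 / cores_3080
throughput_ratio = (cores_4070 * mhz_4070) / (cores_3080 * mhz_3080)

print(f"{core_deficit:.0%}")       # 34% fewer cores...
print(f"{throughput_ratio:.0%}")   # ...but ~95% of the shader throughput, thanks to clocks
```

So the clock speed alone recovers most of the missing cores; the observed 8-18% raster gap at higher resolutions lines up better with the 504 vs. 912 GB/s bandwidth difference than with the core counts.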
@frankguy6843 · 1 year ago
Grabbed a barely used 3080 12GB last year and have not regretted it at all. The recent drama around cards being limited to 8GB is of no concern, and the 40 series has nothing compelling in comparison. Not that Nvidia couldn't produce something compelling, but they chose not to, and here we are.
@justlamb · 1 year ago
In the UK, the 3080 12GB costs £100 more than the 4070, uses double the power, and has no DLSS 3 features. But good comparison.
@Underground.Rabbit · 1 year ago
The 3080, 3080 Ti, 3090, and 3090 Ti had like a 5% difference between each other when overclocked a bit, unless the game required massive amounts of VRAM, of course.
@Kage0No0Tenshi · 1 year ago
Imagine, four years later, the RTX 4070 Ti and 3080 Ti and below can't run AAA games because of low VRAM. I would pick an RTX 3090 any day over a 4070 Ti, or even an RX 6950 XT.
@WASD-MVME · 1 year ago
Wrong
@CoCo.-_- · 1 year ago
@WASD-MVME Yeah, I think the perf difference at, let's say, 4K between the 3080 and 3080 Ti is at least 15%, while each step up after that is 5%, depending on the model.
@CoCo.-_- · 1 year ago
@InnerFury666 The 3090 and 3090 Ti are within a few percent of each other, depending on the 3090 model; mine runs about on par with one, for example. Here is a video: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-NZACJFQlJK8.html. You can achieve the same result with an undervolt (depending on the lottery), as out of all of Ampere, the 3090 is the most power-limited. If the clocks are on par between the 3090 and 3090 Ti, they are within 2-3% of each other, but FE vs. FE, due to the higher power limit on the 3090 Ti, it's about 10%, yeah.
@TheSlowEcoboost · 1 year ago
Somehow my 4070 uses even fewer watts than my 3060 Ti, and when you undervolt it a tiny bit, the card just becomes the best choice for older power supplies or power bills.
@andersbjorkman8666 · 1 year ago
I got a 4070 for a good price in the EU and did some math; I'm saving about 40 dollars a year compared to if I'd gotten a 6950 XT :P
@jesusbarrera6916 · 1 year ago
If you can afford a 4070, you can afford a better PSU.
@TheSlowEcoboost · 1 year ago
@jesusbarrera6916 It cuts down costs to keep using an old one.
@froznfire9531 · 1 year ago
@jesusbarrera6916 I mean, of course you can, but say $100 for a PSU plus the extra power a 3080 will cost you over the years; this will stack up. It only makes sense at a very big discount, in my eyes.
@LeonardPutra · 1 year ago
You should test the undervolting capability of your 3080 12GB. Usually it runs fine at 800-850mV with a 1750-1920MHz GPU clock. At 850mV and 1920MHz, it's pretty much at stock frequency, with much lower power consumption of 240-280 watts. But then, the 4070 undervolts really well too; I've seen it undervolted to 140-150 watts while maintaining pretty much the same performance. That's GTX 1070 level!
@Checkout17 · 1 year ago
Indeed, I have a ROG Strix RTX 3080 12GB (OC version) and run it at 1900MHz with 900mV; it runs cooler and stays between 250-280 watts. That's almost 100 watts less power consumption without losing much performance. The default clock of this card is 1860MHz. Also, this card has the same chip as the 3080 Ti/3090, and the same cooler block as the 3090 (for the ASUS ROG Strix version). They had to do this, of course :-D
@Krenisphia · 1 year ago
I undervolted my 4090 to 825mV @ 2400MHz. Power consumption dropped to 220-250W for all my games; that's a pretty big drop from 400W stock, so I'm very happy. Temps-wise, the GPU core stays around 40 degrees, and VRAM/hotspot gets to 50 degrees; it never even touches 60. The best part is the performance loss was very small, less than 10%, which isn't even noticeable for me. :)
@LeonardPutra · 1 year ago
@Krenisphia That's cool! (Pun intended.) That really shows the power of TSMC's 4nm process node!
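For anyone who wants to script this rather than use a GUI: true voltage/frequency-curve undervolting like the settings above is done in tools such as MSI Afterburner, but a plain power cap via nvidia-smi gets a similar net effect. A hedged sketch; the 250 W figure is only an example, and setting the limit requires admin rights:

```python
# Not true curve undervolting (that is done per-point in MSI Afterburner or
# similar); nvidia-smi can only cap the board power, which has a similar
# net effect. Needs admin rights; 250 W is an example value, not advice.
import subprocess

def set_power_cap(gpu: int, watts: int) -> None:
    # Sets the software power limit for the given GPU index.
    subprocess.run(["nvidia-smi", "-i", str(gpu), "-pl", str(watts)], check=True)

def read_power_and_clock(gpu: int) -> str:
    # Queries current power draw and graphics clock as a CSV line.
    result = subprocess.run(
        ["nvidia-smi", "-i", str(gpu),
         "--query-gpu=power.draw,clocks.gr", "--format=csv,noheader"],
        capture_output=True, text=True, check=True)
    return result.stdout.strip()

set_power_cap(0, 250)            # cap GPU 0 at 250 W
print(read_power_and_clock(0))   # e.g. "243.10 W, 1815 MHz"
```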
@MrIsmaeltaleb85 · 1 year ago
Love my 3080 12GB, Suprim X model. Very good silicon in mine; I've been running it since September 2022 at 1980MHz @ 0.875V, with +1200 on memory. Pushed to its limits at 430W, it can hold 2295MHz in Unigine Heaven at 4K.
@xxovereyexx5019 · 1 year ago
The 3000 series is very power hungry; the 4070 only needs around 200 watts.
@Kage0No0Tenshi · 1 year ago
Nice undervolt for 1980MHz. I stick to stock voltage, cap it to stock, and was able to overclock my RTX 3070 MSI Gaming X Trio to a stable 2115MHz. Sadly, Nvidia caps the wattage at 255W; I cannot pass 260W whatever I do. Unigine Heaven is pretty dated nowadays; I'd recommend at least Fire Strike, or even Time Spy. I can do a stable 2200MHz-something in Unigine Heaven too, but it does not stress the GPU. Lucky you that your GPU can pull 430W; mine is stuck at 255W and below.
@Kage0No0Tenshi · 1 year ago
The 4070 takes about 50W less than the RTX 3070. Not a big deal, and the 4070 has really bad overclocking potential vs. the 4080 and 4090.
@franks8127 · 1 year ago
I have a 3080 12GB that takes an undervolt at 835mV and 1875MHz, plus 500 on the memory... it uses about 280W and is really stable... it's a Gigabyte OC card.
@MrIsmaeltaleb85 · 1 year ago
@xXOverEyeXx I have four profiles in MSI Afterburner. If I run the efficient mode (for my kids), 0.825V @ 1860MHz with stock memory, I can keep it around 200W in most games; not far off the efficiency of a 4070, with about the same performance. Mind you, you can also undervolt a 4070.
@siyzerix · 1 year ago
So now we're just paying for software upgrades, basically, and reduced power draw (which is something you EXPECT from a new gen of GPUs; it's like saying the new gen will be faster). So now we're paying for software upgrades. Brilliant. Keep defending it, guys. You're soon going to see performance uplifts become a thing of the past.
@raulitrump460 · 1 year ago
Power isn't really reduced; this 4070 is a 4060 at 190-200W. The 3060 12GB is 170W.
@niebuhr6197 · 1 year ago
@raulitrump460 This GPU has 46 SMs; it barely consumes more than a 28-SM card, yet it's 90% faster. A 4060, lmao.
@HosakaBlood · 1 year ago
I mean, this is not supposed to replace a 3080 12GB for its owner. It's still an uplift for the buyers they are targeting, which is the 3070 and below. There's a reason the 4090 has a huge gap this gen: maybe because every single reviewer talked crap about the 3090 being a scam for being 15% faster than a 3080 at 2x the price. Maybe that's why Nvidia nerfed the lower tiers this gen, so reviewers stop talking shit. But guess what, the 7900 XTX was barely an uplift for the same price, so who is to blame here?
@siyzerix · 1 year ago
@HosakaBlood Nvidia and AMD are just colluding at this point. The xx60 class generally has performance close to the last-gen xx80 class GPU. This "RTX 4070" is that; it's basically what the 4060 Ti should be at minimum. So what Nvidia is doing is selling us, at best, a 4060 Ti for $600. It's a pathetic performance jump; I mean, it's barely 15% faster than the RTX 3070 Ti. And the VRAM has been long overdue: we should be getting 16GB at this price at minimum, and realistically 4070 Ti performance at minimum for what is supposed to be a 4070.
@siyzerix · 1 year ago
@raulitrump460 Yeah, that's pretty much true. At best it's a 4060 Ti.
@Horendus123 · 1 year ago
Great video. My suggestion for another video: 3080 10GB vs 3080 12GB on current-gen titles; has the extra 2GB been of much benefit?
@cbdeakin · 1 year ago
Yes, it would be interesting to see how much the extra 2GB of VRAM helps at 1440p/4K, particularly in demanding games like The Witcher 3 next-gen.
@adaeptzulander2928 · 5 months ago
It's not just the extra 2GB; it's the bus size increase from 320-bit to 384-bit.
@IStrikerXLI · 1 year ago
FYI, the replay feature in Fortnite has options to set and lock the time of day. Could be helpful in eliminating variance between runs.
@johndelabretonne2373 · 1 year ago
Daniel, as usual, I completely agree with your assessment! I'll wait to see what kind of price/performance AMD gives us at 7700 & 7800 levels. I'd probably be most inclined to get a 7900 XT if it were at or under $700...
@juanpaulocabugsa771 · 1 year ago
Bought an RTX 3080 two weeks ago for 450 USD. Prices for the RTX 4070 here in Japan are around 740 USD, which is way too damn high. Thanks for this video 😊
@nipa5961 · 1 year ago
Luckily, new 6800 XT and 6950 XT cards are still available. Nvidia is not an option at these prices right now.
@nr1771 · 1 year ago
Depends what you want to use the card for. Nvidia's drivers are still better than AMD's for a lot of things.
@nipa5961 · 1 year ago
@nr1771 AMD's drivers are also still better than Nvidia's for other things. It's not worth paying a few hundred more for Nvidia cards.
@nr1771 · 1 year ago
@nipa5961 AMD's drivers are not better than Nvidia's for anything. For a few things they are about as good, but if you use your card for anything besides standard raster gaming, you will start to appreciate just how far behind AMD's software is compared to Nvidia's. In pure raster they've improved, but they still have issues like massive idle power draw and dysfunctional multi-display support. To take just one example of the many software issues AMD still has, just talk to all the people who bought a 7900 XTX to use for VR and then returned it because it either A) refused to recognize their headset, B) crashed upon launching any VR platform, or C) performed worse than a 3070.
@nipa5961 · 1 year ago
@nr1771 I've had very different experiences, it seems. AMD drivers were much more stable for me than Nvidia's. Also, a friend just "upgraded" to a 3060 and now has massive problems with his monitors not waking up properly, and he's forced to blindly restart his machine a few times a day. Just one example. Nvidia also has no equivalent to Radeon Chill. Speaking of power consumption, RDNA was way more power efficient than Ampere; strange how nobody seemed to care last gen, but everyone brings it up since last fall. XD They both have their pros and cons, but in the end they are very equal. So again, sadly AMD is the only option right now, since all equivalent Nvidia cards are much more expensive and lack VRAM.
@nr1771 · 1 year ago
@nipa5961 I'm glad you've had a good experience with AMD's drivers, but it's definitely not shared by a lot of people out there. All you have to do is look at any AMD forum and you'll see a lot of people still talking about driver issues. And I'm not saying this to fanboy for Nvidia; I hate the way they've priced this generation of cards. I'm just saying that if you care about anything other than standard raster gaming, AMD still has a lot of issues (if all you care about is standard raster gaming, AMD is the better value and you should buy AMD).
@jetkat6935 · 1 year ago
For some reason the 4070 footage looks laggy and riddled with stutters, even though the frame times are very similar. Is something wrong with the recording of the 4070 results?
@rileyhance318 · 1 year ago
The 4070 is doing all of that at 180W vs. 340W on the 3080; glad to see some effort is going into efficiency. Nevertheless, I will be keeping my 3080 12GB.
@BlackJesus8463 · 1 year ago
Still a $600 1440p card, though. I'd rather have the power to run 4K, TBH.
@MaxIronsThird · 1 year ago
Not effort, just a better node: TSMC's N4 vs. Samsung's 8nm.
@rileyhance318 · 1 year ago
@BlackJesus8463 The 3080 was a $1300-2000 card for over 50% of its life. $600 is not bad.
@NostalgicMem0ries · 1 year ago
@BlackJesus8463 1440p and 4K differ very little, and a monitor/TV upgrade costs a lot. Not to mention the 4070 can run most games at 4K 60fps; for 4K 120/144Hz gaming you will need a 4090, which is super expensive, and the 7900 XTX can match that too.
@BlackJesus8463 · 1 year ago
@NostalgicMem0ries Not really. 1440p monitors are often more expensive than 4K TVs, especially OLED. Everything else you said is BS too.
@xTurtleOW · 1 year ago
A bigger price than a used 3080 and less than 3080 performance. Very nice.
@mr.cookie8265 · 1 year ago
Well, it uses about half the amount of electricity. The 4000 cards are super efficient.
@victorxify1 · 1 year ago
@mr.cookie8265 Wow, you can save $6 a year on your power bill. Thanks, Nvidia.
@whiteerdydude · 1 year ago
@mr.cookie8265 They'd better be; they're on a significantly more efficient transistor node. The fact that this gen's 4070 can't consistently beat last gen's 3080 (the 12GB, but all 3080s should have been this for the price) in raw performance is really sad. And to top it off, Nshitia wants $100 more for this card than last gen's 3070. It sucks. If this card matched the 3080 Ti the way the 3070 matched the 2080 Ti, this wouldn't even be a discussion.
@fafski1199 · 1 year ago
Like anything used, it's always a risky gamble. Most of those used 3080s listed on eBay will no doubt be ex-mining cards or will have had a fair amount of wear and tear. Having the safeguard and peace of mind of a 3-year warranty is, in itself, worth paying an extra $100 at their price range. Secondhand will always be cheaper, with the potential of getting a better bargain; however, there are always several risks and cons involved. You could find yourself two months later with a dead GPU, no way to get a refund, and $500 out of pocket.
@mr.cookie8265 · 1 year ago
@victorxify1 The 4000 series cards are also a lot cheaper than the 3000 series cards (at least in Europe): the 4070 costs about €600, the 3080 10GB costs about €770, and the 3080 12GB costs about €2270.
@gramostv_official · 1 year ago
So the 3080 is outperforming the 4070 at 1440p and higher? Nvidia 🤣
@galacticdoge5996 · 1 year ago
Finally got the video I was looking for! Thanks.
@ogcryptowealth · 7 months ago
Nobody talks about the huge efficiency upgrade: the same performance as a card that was once selling for close to $1500, at a new, cheaper price of $600. Not only is the power cost lower, which pays for itself over time compared to the RTX 3080, but the lower temps on the newer architecture also mean the 4070 will likely have more endurance than the power-hungry 3080. Just a thought; I know it's not popular, because all people worry about is raw performance, but I value efficiency and endurance more than raw performance.
@several. · 1 year ago
DLSS and frame generation are going to be the saving grace of the 40 series. It's just insane tech, and theoretically it's only going to get better and more widely supported.
@Vespyr_ · 1 year ago
It's software. It's software they could push to previous generations to make them better if they wanted to, but they won't, and they lock it to their latest cards for profit. Unlike AMD, which lets even their oldest cards support their newest technology.
@froznfire9531 · 1 year ago
@Vespyr_ Let's see how good FSR 3 will be. Currently FSR 2 is clearly behind DLSS 2, so I wouldn't expect wonders. I like that AMD releases it open source, but if it works better on Nvidia, you just can't help it.
@Vespyr_ · 1 year ago
@froznfire9531 When a company patronizes a market in this manner, it is specifically targeting established customers of previous generations of their brand. New customers are unaffected by this exclusivity, until the next model releases. Nothing stops them from backporting this technology; they won't even do it by one cycle, just a few years apart. There is more to a company than performance. They did this with G-Sync too, until sales forced them to concede.
@lughor82 · 1 year ago
For everyone interested in why the bandwidth got smaller: VRAM is added to graphics cards as chips, and every GDDR6/GDDR6X/GDDR7 chip has 32 data pins to connect to, which is called a 32-bit-wide bus. A memory bus can be shared between two memory chips, but I don't think there are recent examples of such shared-bus systems (the 660 Ti would be an example). Usually you can simply multiply the 32-bit bandwidth of one chip by the number of chips and you will get the total memory bandwidth.
@lughor82 · 1 year ago
I have found the method used to give more VRAM without making the bus bigger: it is called clamshell mode. There is an x16 mode (two 16-bit channels) and an x8 mode (two 8-bit channels). The x8 mode is called clamshell mode, and it lets you fit double the VRAM without using a bigger memory controller. You can still use the VRAM chips in parallel, but you only get half the bandwidth per chip.
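Putting both of these comments together, the bus width follows mechanically from the chip count and mode. A small sketch; the card configurations in the comments are the commonly cited ones:

```python
# Each GDDR6/GDDR6X chip exposes a 32-bit channel, so bus width follows
# directly from the chip count; clamshell (x8) mode pairs two chips per
# channel, doubling capacity without widening the bus.
def bus_width_bits(chips: int, clamshell: bool = False) -> int:
    return chips * 32 // (2 if clamshell else 1)

print(bus_width_bits(6))                  # 192-bit: e.g. a 4070 (6 x 2GB)
print(bus_width_bits(12))                 # 384-bit: e.g. a 3080 12GB (12 x 1GB)
print(bus_width_bits(8, clamshell=True))  # 128-bit: e.g. a 4060 Ti 16GB (8 x 2GB paired)
```

This is also why the 4060 Ti 16GB mentioned earlier in the thread gets double the VRAM of the 8GB model with exactly the same 128-bit bus and bandwidth.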
@MateusSilva-ps6xr · 2 months ago
I'm here in Brazil considering buying a 4070. Here the prices are very high due to the conversion of the dollar into local currency and the inhumane taxes. Congratulations on this particular analysis. One more subscriber.
@deathwishsquish9142 · 1 year ago
I genuinely cannot understand people who call the 4070's 4K performance "not worthwhile at 4K." Like, this thing does better than a PS5 or Xbox Series X, unless you absolutely need 120fps... and even then, most games will get over 60fps at 4K or 1800p... like...
@ARobotIsMe · 1 year ago
Great work Owen!
@kalestra4198 · 1 year ago
Michael oweeeeen
@dwedj · 1 year ago
that power draw tho
@cks2020693 · 1 year ago
10C temp difference too
@emanuelacosta2541 · 1 year ago
Exactly, man, this generation is so efficient. I have an RX 6800 and it is a monster at low wattage, but this 4070 is something else.
@cks2020693 · 1 year ago
@emanuelacosta2541 There is a Chinese tech guy who undervolted a 4070 to 100W and took all the FANS OFF, and the highest temp was like 81C, while still maintaining 85%+ performance.
@FenrirAlter · 1 year ago
@cks2020693 You can do that with a 3060 too. Almost like the 4070 is a 4060.
@emanuelacosta2541 · 1 year ago
@cks2020693 Damn, now I want to do the same with my RX 6800. I need to try; that's incredible!
@MaxIronsThird · 1 year ago
Nvidia thought the big L2 cache would be way more performant than it is, or they just want the 4070 to be a 1440p card and not a 4K card. Even with better RT cores and more optical flow accelerators, the 4070 isn't able to match the 3080 12GB in RT mode. That's ridiculous.
@WilFitzz · 1 year ago
Have you considered doing any monitor reviews? I know you have a lot on your plate already, but it would be cool for someone to pick out some value monitor picks!
@Revanse · 1 year ago
There is only one GPU we can buy for 600 USD, and that's the 6950 XT: 20% quicker, 16GB of VRAM.
@DrearierSpider1 · 1 year ago
We all know the cache and memory bandwidth of the 4070 were meant to gimp its performance at 4K, so you'll be upsold to a higher-tier GPU.
@smgames9873 · 1 year ago
I got a Strix 3080 12GB last year and I'm really happy with it.
@quukskev4970 · 1 year ago
Well, the drivers for the 3000 series are somewhat final, while the 4000 drivers are still being tuned for all games, so it will likely change over time.
@tureebluh · 1 year ago
This is probably why Nvidia is giving the middle finger to gamers now, lol. A LOT of comments last generation were about the power draw; then they release a new generation with the same performance at half the power, and the top comment is "last generation's performance at today's prices. Good job Nvidia," lol... I think some gamers just like to complain.
@giucafelician · 1 year ago
The RTX 4070 delivers almost similar performance to the RTX 3080 despite consuming 120 watts less power... not "half the power"! But the 4070 is ready for planned obsolescence, because of its VRAM!
@megadeth8592 · 1 year ago
Ya, just keep ignoring all the shit Nvidia does.
@tureebluh · 1 year ago
Don't be disingenuous with the numbers... we can literally see them in the video. I haven't bought either generation because of Nvidia's shenanigans, so I'm not sure where you get that I'm "ignoring all the shit Nvidia does." I'm also mature enough to give them credit for fixing one of the biggest complaints I saw against the 30 series.
@michaelwoods7770 · 1 year ago
The fact that they are banking on tricks to sell cards is simply silly. They artificially limit RTX 30 cards so they don't get these tricks, 'cause they would lose all credibility at that point.
@davepianist84 · 1 year ago
I tested my 10GB 3080 vs. a 12GB 3080 Ti; RAM-wise, the 2GB didn't make a difference at all. For reference, RE4 Remake can be played with the 10GB card all maxed using the 2GB texture option, while the 12GB card only lets you use the 3GB option (it crashes at 4GB). That being said, it isn't relevant 🤷🏻‍♂️
@Fahad1999win · 1 year ago
Thanks for the information. Did you play at 4K 60 with RT, fully native?
@davepianist84 · 1 year ago
@Fahad1999win You're welcome. No, I don't play at 4K; I tested at 3440x1440, all maxed, with the textures set to 2GB.
@IOADESTOYER · 1 year ago
If you want to know the real prices/values of the 4000 series Nvidia GPUs, just scale each one down by one model lower:
RTX 4090 $1600 --> $1200
RTX 4080 $1200 --> $800
RTX 4070 Ti $850 --> $600
RTX 4070 $600 --> $400
etc., etc. If you notice how the prices make perfect sense when you scale them down one model, you will see the pattern: Nvidia just scaled prices up by +1 model.
@privacyprivacy8206 · 2 months ago
A used 3080 is the perfect choice.
@christophermullins7163 · 1 year ago
This guy is going places. The gimped VRAM bus of the 40 series will go down in history as the worst change Nvidia ever made. We need more RAM AND the same number of RAM chips. The 384-bit bus will make the 3080 far better at demanding 4K raster.
@GewelReal · 1 year ago
And then you woke up and ran out of VRAM.
@zipperman6045 · 1 year ago
The thing is, you're right that the memory bus leads to it being faster at 4K; however, it's drawing 70% more power for 10-12% more performance, and this relatively small gap means that, as DLSS 3 becomes more prolific, it won't matter as much.
@lexiconprime7211 · 1 year ago
I have some doubts that a lot of people are looking at a 4070 for 4K gaming. Some might, but I doubt it's a significantly large number of folks.
@christophermullins7163 · 1 year ago
@GewelReal Yeah, 12GB isn't really enough anyway, but at least the 3080 12GB has a firm lead at 4K. I get the draw of Nvidia, but I'm going AMD next. Enough VRAM without selling body parts sounds good to me.
@lukasr1166 · 1 year ago
The 4070 Ti and below do suck at 4K, but I wouldn't recommend them for 4K gaming even if the specs were the same as the 30 series. The 4080/7900 XTX and up would probably be the best buy for 4K. That's by design, of course.
@technologicalelite8076 · 1 year ago
1:10 I like how the name of the card is now see through with your computer screen, some interesting innovation!
@CaptToilet · 1 year ago
As I said before, Nvidia just threw all their R&D into the 4090 and then said screw it to the other classes of cards. This 4070 should have been a 4060 Ti at best. Improvements to the tensor cores and RT cores are one thing, but that improvement means jack if the memory bandwidth can't keep up.
@pepoCD · 1 year ago
Man, such a weak -70 card. This is a 4060 Ti at best, and $150 overpriced. All 4000-series cards but the 4090 are complete disappointments.
@arditm2178 · 1 year ago
Hello Daniel. Is it possible to make some benchmarks with AI tools, like Stable Diffusion? 3080 vs 4070. I'm kinda new to AI, but it seems like Nvidia is the way to go.
@ladrok97 · 1 year ago
You can't go AI (i.e., image upscaling) on AMD. In image upscaling, something like 90% of models are on CUDA and only a small fraction are based on Vulkan.
@Suilujz · 1 year ago
@ladrok97 Saying you can't go AI on AMD is just a lie. Sure, it isn't as easy as CUDA, but there are DirectML and Vulkan implementations for a lot of projects. AI is also much more than "image upscaling."
@eqrmn3934 · 1 year ago
@Suilujz No, it's not worth getting an AMD card for AI. Too much work to get it running, slower speeds than Nvidia cards, and most tutorials on YouTube use software that supports Nvidia. It's a shame though, because AMD cards have more VRAM.
@ladrok97 · 1 year ago
@Suilujz Maybe it's a lie, but yesterday I wanted to test other upscalers to get 4x from 480x360, and the majority are locked to CUDA; brute-forcing it with a 6600 XT is pointless. I plan to upgrade to a 7900 XT (or maybe wait for an 8800); maybe then brute-forcing this limit will work. But if someone wants to use AI, then it's far easier to go with Nvidia than AMD, and I doubt the "at most 20% works on Vulkan" situation won't apply to most AI use cases.
@Suilujz · 1 year ago
@ladrok97 I got my 6950 XT today; I just upscaled a 512x512 image by 4x in two seconds with R-ESRGAN 4x+. I don't know which one you were trying to use, but there are perfectly working ones out there.
@hackmaster4953 · 1 year ago
1:13 Wow, it's a limited edition with a transparent box. Nice catch, Daniel ;D
@ocha-time · 1 year ago
Look at that power draw on the 3080. Christ.
@autoglobus · 1 year ago
The 3080 12GB doesn't just have more memory and the bus to support that memory; it also has more CUDA cores, more tensor cores, and a higher TDP. The 3080 12GB is just another in the long line of Nvidia examples where they named things the same even though most specs are different. It's the exact same thing they tried to pull with the 4080 12GB vs. 4080 16GB, but they backed off and eventually called it the 4070 Ti. And it's not the same as the 4060 Ti 8GB vs. 4060 Ti 16GB, where memory is the single difference between them, which is somewhat excusable.
@atomicfrijole7542 · 1 year ago
Yep. The 3080 12GB is a treasure.
@TheMadYetti · 1 year ago
The energy usage is VERY impressive; it's 2x the FPS per watt used. So at least one thing Nvidia did well.
@yeetus59 · 1 year ago
Except the problem is that it's basically impossible to find any 3080 12GB in existence. When you look up "3080 12gb" you get nothing but 3080 Ti results... and that's still quite a bit more money than the regular 3080.
@jayclarke777 · 1 year ago
I had a chance to get my hands on a 12GB 3080, but it would have been a dumb purchase considering I had a vanilla 3080.
@ketrub · 1 year ago
I managed to cop a 4070 a decent deal below MSRP (for my country; EU, so all prices are fucked, basically) on launch day, so combined with the lower energy usage and the fact that I don't use 4K, I think it was an okay option. I do agree with your conclusion though; I'm not very happy with Nvidia, but for what it's worth, the 4070 happened to align perfectly where I live.
@ChancySanah · 1 year ago
To be fair to the 4070 released at MSRP, those 3080s were 2-3 grand if you could find them. Pretty sure the MSRP for the 12GB was like 1200 bucks anyway.
@alpha007org · 1 year ago
In the EU, during the shortages, I frequently saw the 3090 for a lower price than the 3080 (Ti). Ridiculous, and too bad I didn't screenshot some: 3090 at 1800 EUR, 3080 Ti at 2000+ EUR. I don't get how the same store could sell a weaker GPU for more money with a straight face.
@zdspider6778 · 1 year ago
Those were scalper prices. There's a Tom's Hardware article from June 2022: "Grab an RTX 3080 12GB at its Lowest Ever Price of $769".
@alpha007org · 1 year ago
@zdspider6778 Sure, but in the same store? Like ComputerUniverse from Germany, which is quite a reputable company with decent pricing for the EU. That's why I said: too bad I didn't screenshot those listings.
@ChancySanah · 1 year ago
In general, comparing prices with last-gen parts that came out in mid-2020 and into 2021 isn't really fair, because MSRP was a fantasy, and around 2021 the chip shortage started kicking in, so it probably cost way more to R&D the 40 series or the 7000 series. A lot of things went wrong to get us here.
@CoCo.-_- · 1 year ago
@alpha007org Me: got my 3090 for £670...
@GiddyGoat · 1 year ago
It's wild how power efficient the RTX 4070 is compared to the RTX 3080.
@jeroendelaat6899 · 1 year ago
Hi Daniel! I just wanted to say thank you for all the videos. Six months ago I built my PC based on your tips and am still happy with the result. I'm commenting now, though, because I wonder how much of your audience, like myself, only watches you for a month or so, gets the information they need, and leaves straight after. That's why I figured an appreciation post was in order. Have a good one!
@theonerm2 · 2 months ago
The bigger L2 cache on the RTX 4070 makes up for the narrower bus width somewhat.
@HanSolo__ · 1 year ago
Frame generation is commonly despised.
@hastesoldat · 1 year ago
I don't care. To me it's probably the best feature to ever happen in the history of GPUs. I've been dreaming of the day it would finally be a thing for more than a decade, and I'm excited for it to interpolate several generated frames per real frame in the future. Thanks to it, I might experience ultra-high frame rates in my lifetime.
@DeadPhoenix86DP · 1 year ago
Too bad the 3080 would not have worked with my current PSU, so I went with the 4070 instead. I paid MSRP. Over my older GPU, I only use about 30 watts more, while having over double the performance and VRAM.
@kaimojepaslt · 1 year ago
And that's what smart people do; they don't have to dump double the power bill monthly.
@thatbritishgamer · 1 year ago
@kaimojepaslt Smart people don't buy a 4070, as they know it's a rip-off.
@Rodrigo38745 · 1 year ago
@thatbritishgamer If you want a new GPU for that price, what's the better solution then? Exactly.
@jesusbarrera6916
@jesusbarrera6916 1 year ago
@@Rodrigo38745 A used 6950 XT and a better PSU...
@Rodrigo38745
@Rodrigo38745 1 year ago
@@jesusbarrera6916 I said a new card. A lot of people, like me, don't want used cards. Also, I use my GPU for work, and Nvidia is miles ahead in most cases.
@Mattribute
@Mattribute 1 year ago
With the 3090 Ti, I'm left seeing the 4090 as the only meaningful upgrade. Guess I'll just hang out for now.
@ez45
@ez45 1 year ago
21:34 Actually, the black price means the listing didn't sell at all.
@marufulislam4311
@marufulislam4311 1 year ago
Makes me wonder why Nvidia didn't make the 4070 with a 384-bit bus. Evil Nvidia, limiting the performance of their own cards for no reason.
@biskwiq
@biskwiq 1 year ago
If you want a powerful low-profile GPU, go for the 4070, and you get frame generation on top. Look at those wattage figures; you can save a lot of energy for sure.
@mayssm
@mayssm 1 year ago
Meh. I'm as excited for the 4070 as I am for a dental exam.
@Gimpy17
@Gimpy17 1 year ago
A 4070 is about 4 times larger than a low-profile GPU.
@Billskins4dayz
@Billskins4dayz 1 year ago
Going from a 1650 Super to an ASUS TUF Gaming 3080 12GB; couldn't be happier with its performance.
@air21511
@air21511 1 year ago
Great, I was wondering why nobody had done a reliable comparison of these two until now. That completes my personal chart and reconfirms it for 4K with an AMD CPU:
850€ 4070 Ti new: 125%
650€ 6950 XT new: 125% (450€ used)
600€ 6900 XT new: 120% (450€ used)
600€ 4070 new: 100%
550€ 3080 Ti used: 118%
500€ 3080 12GB used: 112%
450€ 3080 10GB used: 104%
So if you don't care about ray tracing or power/heat limits, get a used 6950 XT (ideally with an AMD CPU for SAM). If you want ray tracing and 4K 60fps, get a used 3080 Ti: Nvidia does ray tracing better, and its higher VRAM over the 4070 gives a further advantage with high-detail textures at 4K/RT. It also stays ahead of team red even before DLSS, making it the more universal card.
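One way to sanity-check a chart like that is performance per euro. Here is a quick sketch using only the figures from the comment above (new prices where given, performance index with the 4070 at 100%):

# Performance-per-euro from the commenter's own chart (new prices,
# relative performance index where the 4070 = 100%).
cards = {
    "4070 Ti (new)":    (850, 125),
    "6950 XT (new)":    (650, 125),
    "6900 XT (new)":    (600, 120),
    "4070 (new)":       (600, 100),
    "3080 Ti (used)":   (550, 118),
    "3080 12GB (used)": (500, 112),
    "3080 10GB (used)": (450, 104),
}

# Sort by index points per euro, best value first.
for name, (eur, perf) in sorted(cards.items(), key=lambda kv: -kv[1][1] / kv[1][0]):
    print(f"{name:<18} {perf / eur:.3f} points/EUR")

By this metric the used cards lead and both new 40-series cards land at the bottom, which matches the commenter's conclusion.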
@djm7900
@djm7900 1 year ago
I'm happy with my 4070. I upgraded from a 1070 from 2016, and it was the best value for me in Canada in 2023. It was the only sub-$1000 CAD video card of the current generation.
@justinongstation8233
@justinongstation8233 1 year ago
Nice, I went from a 1060 to a 4070 as well. Good FPS and superb cooling.
@cryme4640
@cryme4640 1 year ago
@@justinongstation8233 Same, but from a 1060 6GB to a 4070 Ti. It's so good :D
@Mcatz7
@Mcatz7 1 year ago
Same here. From 1070 to 4070.
@ClearButOpaque
@ClearButOpaque 1 year ago
That's a better reason than mine. I upgraded from an RTX 2060 to an RX 6750 XT for $410. I had no problem with it besides AMD's terrible encoder, and the Adrenalin software was sometimes buggy. Then I "upgraded" to the RTX 4070 three months later. I wasted over $1000 on GPUs in 3 months. I could have just bought a 4080, a 7900 XTX, or a used 3090 Ti instead of all of this. I guess it's the mental barrier of spending a thousand dollars at one time.
@kilosera
@kilosera 1 year ago
I don't know if frame gen is such a great feature if it adds ~20ms of lag. I'm not that fast; when you ask me something, I sometimes answer after a few seconds, most likely with a "what!?", and yet I feel a massive improvement playing Forza on my gaming monitor at ~3ms versus my TV at ~40ms. It's nice that frame gen is an option, but I wonder how "single player" a single-player title has to be to actually be enjoyable with that lag. Maybe Nvidia just wants to quietly force-feed new gamers input latency so it can later move them smoothly onto its streaming platform ;)
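That ~20ms estimate is close to what a simple model predicts: the interpolator has to hold back one real frame so it can generate the in-between frame, so the added delay is roughly one real-frame interval. A simplified sketch that ignores generation overhead and render-queue effects:

# Simplified frame-generation latency model: the GPU must wait for the
# *next* real frame before it can display the interpolated one, so the
# added latency is roughly one real-frame interval. This ignores the
# generation cost itself and render queueing, so treat it as a floor.
def added_latency_ms(base_fps: float) -> float:
    return 1000.0 / base_fps

for fps in (30, 60, 120):
    print(f"base {fps:>3} fps -> ~{added_latency_ms(fps):.1f} ms added")
# base  30 fps -> ~33.3 ms
# base  60 fps -> ~16.7 ms
# base 120 fps -> ~8.3 ms

In other words, the higher the base frame rate before frame generation, the smaller the latency penalty, which is why it feels worst exactly where it is most tempting to use.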
@kelorednaxela
@kelorednaxela 1 year ago
I agree about the pricing. The 4070 should be $500. It's a 1440p card for the newest games.
@suspectedcrab
@suspectedcrab 1 year ago
Most of the people complaining about the 4070 have a 20 or 30 series card. I upgraded to a 4070 from a 1080. It's $100 less than a 3080, and I don't like buying used cards. If I had a card with 3070-level performance or higher, I'd have waited another 3 generations to upgrade. If I had to upgrade my GPU every generation, I'd sell my entire PC and just buy a console. I wanted the 3080 when it came out, but now I've saved $100 and have a much more energy-efficient card that will hopefully last me longer than my 1080 did.
@davidord5228
@davidord5228 1 year ago
I'll admit I caved in and bought the RTX 4070, but before everyone jumps on me, let me clarify why. I had about its price in my budget, and I was well aware of the other GPU options. However, going older-gen or AMD meant I would need to upgrade my PSU, which I didn't have the budget for; without compromising on performance, the 4070 runs perfectly on my 550W PSU. I also appreciate that Nvidia's cards and software are better supported for the creative work I do for uni (Blender, 3D, Photoshop), while also giving me 1440p gaming at decent frame rates.
With the added benefit of DLSS 3 frame generation, as a user I noticed smoother gameplay without noticeable artifacting, which is just great for the experience. I understand it gets stick for the price, and it isn't justifiable for some upgrades, but going from an iGPU to real workloads on this GPU was immense. I think the VRAM could be an issue; however, Nvidia has just shown new texture compression that is more detailed with smaller file sizes, so I reckon that will help all of their GPUs, and it may be why they have refrained from upgrading VRAM by large amounts.
I may be a singular case where the 4070 made sense as an upgrade, but I think you would agree it was the best option for me. Great to see Daniel continue to dive into how it compares, and thanks to those who responded to my last comment on one of these videos with upgrade suggestions.
@soapa4279
@soapa4279 1 year ago
I don't think anyone is going to stone you for buying one. The 4070 is still a good product; it's just named and priced wrong.
@BleedForTheWorld
@BleedForTheWorld 1 year ago
The 4070 is actually a great option as a GPU upgrade from two generations prior. The problem is the price, which is still very much overpriced at $600. Others are right that it should be around $400, but since wealth is relative, that number doesn't seem as much to some as it does to others.
@Dave-kh6tx
@Dave-kh6tx 1 year ago
Why admit you caved in? Needing it for Blender and Photoshop was a valid reason. Everything else you said, though, makes me believe you're full of it, or won't be very good in your major. "With the added benefit of DLSS 3, noticed smoother gameplay without noticeable artifacting"? You're just regurgitating words without knowing what you're saying. Just how much do you think texture compression will help an overpriced, gimped GPU with low VRAM and a narrow bus? Nvidia already has texture compression tech way ahead of AMD, and using AI to "compress" isn't really compression; it's using AI to add extra details that weren't there. I'm not flogging you for choosing a 4070 because of your PSU, but that was just another reason you added that doesn't make a lot of sense if you needed it for studies. Then again, add up all the other reasons aside from studies, and something isn't right here.
@davidord5228
@davidord5228 1 year ago
@@Dave-kh6tx All I meant by "caved in" was that I made the jump and finally decided on what I was getting; does that make it clearer? I thought needing to spend money on a higher-wattage PSU for less efficient cards was a fair variable to factor into my options (I've saved for a year and was trying to balance value and performance for what I'd use it for). I'm not a computer science student or a super experienced builder, but I have followed along with the latest news, which has nothing to do with my degree, so I don't think the comment about my competency in my degree was necessary. Due to my lack of experience compared to some, though, I will apologise if I've used a term or fact wrongly. I just wanted to share my experience and why I decided to go with the 4070. I do appreciate the points you made about price and compression, but I think my views as a user are justified when playing games and experiencing DLSS 3. Just my opinion, and you have the right to yours :)
@zxbc1
@zxbc1 1 year ago
The 4070 right now is massively cheaper than the 3080, even the 10GB version. That, plus the fact that you get DLSS 3 and massively lower power consumption, makes it a no-brainer to choose the 4070. I'm in the same boat: if I consider upgrading now with my old 550W PSU, I will end up paying at least $150 more for the 3080, and I can't even find a reasonably priced 3080 12GB anymore. The way the 3000 series cards were priced made them such poor value that the new 4070 ends up looking good despite also being poor value. Talking about performance per dollar based on MSRP is just not useful at this point.
@dogbiscuitninja
@dogbiscuitninja 1 year ago
The only advantage: the 4070 draws significantly less power.
@zdspider6778
@zdspider6778 1 year ago
The other advantage: availability, because not even scalpers want them. Let that sink in. Up until now, they were buying them at retail and selling them at insanely marked-up prices. Now there's no demand for them. People caught on that these are actually 60-class cards in disguise, sold at 80-class prices. Nvidia done fucked up with this generation.
@NamTran-xc2ip
@NamTran-xc2ip 1 year ago
Did you forget frame generation?
@zdspider6778
@zdspider6778 1 year ago
@@NamTran-xc2ip Frame generation isn't the magic bullet Nvidia wants you to think it is. It doesn't work with VSync. You need a G-Sync monitor (which costs $200 on top of the base price of a monitor), otherwise you suffer terrible screen tearing. It causes ghosting and all sorts of artifacts. It introduces input lag (because you're always 2 frames behind), which sucks for competitive games. And it doesn't work with all games; developers have to add it as a feature, because it needs motion vectors and the like, which isn't something it gets automatically.
@NamTran-xc2ip
@NamTran-xc2ip 1 year ago
@@zdspider6778 Competitive shooters don't need frame generation anyway. To get screen tearing you need to exceed the refresh rate, and if you do, you don't need frame generation. Ghosting, artifacts... I'm not pixel-peeping to spot those. I'd rather have 100fps with these "artifacts" than 50. The 40 series is insanely efficient. Whatever you say, man.
@cptnsx
@cptnsx 1 year ago
So glad at least YOU are telling AND showing the truth about frame gen: it's motion smoothing, and it will NEVER have the latency of REAL frames at the indicated FPS counter.
@KoolAidManOG
@KoolAidManOG 1 year ago
Another plus of the lower energy consumption is SFF PCs. I have a sub-10L case where I did a straight swap between a 3080 and a 4070, and the reduction in heat and noise is remarkable: from 82C to 68C in 4K Time Spy benchmark runs. This is an edge-case scenario, but it's certainly worth considering if power and heat are factors.
@spoots1234
@spoots1234 1 year ago
I'd sell my 3080 10GB for a 4070 any day. An overclock closes the gap at 4K, there's 2GB of extra VRAM, and my room would stop being a sauna. Plus frame generation is actually pretty cool; I use it a lot on my 4080. Power consumption is why I sold my 3080 Ti and never kept any 3090 Ti. BTW, I own a computer shop, which is why I get to play with a lot of GPUs.
@DrearierSpider1
@DrearierSpider1 1 year ago
You realize the power draw difference between a 4080 and a 3080 Ti is minuscule (320W vs 350W)?
@Xinvoker
@Xinvoker 1 year ago
Typical GT 1030 owners.
@FenrirAlter
@FenrirAlter 1 year ago
And then the computer shop clapped.
@thetruedarksoul168
@thetruedarksoul168 1 year ago
How do you become this uneducated about computer parts and still work at a computer shop?
@thetruedarksoul168
@thetruedarksoul168 1 year ago
@@DrearierSpider1 The 3080 has a 330W power draw, so IDK what this man is on about.
@ZeroZingo
@ZeroZingo 1 year ago
I would definitely go for the 4070, for the efficiency and DLSS 3.
@StubbySum9
@StubbySum9 1 year ago
From what I've been reading, DLSS 3 works on 30 series cards as well, and if you're on a 3080 there's no need for the 4070 IMO :)
@janbenes3165
@janbenes3165 1 year ago
DLSS 3 vs. Frame Generation: Nvidia is marketing Frame Generation as DLSS 3 while also renaming DLSS 2.x.x to 3.x.x. So, technically speaking, DLSS 3 works on the 3000 series, but Frame Generation does not.
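To make that naming split concrete, here is a rough sketch of how the DLSS 3 umbrella breaks down by GPU generation; the mapping follows Nvidia's public messaging and is a simplification:

# DLSS 3 is an umbrella term: Super Resolution (the upscaler formerly
# called DLSS 2) runs on all RTX generations, while Frame Generation
# requires the 40 series. Simplified mapping per Nvidia's messaging.
dlss3_features = {
    "Super Resolution (upscaling)": {"RTX 20", "RTX 30", "RTX 40"},
    "Frame Generation":             {"RTX 40"},
    "Reflex (latency reduction)":   {"RTX 20", "RTX 30", "RTX 40"},
}

def supported(feature: str, gpu_series: str) -> bool:
    return gpu_series in dlss3_features[feature]

print(supported("Frame Generation", "RTX 30"))             # False
print(supported("Super Resolution (upscaling)", "RTX 30")) # True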
@poison7512
@poison7512 1 year ago
DLSS 3 and frame gen in Returnal actually INTRODUCE stutter that wasn't there before. This is on a 4080, and I'm not even at 100% GPU usage. Terrible.
@charlesgoin8217
@charlesgoin8217 7 months ago
I appreciate this video. I have an EVGA 3080 12GB FTW Ultra Hybrid and wasn't sure if it was going to be worth moving up. Even though this isn't a comparison against the 4070 Super, it gives me something comparable to work with. I think I will wait till the 50xx series comes out, as you seem to have proven that the big jump would be the 4090, and if I am going to get a 4090 I will get a hybrid... and, well, wait for the price to come down on that.