There appear to be some misconceptions about DLSS, so I'll address them here (and in the next video).
- DLSS upscaling does not give the 4070 Ti a performance advantage over the 3080 or even the 7900 XT.
- FSR doesn't handicap the 4070 Ti's performance.
- DLSS is not faster than FSR. Visually it's generally better, but in terms of fps they're actually much the same. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-w85M3KxUtJk.html
- DLSS 3 should not be included in benchmark graphs. It's a frame smoothing technology; it doesn't improve performance in the traditional sense, and input latency remains the same.
- We used FSR instead of DLSS for the benchmarks as it provides highly accurate apples-to-apples data when comparing Nvidia GPUs against AMD GPUs, which don't support DLSS.
- The primary difference between FSR and DLSS is seen visually, NOT IN FPS PERFORMANCE!
Not sure FSR on all cards is apples to apples. You might think it's fair, but Nvidia cards work better with DLSS, and my Arc doesn't like FSR the way it does XeSS either. I'd call that red delicious to granny smith, maybe. This is giving AMD a slight nod imo... slight, but a nod.
If you can link me to some evidence to support that claim, I'll be sure to review it and re-evaluate, but thus far we haven't seen it in our own extensive testing. Not interested in the Intel GPUs though; strictly talking AMD and Nvidia here.
Ultimately both DLSS and FSR render at a lower native resolution and use different techniques to upscale the image. While you suggest the performance is the same, the fact is that DLSS provides better visuals, which means it could technically get away with rendering at an even lower resolution for higher performance while still matching FSR's visuals.
Agreed, but since you were only testing Nvidia against Nvidia (apples to apples), I would have liked DLSS 2.0+ enabled to see how the 4070 Ti would perform with ray tracing, assuming it has more tensor cores etc. Or I could be completely wrong :)
This was the benchmark we wanted! Thank you Steve and it really helps when you explain the cost per frame data in making a purchasing decision. I really was considering an upgrade to a 4070ti or 7900xt but it seems to make more sense to wait. Content like this is invaluable to the community. ❤
The 40xx series only makes sense in my opinion if:
1) you obviously have the budget to kill
2a) you are a new builder with said budget and want to get as much as you can out of your system without considering the bang-for-buck of your rig
2b) you have a super outdated system and, again, have a large budget to kill and want a huge upgrade
Source: upgraded from a 1060ti and i7-7700K to a 4070 Ti and 7700X. I happened to get a large bonus from work, or I would've been shopping the 30 series no doubt.
Happy 3080 owner here, purchased back in December 2020 and will be hanging on for the next generation. Looking forward to picking up a much more efficient GPU with 2x performance!
If you are not using ray tracing, a 3080 is more than enough to run any modern game at 4K 60. Save your money for more efficient cards; I mean, putting a 1000-watt GPU in your system is no small thing.
That’s the beauty of buying top of the line, you spend more upfront but potentially save more after years of not needing to upgrade. I was a very proud 1080 ti owner, and even though I have a 7900 xt now that 1080 ti will always have a special place in my heart. I hope my 7900 xt lasts me as long as it did.
Same here. Most of us bought our RTX 3xxx series cards while still on AM4 or LGA 1200, requiring us to build an entire new rig to reach the performance we're paying for. PCs are expensive. I'm waiting until mine blows up before I spend thousands on a new rig.
Good choice. Definitely not worth upgrading gen to gen. I was using a 16xx series card and didn't decide to upgrade until the 4070 Ti came out. I was originally looking at 3090s, which are more expensive, so this card was a no-brainer for me. I thought of getting a 4080, but I'm only gaming at 1440p, so the gains probably wouldn't have been worth it unless I was looking to switch to 4K in the future.
I don't think upgrading to the next series of graphics cards is reasonable, except for people who are too wealthy and don't know what to do with their money! Personally, I was recently *forced* to upgrade from a 1060 to an RTX 4070 Ti, which I bought on sale, so the price to performance was somewhat justifiable. So I think you can keep your card for at least 2-3 more years.
@@chilpeeps Wait until they release the RTX 4060. If the RTX 4070 Ti is 12gb, where does that leave the RTX 4060? Right back at 8gb, because regression is actually progression.
picked up a 2070 super in that short window in 2020 when everyone was selling off their cards in anticipation of the 30-series (mine was brand new, but heavily discounted due to low demand). i wanted to ride out the launch excitement and upgrade to a 3080 roughly half a year later when nvidia would catch up with the initial orders. welp, about that... it's been two and a half years since, and the same miners who destroyed that hope now left nvidia and amd with a whole bunch of surplus gpus, offloading their problem to them, and by proxy, to us. in 2023, we can finally buy a 2020 gpu for its 2020 msrp, or get a somewhat more powerful one at a proportionally higher price. i hate this so much. we were robbed an entire generation of an upgrade and we're still two years behind in price to performance (which is the one metric that counts for everyone not buying the absolute high-end, since then you could always upgrade to a more expensive card) -- which is better than lagging five years behind, like we did in 2021, but it's still not great. i hope battlemage proves powerful. would be so great if the long-term effect of this anti-consumer strategy by nvidia and amd would be that they opened the door for a third gpu manufacturer.
That makes two of us! I picked up my 2070 Super before RDR2 (PC) for 500€... and if I want more fps now I'd have to spend... 550€ 😐 for a... 3060 Ti 😐... what a bad time for the GPU market.
It all begins with your own demands for resolution and features! Today you can buy a used 5700 XT for below 180 bucks or a new 6650 XT for like 250 bucks and play 1440p, high settings, at 80-120 fps, which is quite nice I would say. With 27-inch 1440p monitors being a sweet spot, and consoles also targeting this internal res at 60 fps, it seems like a good time to rethink that 4K-with-RT-and-240+fps demand and open your eyes to the real bargains.
@@naamadossantossilva4736 for now. and it's mostly because their current GPU is quite weak, and it's hard to recoup the investment of an entire new architecture just on a low-end, low-margin product. in light of all that, the A770 and A750 are priced extremely reasonably when you consider they were supposed to compete with the RTX 3070 and production costs were scaled accordingly. it's very much a first-gen product, but it's a promising start. if battlemage lives up to what intel wants it to be, they will have a far easier time pricing it better. and they will have to beat amd and nvidia at every performance tier's price point, they can't count on mindshare yet. and even once they've fully caught up and become able to pull the same moves nvidia and amd are pulling, at least we'll have three manufacturers competing, not just two. more competition is always good for the consumer.
Aww yiss, 1440p halved is 720p so you could even run obscene small resolutions without blurry pixels :P 1:2 conversion still crisp just pixelated like console emulators. Or you could letterbox the 2560x1440p to 2560x1080p ultrawide if that's your thing - that may still free up some GPU headroom without doubling pixel size. A decent 27 inch monitor with a low threshold like 30 or 45 FPS for variable refresh floor and you'll be golden for a while. We can always reduce settings and resolution until Nvidia/AMD come to their senses.
trust me, cost is very important, unless we are talking about those that buy the best of every generation. if cost didn't matter they would just buy the 4090 (or the future 4090ti)
@@gutar5675 the naming doesn't really matter since Nvidia moved the goalpost. price is still the deciding factor for people. and those that don't care buy the best GPUs like the 4090/4090ti.
I have been upgrading every generation since pascal, cost is important and I do it by selling my old card. I will probably upgrade to 4080 a year from now or when it comes down about 25%
The 4000 series are good GPUs, but as you mentioned, the consumer is not getting more for the money, with the 3080/4070 Ti cost per frame being the same. Unless coming from a 2000 series or earlier GPU, a buyer will have to decide if the performance boost is worth the cost. I'll be passing over the 4000 series and waiting to see what the 5000 series offers.
What 20 series though? A 2070 Super is faster than a 3060... a 2080 Ti is faster than a 3070... it really depends. I think if you had a 10 series, then now is the time to move, and even then, a 1080 Ti will match a 2070 Super, so it's not really a desperate situation there either. I upgraded from a 20 series to a 40 series but also went up in tier... first time I've ever owned an 80-tier card, actually.
@@GregoryShtevensh Why would you upgrade to a lower tier next gen? Why compare a lower tier with a pre gen high tier? Surely if you had a top tier card then , wouldn't you buy a the next top tier gen?🤔
@@madb132 You tend to see upgrades when the performance is 20 to 40% better... a lower tier may be cheaper... if you're on a 1080 Ti, a 4070 Ti will be a huge upgrade, so I guess that's why some people do it. Personally I went from a 2070 Super to a 4080, having started at a GTX 760, then a 1660 Super, then the 2070 Super, and now the 4080. If I did get a 50 series card it would likely be an 80 or 90 tier, but it's possible I'd consider a 70 tier.
Your logic doesn't make sense here. For example, if the 3080 is $100 for 100 fps and the 4070 Ti is $120 for 120 fps, you're paying proportionally more for proportionally more frames, before even accounting for inflation. So it's actually not as bad as people think, and people forget how much more power efficient the 4070 Ti is.
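The cost-per-frame reasoning in this thread is easy to sanity-check with a few lines of Python; the prices and frame rates below are the hypothetical numbers from the comment above, not benchmark data:

```python
def cost_per_frame(price_usd, avg_fps):
    """Dollars paid per frame of average performance; lower is better value."""
    return price_usd / avg_fps

# Hypothetical numbers from the comment above, not real benchmarks:
# a $100 card doing 100 fps vs. a $120 card doing 120 fps.
older = cost_per_frame(100, 100)
newer = cost_per_frame(120, 120)

# Identical cost per frame: you pay proportionally more for
# proportionally more frames, i.e. no generational gain in value.
print(older, newer)  # 1.0 1.0
```

Identical cost per frame is exactly the complaint elsewhere in this thread: the 4070 Ti isn't worse value than the 3080, but it isn't better value either.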
@@GregoryShtevensh What? The 3060 Ti is faster than the 1080 Ti by at least 20 percent in recent titles, and the 2080 Ti is not faster than a 3070; they perform similarly. Do your research and check benchmarks before talking trash.
@@The-Dark-Tower You're confusing a GPU (Radeon RDNA 3) with a CPU. X3D is for CPUs (Ryzen Zen 4); it's the AMD Ryzen 7 7800X3D CPU coming out in April. There is currently no X3D GPU to be released, and there are neither solid leaks nor rumours of one coming.
@@ChinchillaBONK I don't think he is confusing anything. There are rumors of a 7900XTX3D because the GPU has the exact same connectors the Ryzen CPUs are using for their 3D Upgrade. But from my knowledge these are just speculations and there isn't anything solid that confirmed this.
This really all boils down to economics for the common consumer. We all know that next generation hardware will be improved, but most of us build only once in every 4 to 5 years. I purchased a 3080 (12 GB) card last year at this time and it will have to suffice for another 3 years. Upgrading yearly is far too costly for the typical user.
Having the 12gb version will probably give you another year over just the 10gb version which will surely run into Vram issues in the next couple years (already is in Hogwarts Legacy, even 12gb isn't enough at 4k. 10gb vram will start to have issues at even 1440p sooner rather than later, especially with Ray Tracing and Hd Textures etc). You can sit comfortably for 2-3 years, and then maybe get a couple hundred for your used 3080, and then purchase a 6xxx card from Nvidia or a 9xxx card from Amd (maybe Intel will pull a rabbit outa their hat and have a worthy high tier card by then. Seems unlikely though). Alternatively you might be able to get a good deal on a 5xxx Nvidia card or a 8xxx Amd card right before the next generation launches, just be following price to performance avidly in 2025 to 2026 to get a scope of the landscape in anticipation of a buy.
managed to get a 3080 10GB from EVGA’s queue system back in spring of 2021, and I feel super fortunate! It seems every other tier recently has been a miss (the 20 and 40 series vs. the 10 and 30 series). Hopefully with the 50 series things get better!
My problem with this generation is that where I live the 4070ti and the 3080 cost the exact same. So even though everyone hates the 4070ti, it's really the only option I have.
This comparison is meant for 3080 owners wondering whether an upgrade would make sense. When you're looking to buy a new GPU (not just upgrading from a 3080), it makes no sense to pick up a new 3080. A used 3080 might make sense at a good price, but honestly I don't think the $100-200 saved going used would convince me to get an (ab)used card that, when it fails, is a total loss without warranty.
You could get an RX 7900 xt for a little cheaper and slightly better perf. With the added benefit of 20 GB vram. HUB recommends it over 4070 Ti at current pricing.
@@rzarectz That's true. I used to have the 6900 XT, but returned it because I had so many problems with it. That made me a bit wary of getting an AMD card again, but the option is definitely there.
@@TCOTEevari What sort of problems did you run into? I'm planning to get an AMD card this generation. Nvidia tax here is crazy, the 4070 Ti costs north of $1000.
Given how many games are already pushing VRAM limits, I find myself more and more pleased with my decision to get a 6800 XT. That 16GB is perfect, and I got it for only £70 more than a 3070 would have cost me at the time. Nothing I've seen from AMD or Nvidia this generation is remotely appealing, and the frame rates I'm getting at 1440p are excellent.
I have a 6700 XT and am also at 1440p. I haven't seen a shadow of VRAM trouble, since it simply can't do RT with satisfactory results. The 7900 XTX, however, can do Ultra + RT at 1440p and get above 90 fps even without FSR, and it won't run out of VRAM until long after performance has become an issue. For $1000 I think that's a fair deal. With RT it's basically a 3090 Ti in performance and VRAM at half the price, and without RT it's going to be an absurdly good performer at 1440p for many, many years.
I’m looking at the 7900XT, considering it can be found for actually $100 below MSRP instead of above it. Not bad value (relatively) once it’s almost $800.
The 6800 XT seems like a 1440p beast. Good value for money. I'm running a 1080 and seriously considering a 6800XT (Nitro+) for 670€~. 16GB VRAM and roughly 3080 performance. Lovely. The 7900 XT would be 330€ more (until the price drops further - possibly 7800 XT as well, but who knows when and such).
@@Battleneter A 3080 won't max out settings at 4K anyway, or even come remotely close; the GPU is way too weak. No one in their right mind will use an 8-10GB GPU for 4K gaming. These are 1440p solutions.
Nvidia did it right again. Looks like if I want a decent 4K upgrade from my 3080 10G, I'd need to pass up this 4070 Ti at $850 to get a 4080 for $1,300. They know how to get all the money from people.. I'm pretty sure they have a team of people who do nothing but set prices and fps caps on products so they can force you to upgrade their way.. all their products are designed and structured just right to make you spend more money.. I'd love to know the true manufacturing cost difference between a 4080 and a 4070 Ti, I bet it's like 20 bucks..
The 4070 Ti uses a tiny 295 mm² die, smaller even than the 1070's 314 mm² die, and loads cheaper than the massive 628 mm² 3080 die. They could sell the 4070 Ti for less than half its current price and still make an OK profit.
It seems you're smart enough to decide. Act smart, be smart, and within 10 years you'll get your affordable graphics card. Start saving now, in case prices never change.
The best part is how they took dies that should have been the $400 4060 and $500 4070 and used them to become the $900 4080 12G and $1200 4080 16G, respectively. Then when people revolted they were ready to bow their heads and say, "Oh, sorry. We'll cancel the 4080 12G! You're welcome!" and replaced it with an $800 4070 Ti. This let them sneak the 4070 die in as a 4080 16G and only knock a cool $100 off the 4060 die posing as a 4070 Ti instead of a 4080. All of this cleverness will lead to a $1350-1400 4080 Ti and a $1999 4090 Ti. Meanwhile, AMD's competitive spirit leads them to not directly confront Nvidia on any price point and instead narrowly dodge them everywhere, while also producing so few cards as to be barely visible on the Steam hardware survey. I wonder if Nvidia is paying AMD at this point to play pretend competitor while AMD focuses on making console chips instead.
3080 10gb is still a great card. It can be had regularly for around $400 used. I picked one up for $350 with a year and a half warranty remaining, very happy coming from a 1080ti.
The 3080 will do everything you want unless you're trying to play everything at 4K ultra with ray tracing turned to the max, which most people who know better will never do. I barely turn on RT with my 3080, and if I do, it's at the minimum. RT is cool, but even minimum RT still makes your game look awesome.
@@delanescott7872 The 3080 will absolutely slaughter 1080p/1440p, yeah. 4K is doable with a few adjustments (it also depends on the game) but can be tricky for almost every card out there, except the 4080/4090 and top AMD cards, which are just flying smoothly. And even then the 3080 can be solid for 4K gaming if you just play at high without RT or with minimal RT, or you can use DLSS and still perform well at max settings. People said that card was overkill at release, and it still shows.
@@delanescott7872 As I've said before, playing any game at FULL MAX settings is never a good idea, since most games aren't well optimized. HUB's optimized settings for Red Dead 2 are a perfect example of that.
What's the best laptop around 2500 USD for a working professional?
Screen size: 17.3 or 18 inch
Need performance near a 3070 Ti, and an i7/i9 processor
~Data science and AI graphics
~Computer modelling and networking
~Virtualization
Should I buy this gen, or 12th gen with a 3070 Ti?
For this price you can enjoy a PS5 and PSVR 2 😂 But don't buy a PS5 right now; Sony is planning to launch a PS5 Pro. Go for it, console games are well optimized and always look pleasing.
As always, you've done excellent work, thank you! I've been thinking about upgrading from an RTX 3080 10GB to an RTX 4070 Ti, but in Europe the 4070 Ti is priced in the range of 1k to 1.2k USD, and at that price it doesn't make sense to me, even though I'd be able to sell the 3080 for 500-600 USD. Paying 100% more for 20% more performance is definitely not worth it.
Happy 4070Ti owner here, upgrading from a 2060 Super. Absolutely fantastic card for 1440p, feels like getting a cheaper and more efficient 3090. Still, the price could've been waaaaayyyy better though. If you already have a 3080 you don't need to upgrade, especially if you're also on 1440p.
The problem is the VRAM and the 192-bit bus... I don't upgrade often; my current PC is almost 9 years old... I just worry about the stupid limits of this card for the insane price it is.
As the owner of an RTX 3080, what I can say is that the only card worthy of an upgrade is the 4090. With that forbidding price, I just have to say to Nvidia: see you (maybe) next generation.
Just wait till 5xxx or 8xxx from Amd before an upgrade. Hopefully there will be something solid by then at under 1k that is worth upgrading to (ideally with similar to 4090 performance).
@@jaroslavmrazek5752 Seems like it will require a miracle for the 5080 to be under 1k (let alone anywhere near the $700 MSRP of the 3080). Nvidia needs to see the 4080 rot on shelves for them to reconsider their 70% price jump on the 80 series.
@@jaroslavmrazek5752 Perhaps one will be able to get a 5070ti with 16gb of vram and a larger memory bus (192>256) at a similar price to the 4070ti launch msrp. Thats probably the best hope as it stands now. If the Gpu market slows to a crawl though (and sales are flatlining relative to 2021-2022), Nvidia might reconsider its outrageous pricing for their next gen of cards. Generally though when products creep up in prices, companies are reticent to lower it in future generations, since they want to impose a permanent high-price so they can get higher margins. The days of quality 1080ti and 3080 at $700 launch are probably gone.
Supposedly there is close to zero performance change, which makes sense since it's one of the few times the only real difference is memory size. It's similar to how there's no performance change between the 3080 Ti and 3090.
Sharp eye; he is not mentioning that the 4K results are hitting the GPU's VRAM wall. That is the issue with the much lower 4K fps results at Ultra/Epic settings.
Very relevant to me. I have a 3080 10GB that I was lucky enough to get just weeks after launch and just $20 above MSRP. I have assessed the new GPUs and there is just nothing worth upgrading to due to the price. I think the 3080 will be perfectly fine until RTX 50/RNDA 4 cards are released. Upgrading now would just be a waste of money and make it less practical to upgrade later. I'll continue to save and plan an upgrade in late 2024 early 2025, unless 4080 Ti is miraculously a great value.
Same and it performs great at 4K even when you slab on Ray tracing, just turn on DLSS and you’ll easily be hitting 70+ fps average in AAA titles. I got a 73 fps average running the new RE 4 Remake Demo at 4K Max Settings with Ray Tracing. 50 series will likely be better value and make more sense for 3080 owners
Yep 10GB is fine for 4K and below and that's largely because current gen consoles have 8-10GB available from the shared ram that can be used as VRAM after the game data itself and OS, and this is the target for most developers. 8GB like on the 3070 was always too marginal however.
@@Battleneter Consoles rarely use native 4K in AAA games, and native 4K is exactly the resolution that will need more video RAM in future games. Tbh I have a 4070 Ti and upscale with DLSS for most games I play.
It's interesting how much they kneecapped the 4070 Ti's performance. At 4K memory bandwidth is extremely important, and neither GPU is perfectly configured. I think the whole Big Navi and Ampere lineups will run into memory issues (except the 3090), because they either don't have enough VRAM (3080) or enough bandwidth (6800 XT) for 4K. DLSS and FSR do remedy this, because bandwidth and memory size requirements drop, but it's still a shame that performance on such high-end cards is already this bad after just 2 years without using tricks. The 1000 series aged a lot better and was still going strong after at least 4 years. I think bad optimization is also to blame, because graphics aren't that much better than they were 4 years ago.
@@richardhunter9779 Consoles also have games coded specifically for their hardware configuration. Good luck optimizing games for thousands of CPU/GPU configurations on PC.
Great review. It would be amazing to see some VR performance data for the new generation too, if you can test it, since those benchmarks hardly exist. BTR did the 4080 and 7900 XTX; however, no one has tested the 4070 Ti and 7900 XT in VR.
I agree - that 1080ti is still excellent. A few months ago I finally ditched my 1080 (non Ti, but still delivering at 1440p, though struggling at times) for a 7900 XT, and it provides a massive increase in performance. While it is more expensive than it should be, it was a lot cheaper than the rest. I was put off Nvidia by their shocking greed and dirty marketing tricks. I also have to balance power usage, which is why I didn't go the XTX route. I'm also more interested in raw power with no upscaling, and I'm not currently too interested in RT, though it 100% is The Future. For your 1080ti, the biggest issue is stomaching today's shocking upgrade pricing. Then again, it is still an excellent performer.
@@ChrisM541 exactly. I keep thinking i should upgrade and i'm sure it would be a massive performance upgrade, but not worth £1000 . I think i paid £450 for my second hand 1080ti over 5 years ago, replaced the pads and paste and it has never put a foot wrong @ 1440p
1060, but same feelings. I also tend to play older games, and am locked into a 1080p monitor. But yeah: the 20 series gave no performance uplift, the 30 series was stuck with pandemic/mining-craze scalpers, and with the 40 series the manufacturers took the scalper margins for themselves.
I was happy with my 3080 anyway, but this video reinforced my decision. As for missing out on DLSS 3 frame generation, hopefully AMD's FSR 3 frame generation will be usable on my 3080.
@@Orcawhale1 If the guy can afford it and it'll noticeably improve his gaming experience, I think it's completely OK. He can sell his old card, so someone else gets a good GPU while he's enjoying a new one.
@@Orcawhale1 At least 2 generations? No that's way too long for some people. If you need the performance and have the budget what's wrong with upgrading every generation?
Great content. Holding onto my 2080 Super for a bit longer. Still satisfied with its performance on a 165Hz 1440p monitor. When good 4K monitors become much more affordable, or a less expensive mid-range GPU can max out my current monitor with games I play, THEN I'll reconsider upgrading.
@@aaronweir474 Sadly, I killed it with static when I was cleaning out the case last year. I have a 4070 now (not a Ti or Super) with the same hardware otherwise. Seems to run about twice as fast in most games with the same settings. Can't complain.
@Hardware Unboxed Hello Steve. Umm 🤔 between the 7900 XT and 4070 Ti, in your video comparison you recommended the 7900 XT. Do you still stand by that decision?
Thank you for this video, as a 3080 owner (playing ACC and CP2077 at 3440x1440) who doesn't want to spend more than £800 this is a great watch. Around 5% improvement across those 2 games. I'm sticking with my £550 EVGA FTW3
Same GPU and same resolution here, and although I'm looking up prices of the 4080 and 4090 nearly every day, I won't buy them and will just wait for the next generation. I mean, the 3080 still rocks, and the difference between Ultra and High/Medium is not that noticeable ;)
@@bobomb7504 Yeah, exactly. It's not worth upgrading unless I'm seeing a minimum 25% improvement, so it's going to be a 4080 or 7900 XTX, but I'm not paying over a grand for either of those. They need to be significantly better in price to performance than the 4090 for me to even start considering.
Just a quickie on lowering texture quality settings: the difference between ultra and very high in most games is negligible, in fact even at 1440p it's indiscernible, and it's usually enough to give the performance boost needed to make a game run smooth.
As a 3080 10GB owner I'm still happy with its performance. I only game 30-60 mins a night on a 1440p 144Hz monitor, and it's more than enough at that resolution. My rule is a 50% performance uplift minimum to jump. I upgraded from the 5700 XT, which was an 80% jump, and I had a 960M laptop as my daily driver before the 5700 XT, so that jump was like 300%. 20% is nice but too minor in most cases.
Agreed. 50-100% is the upgrade zone. You maximize your investment for the years you have your GPU and then when you do upgrade, you'll actually notice a massive difference in performance. Win win imo.
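The "minimum uplift" rule of thumb in these two comments can be written down in a few lines; `worth_upgrading` and the fps figures are purely illustrative, not anything from the video:

```python
def uplift_percent(old_fps, new_fps):
    """Relative performance gain of an upgrade, in percent."""
    return (new_fps / old_fps - 1) * 100

def worth_upgrading(old_fps, new_fps, threshold=50):
    """Apply the '50% minimum uplift' rule of thumb from the comment above."""
    return uplift_percent(old_fps, new_fps) >= threshold

# A ~20% gen-on-gen jump fails the rule; an 80% jump (like the
# 5700 XT -> 3080 move mentioned above) passes it.
print(worth_upgrading(100, 120))  # False
print(worth_upgrading(100, 180))  # True
```

The threshold is a personal preference: the commenters above use 50%, while others in this thread would jump at 25%.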
I bought a 12GB 3080 Suprim X in October 2022 for 1050€, and I don't regret it. The only thing I can complain about is power consumption, and I hope I will use it for more than the next gen cycle.
@Kamil Fingr Got the same 3080 12GB Suprim X in August for $1199 AUD. I've never seen it under $1500 before or since here in Australia. Yeah, the power draw is crazy, thanks to the Samsung node; TSMC is always superior. I have mine set at 1850 MHz @ 0.825 V. Max power draw is at worst 250W, and temps max out at 62C (summer in Australia).
Bought my 3080 12GB Strix for 699 from Newegg about 6 months ago. I was a bit hesitant since the 40 series was coming, but I think I made a good choice; it's more than enough for 1440p and it looks amazing. I will keep this card for years to come.
I have a 3060 12 gig; which card do you recommend for the upgrade? I currently have a 2K monitor but I'm thinking about buying a 4K one. Which GPU do you recommend?
Great video! I was fortunate to get a 3080 10G last year at almost MSRP during a sale. Love it! Now, I've also got a computer that runs a GTX 1660. Curious what you think of that card for current games? How long do you think it will still be able to perform? I know you've put the 1660 in some benchmark data, but not all. The questions are open for anyone, by the way.
@@keonxd8918 GTX 480 -> GTX 660 -> GTX 770 -> GTX 1070 Ti -> RTX 3080. All bought second hand, most cards been used for years and never had any issues with any of them. Always prefer second hand when possible.
Just picked up a 3080 for $400, basically kept brand new by the first owner. I'm coming from a GTX 970 so... Can't freaking wait to include it in my next build.
I'm running a 2080 Ti with an i9 9900K. I'm waiting for the Ryzen 7800X3D to release to upgrade my CPU (+ motherboard and memory) but, for the price, I don't see any of the GPU upgrades being worth the cost. I'll wait for the 5080. I can't tell you the fps, but Hogwarts on ultra with RT was running fine for me.
I've heard a lot of that from people who've played the tutorial/first hour of the game. With RT on ultra you will get 20-ish fps in many zones, and frametimes are worse too. Not worth it.
I suppose I have to ask: what is fine for you? I was running Hogwarts on ultra with RT on a 3090 Ti, and while I was getting upwards of 100 frames in some places, in the important places I was as low as 40. 40 being, frankly, unacceptable.
@@vermillionaether I think 60 is still the absolute minimum. You would still feel a drop from 100 to 60, but far less than to 40, and using something like RTSS you could cap it at 60 with flatlined frame pacing for the ideal smooth experience. Hogwarts Legacy is the latest in a long run of horrible recent console ports. I fear no level of enthusiast hardware will mitigate the coming UE5 storm. Let us hope some developers actually care and develop for PC alongside console independently.
@@nipa5961 At 1440p I find 8GB to be fine. It won't max every game, but it's fine. 12GB is more than enough at 1440p, and probably more than enough at 4K outside of particularly demanding games.
Not sure it's really that powerful. It already has issues at 4K, and those issues will likely crimp performance at 2K in future titles as well. Unacceptable for an $800+ GPU that's pretty much already going obsolete due to its lack of memory bandwidth.
Hello Hardware Unboxed! Is the Ryzen bottlenecking the GPU for any reason? Or why those numbers in Fortnite? They seem very low for a 3080, and even for a 4070 Ti. What am I missing?
It almost always makes sense to skip the next generation of GPUs when gaming. Considering current pricing and DLSS technology, it makes even more sense. I bought my 3080 at launch and plan to keep it until 1080p at medium settings with DLSS set to Performance nets me less than 60 fps. I think I have years of use left and may not need to build another rig until the RTX 6xxx series comes out.
Why are you using FSR instead of DLSS? Both are Nvidia cards; it would make more sense to test them with DLSS, not FSR, unless I'm just missing something.
I guess I'll be holding on to my 3080 a bit longer. I am interested to see how the 7800X3D will be, but I still think the 7900 XTX is a better card overall than the 4080.
I mean, there's nothing to actually suggest that here. Remember too, 4000 series cards have tech access the AMD cards don't. This is...a really bad review.
@@asmodeusr1578 Nvidia has the upper hand at producing more fake frames, congrats. I'm a 3080 owner, and this whole RT and DLSS fakery is garbage.. a complete waste of silicon.
@@iHadWaterForDinner Those technologies are simply icing on the cake. You can leave them off if you want and still have a monster GPU. I have a 4090 and nothing stresses it out with all of those off, except poorly optimized games. It's incredible, and laughable with the DLSS options enabled. I can see this being an absolutely wonderful benefit in a few years when games really start to tax these cards. I don't really use RT yet unless it has a negligible impact on frames, but Nvidia is thinking ahead with these techs, and I think they're better to have than not have.
Quite content with my RTX 4070 Ti as someone who upgraded from a GTX 1080. The value proposition is pretty bad, but it's not like I had much of a choice in that regard.
The CS:GO Source 2 release is imminent. It'd be freaking awesome if Tim would do one of his old optimization guides when it drops, even if it's only a one-off. There are a few of these for the current release and they have pretty big view counts. And if the current game has 1.3 million players, imagine how many there will be when the new update hits! Many of those will be wondering how they'll ever get good again, as 200 fps vs the old 400 fps is literally unplayable. That would be awesome, thanks for your great content :)
Since I bought my 3080 used last June for about 30% of the cost of a decent RTX 4070 Ti new right now, I'm still fully content with it, and if possible I'll try to keep it until there's a good used deal on an RTX 40 series card, or I might just skip it entirely for an RTX 50 series. It does the job at 1440p; I don't have much need to swap it apart from the VRAM slowly becoming a problem in some games.
This is the comparison I've been waiting for. My 3080 10G has been an incredible workhorse. I think it will remain so for a while, at least until I have a 4K monitor.
Another cost to consider at the moment is energy consumption. My friend bought a 3080ti and ended up exchanging it and paying a bit more for a 4070ti because it’s far more efficient.
The 4070 Ti just killed the 3080! Keep in mind this card runs WAY MORE EFFICIENTLY, so less heat and lower electricity costs (100W difference), and it has 2GB more VRAM than the 10GB 3080. I'm getting this for sure.
If you feel bad about the pricing and value of the 4070 Ti now, just remember this was the 4080 12GB that Ngreedia tried to sell at $900. Yeah, 30% more price for 20% more performance. GG Ngreedia.
@@giglioflex He probably doesn't pay for his PC, has stock in nGreedia, or is just dumb enough to support a company that just wants more money out of his pocket.
I regret buying the Riva TNT, GeForce 2 MX, and 4200 Ti years ago. After that I have only bought laptops, some with AMD and some with Nvidia graphics. Now I have a GTX 1060 6GB in my laptop and I play games that work with that. I will not buy a new laptop for years. If I had known how this company (and AMD) would treat its customers in the future, I would have bought something else if possible.
2:40 The bigger problem is that the RTX 40XX uses much smaller silicon, but the price is a lot higher compared to the previous generation. The Founders Editions of the RTX 40XX had many empty spots on the PCB, because Ngreedia is just playing greedy and selling less for marked-up prices. We are buying less silicon for much higher prices and fewer components on the PCB; Ngreedia is fooling gamers left and right, and there is no end in sight.
Exactly, the premise of this video is stupid. If you have the best GPU of the last gen, the only upgrade for you should again be the best GPU of the current gen, not the mid-tier GPU of the current gen, which is supposed to be on par with the best of the last gen 😀
This really justifies the RTX 3080 more since you can get it for much cheaper currently. Around a 300-400€ price difference and even more when getting one 2nd hand.
@@SweatyFeetGirl So in other words, you're better off waiting for the 5060 Ti because it'll perform the same (as the 4070 Ti) but be more efficient, saving you money over time.
@@tomstech4390 So in other words, anyone on a budget is better off just waiting 10 years to upgrade since the GPU rat race is a joke and whatever is good now is trash in a year's time.
@@SweatyFeetGirl In what world? Even at Europe's current very high kWh cost, averaging 0.25 euros per kWh, if you ran an RTX 3080 (12GB version) 24/7 for 365 days at its 350W rated TDP, that's going to be about 150 euros more in energy costs compared to an RTX 4070 Ti. I don't see how anyone would be running the 3080 at 350W, 24/7, for 365 days, and even then, if you get a 3080 for a fair price, it still won't put you above the RTX 4070 Ti's total cost.
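The worst-case figure quoted above is easy to sanity-check. A minimal sketch, assuming the rated TDPs (350W for the 3080 12GB, 285W for the 4070 Ti) and the quoted 0.25 EUR/kWh; these are the comment's own figures, not measured draws:

```python
# Worst-case annual energy cost gap: RTX 3080 (350W TDP) vs RTX 4070 Ti (285W TDP),
# both running flat out 24/7 for a year at 0.25 EUR/kWh.
TDP_3080_W = 350
TDP_4070TI_W = 285
PRICE_EUR_PER_KWH = 0.25
HOURS_PER_YEAR = 24 * 365  # 8760 hours

# Extra energy the 3080 draws over the year, in kWh
delta_kwh = (TDP_3080_W - TDP_4070TI_W) * HOURS_PER_YEAR / 1000

# Extra cost at the quoted electricity price
annual_extra_eur = delta_kwh * PRICE_EUR_PER_KWH

print(f"Extra energy: {delta_kwh:.1f} kWh/year")   # 569.4 kWh/year
print(f"Extra cost:   {annual_extra_eur:.2f} EUR") # 142.35 EUR/year
```

That lands at about 142 EUR/year, consistent with the "150 euros" ballpark above, and only under the unrealistic assumption of full-TDP load around the clock.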
Yar, that 192-bit bus width is purposely there to hobble the longevity, or "future-proofing", of the 4070 Ti. A GPU that was capable of 60-80 fps on Ultra with RT enabled at 4K in 2022 will struggle to reach 60 fps in 2025's newest games, making the user feel the need to either lower settings or buy a new card. IMO this is all designed on purpose. I cannot come up with any other possible reason to use a 192-bit bus instead of a 256-bit bus except to build in literal pre-planned obsolescence.
Loving my 4070 Ti. Huge upgrade from my RX 6800, and it pulls the same wattage. What you forgot to mention in your video is how efficient the 4070 Ti is, which is a consideration with these current energy prices.
I wanted to mention this as well. The extra price for the 4070 Ti would be worth it for this reason alone. And this is the reason why I won't buy a CPU from Intel.
@@nedelchokadiev1999 I actually checked this with an energy cost calculator and the difference is indeed only about 1€ per month, so this argument goes out the window.
It’s a real shame about the super narrow memory width and low memory capacity on the 4070ti…Nvidia essentially castrated it and made it obsolete before it even went on sale 😮
I still have a 3080 and am heavily considering making my next card an AMD card. Nvidia is crazy for setting these prices so high, even for the mid-range.
So I'm guessing the upcoming 4070 will be at 3080 performance? Hopefully that GPU will be priced right, but I honestly doubt they would do that. Nvidia is greedy as always.
People will be praising the 4070 for 600 while it's going to be even worse value than already bad 4070 Ti. But this is what Nvidia wants: dumb sheep blindly buying anything they shove to the market
@@GewelReal It's a much better price vs the 4070 Ti, 4080 and 4090. The majority of people are on a limited budget. Don't expect GPUs to come down in price again anytime soon; due to inflation, everything has gone up.
I'm keeping my eye on the market as I intend to make an upgrade soon (I've been running a GTX 970 system for about 7-8 years now). I have a 1440p 32" monitor, which is more than big enough for me, and I'm very happy with the pixel density, but I have to steer clear of recent AAA titles or suffer significant graphics reductions to remain in a playable state. While I find it interesting that the 3080 catches up in performance at 4K (likely down to the aforementioned memory bandwidth issue), I don't think people like myself buying into midrange systems should really be looking too much into 4K performance, especially since the average is still 1080p. A side note criticism about the average performance graphs at the end: I feel it would have been more indicative to show the lower performing card's average FPS vs the higher one's rather than a percentage difference, because with some games running at 300 FPS the percentage difference is moot, and showing that both cards run at a level far exceeding human perception would better highlight the reality of them having similar performance, rather than suggesting that you should buy a 3080 over a 4070 Ti because you see no benefit while running CS:GO. This is literally my biggest criticism, and it's not even that big of a deal; just food for thought. Love your content, and I look forward to more comparisons to help make me a savvy shopper!
Looking at this comparison, the 7900 XT is definitely more future-proof than the 4070 Ti. And anyone who has a 3080 12GB should be happy and can easily ignore the current generation of GPUs.
I feel the same. I had a 1660, so I upgraded to a 7900 XTX. If I were a 3080 owner, or even had something like a 6700, I wouldn't be interested this time, that's for sure.
Here in my country the RTX 4070 Ti sells at a blown-up price of $950 to $1000 (equivalent in USD). I was too tired of waiting for the 4070 or the 4060... and then I figured, with the current 4000 series pricing, there's no way I'd get a 4000 series card at $400. I just got an RX 6700 XT this week for $285, and now I'm leaving this comment before I boot up Cyberpunk 2077 to finally run at a constant 90-100 fps at 1080p max settings. No RTX of course, don't need it.
The one nice thing about the 4070 Ti is that it doesn't have near the same power demands as the 3080, which, unless you want to use your PC as a space heater, can be beneficial for some people. My 3080 is in my living room PC because it's just too damn hot to run in my small office where my gaming PC is at without setting the power limit to like 40% or getting a separate AC unit just for my office. The 4070 Ti is the perfect answer for this situation, whenever pricing becomes more favorable, since it gives better than 3080 performance, for similar to 3070 power usage.
@@ironmike755 what a ridiculous question. Even undervolted (which my 3080 is), that only saves about 40W (any lower and it's unstable in many games) and is still drawing more than 100W over the power draw of an undervolted 4070 Ti, or 25W more than the limit of the stock 4070 Ti which it never even hits anyway.
@@davidbergner4916 my 3080 is unstable that low and needs 887mV for 100% stability across all games, so it pulls just over 300W. I have tuned some that could run stable at 850mV, but mine isn't one of them. I don't have a 4070 Ti of my own (yet), though I have completed a couple client builds with it, though only one paid for tuning, so I only undervolted that one card. But stock, most games run in the 200-240W range on the 4070 Ti (despite the 285W TDP), and undervolted, that range shifts down to about 175-200W. Synthetics like any of the higher end 3DMark suite will push that number higher, both in stock and UV form, but actual game engines don't use the full power of the card, and those are the power numbers I care about. The 30 series cards will use the full card power in game engines, so there's the initial efficiency difference just with the 4070 Ti at stock, which is further widened by undervolting. So I'd say you could likely save another 80-100W depending on the UV level the new card takes, over the 3080. That said, it's still probably not worth the upgrade unless you just cannot stand the heat output.
Nice video, interesting, thank you! However, I'm currently using a 3080 VENTUS 3X 10G OC as my main driver, and I build about one system a week as a side job. So far, every 4070 Ti I've tested, and every YouTube video that compares one across multiple benchmarks, shows that my (barely overclocked) 3080 runs games within 5% of the 4070 Ti and even occasionally beats it. Did I somehow end up with an exceptional 3080? I wonder why I see this while you guys are seeing a significant improvement with your 4070 Ti over the 3080 10G. No accusations or complaints, I just really wonder...
Went for the TUF 4070 Ti as the upgrade from my ol' 1070, and as I paid just under $800 Canadian for a new one, I'm way ahead. The main reason I went this way over the 3080 was that a 4070 Ti will last an additional generation over the 3080 before I might have to dial back on game settings. It was an easy decision at that price.
I camped out the FE 3080 drop on Scan here in the UK. The fact I got it at retail means I have zero regrets with this card, despite all the bleating that "ur card is obsolete with 10gb of ram bro".
I have a 3080 and will certainly be waiting until the 50 series before I consider moving up. I'll probably spec out a new box (faster CPU, more memory, bigger PSU, etc.) for it.
Same. I have a 3080 10GB Master with a Core i7, and I use a 4K TV as a gaming monitor. It runs everything smoothly, just not fully maxed on some settings, but honestly playing Cyberpunk on close to the highest options works well, and the 4000 series price-per-performance isn't worth it. A 5000 series with a Core i9 will be my future build.
Got the 3080 for MSRP at release, so I can't really justify the price hike. I am considering upgrading my 8700K to a 7950X3D though. Would be nice to see some benchmarks for the new Zen 4 with older/worse GPUs.
@@jaroslavmrazek5752 Was the uplift noticeable? 8700K shouldn't really bottleneck a 3080 so I have my doubts. Especially when I can get a nice new sim rig for the same price.
@@KinGzeDK Oh, the difference was huge, and yes, the 8700K, even overclocked, did bottleneck my 3080 a lot in newer games, especially open-world ones like Spiderman. My friend with a 9700K had some bottleneck issues too. The 7950X however is a beast, no issues at all.
@@jaroslavmrazek5752 I don't suppose you have any numbers? Chipguider tested 3080 fps in spiderman and with 8700K vs 7950X3D it is 92.6 vs 96.2 on ultra. With low settings it's 194.5 vs 202.0. Sadly they don't have 1% and 0.1% lows.
@@KinGzeDK Depends on resolution. At 4K, definitely not. At 1440p, maybe 10-20 percent. At 1080p it could be 30-50%. It's not worth upgrading the CPU, RAM, and mobo if you're not playing at 1080p most of the time, I think. At least wait until prices drop with the new AMD and Intel CPU generations.
Got a 4070 Ti for $713 open box at Micro Center, coming from a 3060 Ti. I feel good about the purchase. After a small undervolt it runs quieter and cooler with less power draw, and it's a huge performance uplift. The only question I have is: should I have gone with the 7900 XT...
I love how you guys always stick up for us the audience and regular guy/gal. I also hate the pricing. But, at least with the new cards we have options we didn't previously have. For new buyers, having the option is better than not having it at all. So, even though it does suck for sure, and it is illogical that we are not getting better value after two years, at least we can choose to say screw it, I'm paying for it, and this is especially true for the 4090. Me personally -- hell no. I'm not giving Nvidia or AMD my money. Sorry, not going to happen. I might buy a used or refurb card in the next year or two, but I'm not buying new. I would be happier to support Intel right now.
Great video as always... will keep my 3080 as long as I can....and since i still love my 1440p high refresh monitor, I think there'll be no real need for another GPU in the foreseeable future. ;)
"AMD and Nvidia are working together to beat gamers into submission" Thank you for saying that out loud. It's clearly what's going on here. I wish more tech reviewers would say that out loud, but they are too afraid of nvidia and amd/too stupid to see it. Thank you!
The difference is gonna keep on getting bigger as time goes on with newer games since 40 series will be prioritized with driver and game optimizations.
The 4070 Ti will probably be the card I end up buying; the 4070 non-Ti will most likely be too cut down. My 2070 isn't good enough anymore for 3440x1440 resolution (I've had it for 5 years at this point).
A lot of people are upgrading to the 4070 Ti. The 4070 will be too cut down, and most people upgrading are coming off GPUs like the 1080 or 2080 Super/Ti, for whom the 4070 Ti is a huge upgrade and worth the price jump, though it's a bit high. However, the 4070 Ti to 4080 jump is $400 and not worth it, so you made a good choice. Actually, I have a 1440p/165Hz monitor, had a GTX 1080, and went to a 4070 Ti, as the GTX 1080 wasn't cutting it anymore. I was getting 120 fps with bad 1% lows at 1440p/low settings, and I wasn't going to play at 1080p/60fps. Also, Bayonetta stuttered in some spots at 1440p with the GTX 1080; with the 4070 Ti, no issues.
@@Dempig I had a GTX 1080 at 4K for a while and didn't have problems, and that was an 8GB card, but I don't play every game at ultra with RT. Also, with a 3080 I'd still have buyer's remorse. And used? DON'T trust used GPUs, as people leave their games on for hours, mine on them, and don't clean their PCs. Learn to not play with every single setting maxed out and the 4070 Ti will last for years.
@@DETERMINOLOGY Why spend $800 on a card that you have to lower settings on and HAVE to use DLSS with? It's a terrible card. TBH, any card with less than 16GB of VRAM is not worth it. Might as well wait 6-12 months for a much better card.
@@Dempig LOL, not trying to wait. I got the 4070 Ti, no problems. Y'all wanna play super maxed out with RT on at 1440p with that card; even the 4080 will choke at 4K with high fps, so that's dead as well and we wait again. Learn to use settings wisely and your GPU will last. I had a GTX 1080 at 4K for 6 years and really didn't have many problems, but I wasn't trying to max out everything; no need. The people who complain are the elites trying to max out super settings. Get a 4090 if that's the case.