@@Jwalker76 This is actually true. When you look at games like Cyberpunk with its kinda low texture quality, game devs also have to consider RT and frame gen, which also eat up VRAM.
Yeah, it depends on what you play, and games costing over $70 means I'm playing far fewer on release. Besides, if you need a $500-800 card to run 1080p, that's on the devs for not optimizing it.
@@ANIManiak89 I'm running a 5700 XT and I've never had any issues with FSR, although I tend to play RPGs, and the shooters I do play aren't that GPU-bound.
I used to run a GT 735😂. I upgraded to a 3060 Ti and it felt indescribable; I felt very powerful being able to play at native res on mostly high to ultra settings😂😂
3 generations means from a 20-series to a 50-series. You've just given an example of a 5-generation jump, Zach, stop exaggerating. *I did originally say 16 series to 40 series but the 16 series is odd and doesn't fit with most other generations of graphics cards.
The 16 series came out alongside the 20 series, meaning what you described is a 2-generation jump. So you're right that he's wrong, just not by as much as you calculated, but his point still stands.
Wtf are you talking about? The 16 series is based on the Turing architecture; it's literally the budget-tier, non-ray-tracing lineup of the 20 series. Heck, the 16 series launched AFTER most of the non-Super 20 series cards.
@@superpork1superboy771 OK, tbh the 16 series was a bad example because it doesn't really fit, but something like the 20 series to the 50 series is a much better example.
I think it's worth mentioning that we've started to see diminishing gains between generations, and that using a 1070 for comparison ignores this point.
Going through all the Nvidia 70-class cards starting from the 4070: we got a 30% increase over the 3070, then 15% over the 2070, followed by a 23% increase over the 1070, then a whopping 91% over the 970. An 870 doesn't exist on UserBenchmark, but the 970 is only a 44% increase over the 770. So the huge gap only applies going from the 900 to the 1000 series. Now, this is only when talking gaming-related specs.
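For what it's worth, compounding those percentages shows the cumulative picture. This is just a quick sketch; the per-generation numbers are the ones quoted in this comment (UserBenchmark-style gaming averages), not independently verified.

```python
# Compound the generation-on-generation gains quoted above.
# Percentages are the ones cited in this comment, not verified figures.
gains = [
    ("4070 vs 3070", 0.30),
    ("3070 vs 2070", 0.15),
    ("2070 vs 1070", 0.23),
    ("1070 vs 970", 0.91),
    ("970 vs 770", 0.44),
]

total = 1.0
for label, g in gains:
    total *= 1.0 + g
    print(f"{label}: +{g:.0%}  ->  4070 is ~{total - 1:.0%} ahead overall")
```

Chained together, that puts a 4070 roughly 250% ahead of a 970 and roughly 400% ahead of a 770, which is exactly why the 900-to-1000 jump dominates the totals.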
@@TeKett YouTubers trying to generate views have started being extremely unrealistic about what's required to play games, as well as about costs: comparing the largest gains of any generation to now, and the lowest prices of any generation to now. The last 4 years have been pretty normal in terms of generational improvements, and while costs have gone up, so has everything due to inflation. Prices will probably never go back to pre-2019 levels, and not because of AMD or Nvidia or Intel.
@@yamo511 People have also forgotten that even the 4090's price is very much in line with ultra-enthusiast-tier offerings from the past. Many cards were run in pairs or even triples/quads at the top end, so while a GTX 580 may have been $500, a top-end rig often had $1000-2000 of GPUs in it. And we did have cards like the $1500 R9 295X2; the ASUS ROG 5870X2 was $1000 back in 2010, and Titan cards have been $1000+ basically since they came out (the 90 series is the new branding for Titan). Cost on the low end has gone up, sure, but we're talking about going from $200-300 to $300-500, which is largely down to inflation and increased shipping costs.
Upscaling tech like FSR and DLSS has definitely extended the life of older cards. I personally run a 5700 XT (a 5-year-old card) and play AAA titles at 1440p just fine. You also have to remember that most people still play at 1080p.
Same here, running a PowerColor 5700 XT undervolted, and I recently made my first attempt at reapplying thermal paste on the GPU. I don't get the eye-watering 4K 120fps reflection screenshots, but it's good enough for me at 1080p.
Especially with FSR 3 just ramping up. That's gonna be killer, even on Nvidia GPUs that support DLSS, since the 2000 and 3000 series don't get frame gen. I plan on using my used 3080 at 4K for as long as possible, and when it can no longer run 4K even with DLSS or FSR 3, I'll drop it to 1800p and then again to 1620p, both of which still look perfectly fine to me. So long as it can manage above 45 FPS I'm happy. I played Cyberpunk at 1620p with DLSS Ultra Performance and path tracing (just 'cause path tracing looks so damn good) and barely noticed visual issues after 10 minutes of playing, besides the very distracting shimmering on hair and thin lines. Still looked great though.
Hey hey hey, I'm running that beautiful 1070 today still. I absolutely love that soldier. I've pushed far more out of it than I should have been able to get away with. Maybe I won the PCB lottery, but my 1070 is a beast!
I'm rocking a 3060 12GB and a Ryzen 7 5800X. They've done me well and I don't plan on upgrading whatsoever until a part breaks or they're rendered obsolete.
Same but with a 5700X. Until now no game has been unplayable; it's been more about whether I could run it at high frame rates or resolutions with max settings. I'd only consider an upgrade if Monster Hunter Wilds turns out to be too hard to run at high frame rates at 1080p (I have a 1080p 144Hz monitor).
I'm on a 1080 Ti and an i7 8700K. I'm running Cities: Skylines 2 at just under 30fps (200k population), and Baldur's Gate at 60-90fps. And more competitive games like Valorant and Fortnite are smooth.
I had a 1080 Ti Founders Edition with an i7 7600K. It let me play any game I wanted, and if it weren't for the fact that my current work-related stuff requires an RTX card, I'd still be rocking it. But a few days ago I upgraded to an i7 12700KF and a 4070 Ti Super. Won't be upgrading for years.
I was on the exact same setup until a month ago, when I bought an i5 14600K and an RTX 4070 Ti. The only reason I upgraded was because of 2-3 new games that were struggling (Enshrouded, Star Citizen, and I can't remember the other one). Everything else was solid; I still think I should have held out to see the 5000 series GPUs and 15th gen CPUs before upgrading.
If the bastards at the triple-A game companies stopped making games heavier than they should be with the excuse of wanting good graphics, then maybe we wouldn't have to upgrade our shit annually😮💨
Or, or, consider this, literally tens of millions of people DO buy new hardware and want games to take advantage of it. Cyberpunk 2077 has sold over 25 million copies.
More like, if the AAA companies stopped loading their games with known malware like Denuvo, we wouldn't have to upgrade our shit. Too many times have I bought a game, found it didn't run correctly, then had to wait for crackers to remove that shit just to make the game functional... then again, this leads to piracy talk, so I'll just wander away now.
@@PineyJustice And Cyberpunk runs just fine on older hardware... well, for many it didn't run properly at launch regardless of hardware, but it got better. Then you have games that can't handle Intel/Nvidia's top performers causing issues lately...
@@PineyJustice Triple-A companies love people like you, because they know they can get away with anything that doesn't make sense. Also, of course Cyberpunk would sell that much after they pretty much fucked themselves up multiple times and are now trying to redeem themselves. You know what makes a game more popular? Being small in size but packed with content. Fuck graphics in general; I don't need the game to have 4K 60fps realistic backshot images in it if the story line and gameplay are dog shit.
The nice thing is, at least a 1070 or 1080 still works fine for 1080p gaming. When those cards came out, trying to game on stuff that was 6 years old was essentially impossible. Sure, a 1080 isn't the most elegant or effective solution now, but it's still viable.
About the same. I went from a 1080 Ti to a 3080 Ti. Best decision ever, because I game at 1440p now and the 1080 Ti couldn't do it. And things are so smooth and flawless on the 3000 series.
I think that if you plan your builds around the consoles of that generation, you can definitely have them last 6+ years. I had a GTX 980 from 2014 all the way until 2022; my build was based around being faster than the current consoles so that I could still play the latest games at good settings.
I get where you're coming from, but the Nvidia 10 series cards had a much longer lifespan than other GPU generations. I'm currently rocking an AMD 5700 XT and I don't plan on upgrading until the next generation of AMD. That being said, I'm also going to be upgrading my 5600G to the newest AM5 platform.
@Brain ... Spot on. My 1080 Ti was quite a performer, and I upgraded during Covid to a 3080 Ti. And I could tell the GTX card wasn't doing as well at 1440p.
I have a GTX 1070 and I get 60+ fps in Cyberpunk 2077 on high settings, so it's definitely holding in there so far. But in my next build I plan on getting either a 7600 XT, an A770, or a 4060 Ti.
@@axelfernandez8964 He's definitely not lying. I have a GTX 1070 and an i5 8400, and I get around 60 fps playing at 1080p with high/ultra settings (using FSR 2.1, obviously).
Very reasonable if you start with high-end gear. I got my overspecced 3930K when the common wisdom was a fast dual/quad core and 8-16 gigs of RAM. Still a weapon. Wouldn't expect 10 years from a video card though; they still seem to keep squeezing out decent generational performance gains.
I went from a 3080 / 11700K / 32 gigs of DDR4 build and a 3080 Ti / 5900X / 32 gigs of DDR4 build to the PC in my profile picture: my first all-white high-end build with a Gigabyte Aero OC 4090, 7800X3D, and 64 gigs of DDR5. The upgrade is huge, and I'm looking forward to the 5090, the next X3D chips, the next motherboards, and the next RAM. I can't wait to see how big the performance jump will be.
@@hiddenguy67 If you can't muster up a few thousand dollars for leisure over the course of 5 years, you're either a child, unemployed, or living paycheck to paycheck.
I have a 1660 Ti in a laptop. It's showing its age, but it can still play most games well enough. If you really, really need/want high-fps 4K graphics and can afford it, cool; not sure that's most people though.
The thing is that we're almost at the end of hardware scaling, since it's physically impossible, or at least very inefficient, to make transistors much smaller than we already have. Software optimisation is gonna be the big thing in a few years.
I used my 1660 basically from release; I got it right before the pandemic, in December 2019, and that card ran like a BEAST. I just replaced it after over 4 years of use.
We can definitely skip 3 generations of CPUs and 2 generations of GPUs if you play at 1080p and want 60fps+. I got an R9 380 and it lasted me until 2021; I got a 6700 XT and it's still going strong, and I probably won't upgrade to the RX 8000 series. The i7 2600K lasted from 2011 to 2019 no problem. The Ryzen 5 1600 still gets you 60fps+ 7 years later.
Bought a prebuilt with a 2080Super in the Spring of 2020... I hadn't bothered to check what parts were good, and had been hearing about the new Super cards at the time, then the 30 series stuff dropped like 6 months later. Not to mention my PC had a 9th gen Intel, right before 10th gen dropped... Needless to say, I'm sticking to my research, and I'm hoping that the new stuff is worth upgrading to.
I've owned a 1080 Ti since it launched, and at 1080p it has always served me well. Now with the 5000 series I'll change to another top graphics card, maybe a 5090.
Had a GTX 1060 since 2018, switched to an RX 6800 a couple of months ago. Also had to switch motherboard, CPU and RAM, so yeah, quite the difference. I intend to keep this upgrade just as long as my old components. I got the PC itself in 2015, and only the GPU had a small upgrade along the way, from a GTX 660 to the 1060. So I'll try to keep the new components another 9 years.
Every 2 to 3 generations I'd say. It depends on whether you're always playing the latest great games and if you're ok or not with low to medium settings at 60fps.
The way I look at it, whether you need to upgrade depends on what you play. If you're playing modern games nowadays, you most likely will have to upgrade. If you play non-AAA titles, you most likely don't. I still have a 3060 Ti and couldn't be happier with my performance at 1080p.
If people are constantly playing AAA games that are super demanding, and they have the $$$ to drop on $60 and $70 games, then they have the money and the need to upgrade more often. People who play less demanding games don't need to upgrade for a long time. I'm using a 1060 and I get 60-80 fps on average in the games I play. Sometimes I need to drop the quality a bit, but that's fine for me. I don't have the $$$ to go grab a 4070, and I believe a 4060 or lower isn't worth the money.
As much as I enjoy my 2080, I can see it starting to show its age in some demanding titles. It seems the time for an upgrade is getting closer, especially now that I have a 4K 144Hz panel.
Lol, 4K was never really playable until the 30 series, imo. If you can't push a high refresh rate monitor with a brand new $700 card, it's not worth it, so I never went above 1080p until the 30 series. And even then, my friend with a 3080 sold their 4K monitor and dropped to 1440p for better performance and frames, so yeah, nah.
Still running my GTX 1060 6GB. Looking to upgrade in the next year or so, but I haven't run into a game I couldn't play at 1080p above 60fps yet. Best card I've ever bought!
I'm still rocking a GTX 1080 for 1440p. I don't have time to play many games, so it's hard to justify the cost of a new 1440p GPU, but FSR and lowering some settings help keep my card running newer games like Baldur's Gate at good framerates.
There's also the issue of the price of some of these upgrades that draw more power and, in turn, kill themselves faster due to the sheer amount of heat they put off. Basically, if something currently works, don't fix it, because your next upgrade might not last.
IMO it also depends on the kind of game you're playing! I'm not really interested in the latest games like Cyberpunk; my games are GTA V, Minecraft and BeamNG, so my GTX 1050 is more than enough. I'll probably change my setup in a year, when GTA 6 comes out.
I built my first PC two years ago and have upgraded my GPU 4 times. Luckily, I was able to use EVGA Step Up for one and found buyers for the two others.
I don't think I'll have to upgrade mine much anymore. I don't think I'll ever go past the 5090 because yeah sure it might not be "good" anymore in the future but I quite admire the quality that's out now.
My 8700k and 1070 ti have literally never let me down. I’m replacing this year because I think it’s time. But I still play all my favorite games without any issues. On high settings.
Every 3-4 generations is perfectly fine. My 1060 is still chugging along, but I did order a 7900xtx since I’ve upgraded all the other parts of my computer.
Yeah, I have this problem. I went from a GTX 1070 to an RTX 2080 Ti last summer. Hopefully it'll last a while, but it's essentially a ray-traced 1080 Ti with some more beef, not by much.
I got a Strix 4090 OC LC 9 months ago in a major all-new rig upgrade, after using the same high-end desktop I'd built 13 years ago. I'm hoping this new i9 13900KS, 64GB high-end desktop lasts just as long or longer. I went from gaming at 1080p on a 17" screen to a 32" 4K, and I'm "happy happy happy!!"