
Proof that the RTX 2080 can Handle 2K & 4K Gaming Just Fine 

The Long Long Mann
Subscribe 323
2.3K views

• Matt Miller PCs is (Po...
This is a follow-up to the video linked above, for people who want the context for why I needed to make it. This should be more than enough to show that the RTX 2080 can absolutely handle 2K and 4K gaming at mid/high settings just fine. I know full well that the usual crowd is going to say that '60 FPS isn't high enough', 'You didn't play it with ray tracing so it's not really proving anything', and 'the games you chose aren't good enough examples of modern games', even though that's not the point.
Games shown, in order: Modern Warfare III, Baldur's Gate 3, Elden Ring, and Cyberpunk 2077.
All games are shown running at 2560x1440 (commonly referred to as 2K) with a 60FPS cap. The only exception is Cyberpunk 2077, which is also shown running at 4K (3840x2160) with the cap still set to 60; there it holds a stable 30.
CPU - Ryzen 7 5800
GPU - RTX 2080
RAM - 32 GB DDR4 @ 3200MHz
I consider this to be effectively more than enough proof of the capabilities of the RTX 2080. As I said, I know this isn't going to be enough for the people who just refuse to accept that anything but what they say is correct. I will not be making a follow-up with more games and the settings I play them at. I had considered titles like Dragon's Dogma II, Helldivers 2, Forza Horizon 5, and Resident Evil 4 Remake, but I wanted to keep the video to roughly 15 minutes, as I do tend to ramble. Suffice it to say, though, that all those titles have very similar settings and run at 2K 60FPS just like the 4 games I've shown here.

Games

Published: 1 Jul 2024

Comments: 177
@NathanaelHunter 20 days ago
I SWEAR TO FUCK STOP BRINGING UP FRAME RATES. YOU ARE NOT SPECIAL. THE HUMAN EYE FUNCTIONALLY CANNOT SEE OVER 60 FPS. CUT THE FUCKING BULLSHIT YOU WHINY ARROGANT FUCKING OXYGEN WASTES. JUST FUCK THE HELL OFF. I'M TIRED OF YOUR REGURGITATION OF DECADES-OLD BULLSHIT. THIS HAS BEEN HASHED OUT OVER AND OVER AND THE REALITY IS THAT YOU'RE FUCKING WRONG, SO JUST FUCKING CLOSE YOUR FUCKING COCK SHEATH MOUTHS AND LEAVE. I DO NOT WANT TO HEAR THIS FUCKING DISGRACEFUL BULLSHIT FROM YOU PEOPLE ANYMORE. GODS FUCKING DAMN. ADDITIONAL FRAMES BEYOND 60-90FPS ARE VALUELESS INFORMATION TO THE HUMAN EYE. ACCEPT THAT OR DON'T, THAT'S REALITY, AND YOU ARE NOT SOME ASTOUNDING UBERMENSCH SPECIMEN THAT CAN WORK IN THE TENTH-OF-A-SECOND INPUT RESPONSE TIMES HIGHER FRAME RATES GRANT.

This will be the last I say on the subject. Showing more games and settings doesn't do anything more to prove or disprove what the RTX 2080 can do. The games I've gone with are all what can be considered modern games. They all came out within the lifespan of the current generation of consoles. If those are called modern consoles, then we should all be capable of agreeing that games on them are modern games. If you take issue with that for some reason, that's a you problem that YOU can go work out on your own.

To hit the main subjects one last time:

The latest and greatest hardware isn't needed for PC gaming. If you want to poor-shame me, just know that my Steam library is worth well over $20,000 and I own a Steam Deck, a desktop PC, and 2 laptops, one of them with an RTX 2070 Max-Q. I'm not 'too poor' to own a better GPU. I don't see any need to buy newer hardware. I will not get any serious value out of a 30 or 40 series GPU. If this is upsetting to hear, that's too bad. Cry about it.

A stable 60FPS is effectively the standard. If you're going to try and argue that the standard is higher, keep in mind that consoles currently still push for and struggle to meet 60FPS in some titles. It is considered the standard by almost all companies in gaming that 60FPS is what the expectation should be for the majority of players, while 120FPS is the expectation for players with higher-power hardware. Don't blame me for agreeing. I can easily get my games higher than 60FPS and intentionally do not. I leave VSYNC on because screen tearing is ugly. I don't care what you personally think the standard for FPS should be. As of right now, in 2024, the standard is 60FPS, with higher-power hardware expected to do a stable 120FPS.

Anisotropic Filtering DOES impact performance if you have the settings cranked up. 16X will have a noticeable frame rate impact for games running at high/ultra settings. I consider 5-10 FPS to be a noticeable difference, which is why I lock my games to a 60 maximum; I'm fine with 57-60FPS and consider that to be 60FPS. A dip of 1 or 2 FPS below 60 for a split second every now and then is fairly unnoticeable, but a constant state of being 5-10 FPS lower because you have the settings turned up and the FPS uncapped with AF set to 16X will absolutely be noticed.

The RTX 2080 is roughly on par with an RTX 3070. It's slightly weaker, but not by much. 8GB of VRAM is still more than enough for modern games and, as is put on display, is more than enough for half-decent 4K 60FPS gaming too (if I had lowered the Cyberpunk 2077 settings rather than stick to medium, I could definitely have gotten 60FPS, and this is even easier to imagine with other games).

DDR4 RAM is more than fine for gaming. DDR5 is better, but it's far from needed. 3200MHz is more than enough for gaming; you really do not need faster than that. It's perfectly fine RAM.

Showcases and benchmarks are not the same thing. A benchmark needs to push the hardware to what it is capable of, getting the most performance possible out of it in regards to settings and framerate. A showcase highlights aspects of a product and may be targeting a specific point, be it higher settings or FPS. These are different things. They are distinct and need to be treated as such. Matt Miller PCs' video was a showcase and not a benchmark. He did not at any point do what is considered standard practice for a benchmark. He intentionally made the games run at worse graphics settings to max out the possible FPS. That is not a benchmark; that is a showcase.

Saying that I 'don't understand hardware' because I use very simplistic statements and terms and gloss over fine details does not make that statement true. Nitpicking the small things that I ignore or simplify (sometimes, I will admit, too much) does not mean that everything I say is wrong. The overarching topic is whether the RTX 2080 is good or bad and whether Matt Miller PCs was doing a fair showcase of the capacity the GPU can handle. I do not own or have any free benchmarking software on my computer, but Steam has a built-in FPS counter that is visible in the top left corner during all of my gameplay. The main point I was making is that the RTX 2080 can play modern titles with decently high graphics settings at a stable 60FPS.
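
To put the frame-time arithmetic behind the '57-60 counts as 60' judgment in concrete numbers, here is a minimal Python sketch (the FPS values are illustrative, not measurements from the video):

```python
# Frame time in milliseconds for a given frame rate: 1000 ms / FPS.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 57, 60, 90, 120):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):5.2f} ms per frame")

# A dip from 60 to 57 FPS adds ~0.9 ms per frame (16.67 -> 17.54 ms), while a
# constant 50 FPS adds ~3.3 ms per frame -- one reason a sustained 5-10 FPS
# deficit is far easier to notice than a momentary 1-2 FPS dip.
```
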
@itsNightShine 20 days ago
We couldn't see your FPS counter in the last few games, but I will take your word for it. My bad, I was wrong. I didn't believe the 2080 could hit 60 FPS in most games, and in some cases you had to lower the settings to low or medium to achieve that. Props for keeping your word and showing proof. It would have been nice to see the FPS counter a bit bigger and in all the games.
@itsNightShine 20 days ago
What I will argue is that it was a fair showcase (not a benchmark, as I previously stated), because the GPU was always at 100% usage or close to it. He just decided to showcase the games on lower settings.
@NathanaelHunter 20 days ago
That's fair. I leave the Steam FPS counter on for my own personal use, and it is rather small and intentionally not visually invasive. I also have the advantage of it being severalfold larger for me than for likely anyone else, as I have a 50-inch TV sitting three feet away from my face. I don't know that I can agree I went and set any games to a Low setting; dropping specific settings to Low, yeah, I do that, but I would generally argue that the overall setting range I aim for at all times is mid/high or mid-range.

To avoid making 2 comments myself when I can hit both in one: in a way, you are not wrong, he did max out the GPU usage, and pushing the GPU to its possible limits is what a benchmark is for. However, for a benchmark, it's not just that it was maxed out; it has to not be intentionally targeting something. He was aiming to get the highest FPS possible rather than simply get the highest performance capacity from the GPU. In that he was lowering graphics and not raising them, this wasn't a benchmark, but a showcase. I commented in his video's comment section stating that aiming for a specific FPS range and preferring higher FPS over better graphics is fine, but it is deceptive and unfounded to refer to it as a benchmark. As I say a lot to people on this, if you're making a video that can potentially influence purchases, any form of sleight of hand or deception is unacceptable.

Anyways, I'm sorry I didn't enable the high-contrast FPS counter; the thought never even crossed my mind that it's hard to see. I usually game with the lights off very close to a large TV, so I'm rather blind to how these look on smaller devices. I'm glad that actually making the video showing the settings and performance in modern titles was enough to convince you that the RTX 2080 is more than capable of playing at 2K. While I didn't show it, and in Cyberpunk I do wonder how I'd pull it off, I can assure you it also manages 4K 60 with the settings lower. I explain it rather poorly, but the 2080 and current-gen consoles are from the same release period. They can both be expected to perform similarly, so I'm always a little shocked when people don't think the 2080 can handle 4K while the consoles can. The surrounding hardware as well as the settings need to be tweaked, but it can be done easily.
@ydkma 18 days ago
Uhh, the human eye can see over 500fps. Not only that, but having more frames and a monitor and GPU that can output more than 60 has a huuuuge impact on response times.
@itsNightShine 18 days ago
@@ydkma That is not what he is arguing. Yes, higher fps is better, but he is saying that 60 fps is considered a "standard". He is probably targeting 60 because that is the point where the experience is perfectly enjoyable and it's a good compromise between visuals and performance.
@Vladimir-nc9ru 19 days ago
Of course it can handle 2K 60! The 2080 is basically a 3060, which is basically a PS5.
@NathanaelHunter 19 days ago
A solid 100+ comments in, and Matt Miller PCs was so adamant this isn't true that he threatened me over it. I didn't think it was necessary to make this video, and yet I was all but forced to.
@kush2023 20 days ago
bro you were saying you can run AAA games at 4k 60fps and now you're switching up saying 2k 60fps with medium settings LMAOOO sad af
@NathanaelHunter 20 days ago
1. Why did you comment this 3 times? I hope it was an error and not you just being that much of a mong, though given you type like a toddler and have the memory of a half-dead goldfish, I can't rule out anything. Don't worry, I've deleted the extras so you don't need to feel embarrassed about it. I took a picture though, just in case you try and start saying I delete comments or some other nonsense. Knowing how you act, I can't very well leave that possibility unconsidered.
2. I said the RTX 2080 can handle MODERN games at 2K and 4K 60; just because you decide to believe I said something doesn't mean I actually said it. I never at any point even said AAA. I said modern, not AAA. So if you're going to claim I said something, the least you can do is actually get what I said correct. That's the worst attempt at a gotcha I've ever seen.
@kush2023 20 days ago
@@NathanaelHunter lmao i replied to you saying that you cannot run any AAA games at 4k 60fps with a 2080, and you replied saying something like "I have first hand experience and you absolutely can". go back and look for yourself bud. and i didnt comment it 4 times, for me it must be a glitch or sum
@peik_haikyuu2265 20 days ago
You are actively running dynamic resolution while saying you can run 1440p🤣 No, you can run 1080p upscaled to 1440p.
@NathanaelHunter 20 days ago
And there's the moronic 'But Dynamic was turned on in CoD' that I explicitly addressed. You obviously didn't listen, or are intentionally lying about what I said. Dynamic Resolution does not kick in until the GPU is under maximum load, something that does not happen with the settings I use. I was well below the range for Dynamic to kick in, and the only way it could have is if I was running CoD twice, given the load was near half of the 8 GB. At that range, if you had been paying attention, I had explained that it was possible to increase the graphics settings on Modern Warfare III by a decent leap.

You also should have noticed that in all cases upscaling of any kind is disabled, so the arguably more invasive setting, an always-on-or-always-off function that could easily have made every game run much smoother, was never enabled. So, as I said, Dynamic WAS enabled in CoD, but it never kicked in, nor will it kick in unless the GPU is at capacity. The timestamp for what you're intentionally ignoring is 1:20. I say this in the video: the GPU is shown by CoD to not be anticipated to experience full load. That's what the Estimated VRAM Usage bar is for, looking at what settings are impacting the load on the GPU.
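
Nothing in this thread documents how MWIII's scaler is actually implemented, but the behaviour described, dynamic resolution staying dormant until the GPU is saturated, matches a common pattern: compare each frame's GPU time to a target budget and only lower the render scale when the budget is missed. A hedged sketch of that generic heuristic, with every name and threshold invented for illustration:

```python
# Illustrative dynamic-resolution heuristic (invented names and thresholds;
# not Call of Duty's actual implementation).
TARGET_FRAME_MS = 1000.0 / 60.0        # frame budget for a 60 FPS target
MIN_SCALE, MAX_SCALE = 0.5, 1.0        # allowed render-scale range

def update_render_scale(scale: float, gpu_frame_ms: float) -> float:
    """Drop the render scale only when the GPU misses its frame budget."""
    if gpu_frame_ms > TARGET_FRAME_MS:            # GPU saturated: scale down
        scale -= 0.05
    elif gpu_frame_ms < TARGET_FRAME_MS * 0.85:   # lots of headroom: recover
        scale += 0.05
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# A GPU that consistently finishes frames under budget never triggers the
# first branch, so the scale stays pinned at 1.0: the setting is enabled
# but never actually kicks in, which is the behaviour described above.
```
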
@NathanaelHunter 20 days ago
That's not even close to what was said, once again. This time though, you don't even have the benefit of the statement coming from a comment on a different video to hide behind. You're literally choosing to misrepresent a comment that's visible at all times to you when typing that one out. You're acting in the very definition of bad faith. Way to out yourself as entirely unreliable.
@randomgames9025 18 days ago
I have an RTX 4060, which is almost an equivalent. The problem is the 8 GB VRAM limit kicks in with modern games. I hope they fix that problem, because I want to play games at 2K.
@NathanaelHunter 18 days ago
Try setting your settings slightly lower than what I did and see if that does it for you. I would also suggest turning on DLSS to get the most out of it.
@randomgames9025 18 days ago
@NathanaelHunter The problem is you have to lower the textures in order to save some VRAM. Textures are the setting that most affects a game's visuals and appearance, and they cost no fps, only VRAM, so lowering them will make the game look so bad 😕
@Noa15Lv 18 days ago
My nephew is using my 2080S, which I'd had in my PC since 2020. He's playing on a 1080p monitor most of the time, but yeah, that GPU was running at 1440p resolution all the time. Even in Forza Horizon 4 I pushed it higher JUST for the pictures. Edit: Also, that GPU runs VR games pretty well, except for the "Low VRAM" error message, which popped up quite often without affecting VR games' fps.
@NathanaelHunter 18 days ago
I didn't even think about mentioning how it handles VR. I used to play so much of it, but as of late I've been too messed up from some injuries to play VR anymore. But man, the 2080 can really do VR decently.
@Noa15Lv 18 days ago
@@NathanaelHunter Yeap! It's a good GPU. Don't overreact over plebs on the internet, since you don't need the "latest n greatest" stuff to play video games. Tweak your settings and you'll be fine. Preset settings [low, mid, high, ultra], my arse... those can go diddle themselves.
@NathanaelHunter 18 days ago
They have definitely pushed me to the limits of my civility, I can't lie. The fact that they think there's a massive leap in fidelity between mid and ultra in modern games is astounding. These titles have such high-detail textures that at a mid-range setting there's not all that much of a visual bump going to high or ultra, and the performance hit is massive. I sometimes believe there is an intentional push by GPU makers to have those settings in there only to justify the existence of these newer GPUs. It's not like these games are PC exclusives; they're ports of the console versions half the time anyway, so there's basically no major difference.
@skinball 18 days ago
I use my 2080 Super to play games at 1080p max, but it is nice knowing I can do 1440p and 4K. Good vid man
@NathanaelHunter 17 days ago
It can, if you 1. have a display with the right resolution capability and 2. use the right settings. I suggest using DLSS if you want to get more FPS, as it's going to do wonders. Enjoy!
@mmremugamesmm 18 days ago
Just a question. I understand saying that resolution scaling is not needed if the card doesn't hit 100% usage, so why have it enabled, when it just leaves the door open for debate? Also, I stumbled onto this video; I am not familiar with you or the guy you're talking about. I had an RTX 2080 for about 4 years and just adjusted settings to get what I was happy with.
@NathanaelHunter 18 days ago
Great question, and it's one that doesn't have a particularly smart answer. I use my desktop in a VERY abnormal way, in that I have dozens of Chrome tabs open while I'm gaming. Videos, technical manuals, Remote Desktop sessions, the works. That's all very intensive, so while the game (in this case MWIII) will never draw on that setting from its own usage, it can sometimes be needed when I'm not recording and haven't intentionally closed all the extra stuff that's open. So while I can understand the confusion, it's something that I do have a reason for, and the setting itself could very well have been left off. I just prefer to give an honest statement of what my settings are, and that one is something I leave on; as the bar showed, it won't kick in from the game itself, only from an external program using VRAM.
@mmremugamesmm 18 days ago
@@NathanaelHunter So the answer is it's just how you do things. Well, when you're making a video debating what a video card can or can't do (just my opinion), I would simply remove all potential performance boosts; it just helps you prove your point better. Save yourself the headache. Have a good 4th.
@NathanaelHunter 18 days ago
I can agree: anything that could boost performance is a good idea to leave off in a display of potential maximized output, and you'll notice I left DLSS off, and the only instance of anything that could alter the render resolution was in MWIII, where it wasn't capable of kicking in anyway. Like I said, I had no intention of hiding anything, so I made absolutely no changes to the settings I use day to day with these titles.
@atomicelements1494 19 days ago
Been running a 2070S for 4 or 5 years, and until very recently it has been just fine for playing 95% of games at 4K@60fps.
@NathanaelHunter 19 days ago
It's shocking how many people either don't accept that 60FPS is the standard for games or simply don't want to believe that older cards can play games at 2K 60, let alone 4K 60. I expect you'll probably see that GPU be just fine until around 2028, if not later, depending on what games you tend to play.
@BarManFesteiro 18 days ago
the last 2080 super soldier
@NathanaelHunter 18 days ago
I can't tell if you're mocking me or if you think I'm championing the 2080. Either way, don't mistake my saying the 2080 is no slouch for an endorsement. I think people should do their own research and pick the hardware that does exactly what they need it to do, with some room for future titles to run well, and buy that, rather than blindly buying the latest tech simply because it's available. I'm only highlighting the 2080 because it performs up to snuff from my perspective and is on par with modern consoles. If somebody has a personal taste for higher quality graphics or framerates, then I more than encourage buying a stronger GPU. I do not, however, condone acting like a fool and simply shouting that anything old is bad and buying it is wrong, or worse, attempting to misrepresent older hardware with deceptive 'benchmarks'.
@Mrakantor6 18 days ago
I've had a 2080, a 3080, and a 4090, and I can assure you the 2080 can handle everything at 2K/4K, mid/high settings, @60fps, and you can probably push it to 90-100 FPS with DLSS or FSR. As for the video, you should consider providing at least some overlay during gameplay with Max, Min, Avg, and 1% Low FPS when claiming any statement, otherwise it's pointless! Setting up MSI Afterburner for that isn't a lot of work!
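
For anyone unfamiliar with the numbers being requested here, they all fall out of a per-frame time log; a minimal Python sketch of one common way to compute them (the sample data is made up):

```python
# Max / Min / Avg / 1% Low FPS from a log of frame times in milliseconds.
# "1% low" here uses one common definition: the average FPS of the slowest
# 1% of frames. The sample data below is fabricated for illustration.
frame_times_ms = [16.7] * 990 + [25.0] * 10   # mostly ~60 FPS, a few stutters

fps = sorted(1000.0 / t for t in frame_times_ms)     # per-frame FPS, ascending
slowest_1pct = fps[: max(1, len(fps) // 100)]        # the worst 1% of frames

print(f"Max: {max(fps):.1f}  Min: {min(fps):.1f}  "
      f"Avg: {sum(fps) / len(fps):.1f}  "
      f"1% Low: {sum(slowest_1pct) / len(slowest_1pct):.1f}")
```
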
@NathanaelHunter 18 days ago
I swear the overlay is there in my raw footage. My point was just the framerate; I'm not interested in the min-maxing and temperatures, and I'm sure if I told anyone what the cooling situation is on that computer they'd fucking lose it. Anyways, I don't know what happened, but in the raw footage there is an FPS counter from Steam visible in the top left, while in the version I uploaded, for whatever reason, it's only visible during the Baldur's Gate 3 section, and I do not know what happened.

But yeah, the 2080 could get more FPS if I tweaked the settings even more, and I could have put in the effort to do something more like a proper benchmark, but I aimed to show EXACTLY what my games look like and the settings I run them at. I didn't take any time to optimize; I just showed them as they are. I wanted to get the point across that these settings are what I use, not that they're the highest they can be pushed. I could probably easily get better graphics or a higher framerate with some tinkering, but I don't really feel the need.
@angeryanimal398 19 days ago
A lot of y'all don't understand the point of this video. He is just highlighting that with a little bit of brain power and smart budgeting, you can hit graphical goals as well as performance goals. Games these days shove so much unnecessary detail into stuff that it ruins your performance while having a negligible effect on visual quality. You DON'T need the most super high-end GPU to enjoy these games at decent graphical settings. I've also noticed in my 200+ hours playing Cyberpunk that there is literally not much difference between medium and ultra settings in terms of visual quality, and there is a massive difference in terms of performance. Things like cloud quality and distant shadow resolution make the slightest change to the graphical fidelity of that game, but once you turn them down you get a lot more performance. Even things like screen space reflections: you have to REALLY take a look to see what difference there is between psycho and high/medium settings, but yet again, changing that option makes a huuuuge performance difference.

You can play at 2K with this graphics card just fine, especially with its DLSS capabilities. Remember, the higher-resolution monitor you use, the better the upscaled image quality will be, and even then you can still probably squeeze out a lot of performance by tuning the settings in a smart manner. Now, 4K60 is debatable, especially in really demanding, unoptimized titles like Starfield and The Last of Us, but that's the reality for like 90% of modern-day GPUs; no matter how good your hardware is, you won't run unoptimized games like that well. Anyways, sorry for the rant, just my two cents.
@NathanaelHunter 19 days ago
That is one interpretation of what I was showing, though it's not explicitly what I was trying to convey. I do appreciate that you can understand the idiocy of hardware elitism though; older cards don't necessarily have to be thrown away or replaced just because they aren't the newest tech, just like a phone. You are absolutely correct: with a smart understanding of settings in games and knowing what games you want to play, anyone can make informed decisions and buy the right hardware for what they are seeking to play.

Modern western AAA games are far too focused on looking pretty and not enough on being optimized. Poor PC ports are what make the elitists so emboldened, not that older hardware is actually bad. The RTX line has access to DLSS, and in none of my games do I even turn it on; if I did, the titles would run even better, at higher FPS and graphics settings. Modern western AAA games are why people don't think older hardware is good enough; they wrongly think that bad ports run poorly on older hardware because the hardware isn't capable, when these games are simply just poorly made. The 2080 is on par with the current generation of consoles, and these same titles are on console and PC, and yet, for no reason, people claim the 2080 and other older cards aren't good enough.

If more people would simply choose to be informed consumers rather than sheep who believe whatever nonsense the social media creators they follow say, they would likely have far better lives for it. They would spend less and get exactly the right hardware for their needs. Unfortunately, voices like yours are too few and far between in online spaces around PC hardware.
@angeryanimal398 18 days ago
@@NathanaelHunter You are incredibly correct about hardware being much more powerful than we realize. I saw someone the other day running a custom version of Blender they coded themselves on a flip phone. A FLIP PHONE. That made me realize how some hardware is much more powerful than I ever thought. I mean, we sent people to the moon with only 64kbs of RAM, so my question is why can't my computer run Starfield at 1080p high settings at 60fps without upscaling? Well, the answer is obvious: lazy development and evil corporations trying to get us to buy the "latest and greatest". Honestly, the hardware we have now is more powerful than people in the 90s could have ever imagined. At this point I'm trying to find other ways I can use this stuff besides games to demonstrate how legitimately performant today's hardware is. Bottom line is we need to wake up as consumers and fight for our rights.
@angeryanimal398 18 days ago
@@NathanaelHunter I also don't understand why people are hating on you for trying to use your hardware longer. I mean, I have other things I wanna spend my money on; I have hobbies, friends, family. Why would I want my 3-4 year old hardware to be obsolete, especially with how expensive things are these days? I used to game on a 4690K and GTX 980 for around 6-7 years, until it became apparent to me that I couldn't really enjoy new AAA games anymore, and even when I bought my PC back then, the hardware was a few years old. Why should I have to change my GPU every few years even though it's still incredibly capable? Because of lazy development? Because of greedy corporations? Fuck that... buying new expensive hardware to play games you should've otherwise been able to play just sounds like putting a band-aid on the problem.
@NathanaelHunter 18 days ago
Oh, that's an easy one: it's known as buyer's remorse. It is most potent in people who make very expensive but less than rational purchases. It's most notable with things like cars, but computer hardware draws this response out as well. There are a few ways it can manifest. In the case of the people you're seeing here, they bought overpriced GPUs, and instead of being satisfied with them running games better, they expect an exclusive walled garden where games are made FOR them and anyone with hardware that isn't what they bought is prevented from access. Another way is that one may end up attempting to justify the purchase by downplaying alternatives, something you will also see here often: saying that while what you have isn't bad, it's not as good, and thus it's bad by comparison. Another form we can see here is denial that what they bought may not have been necessary, as every now and then some comments outright deny that other options perform well. These are all different ways of expressing buyer's remorse, and they're all a form of internalized regret.

I've said it often, but hardware elitism is disgusting, and the people who most partake in it usually aren't even getting genuine value out of what they have. What's more, there's the very real likelihood that they are simply attempting to brag, though in recent times this is less likely, as 30 and 40 series GPUs aren't nearly as hard to find as they were about 2 years ago. One of the things they usually do is say that you're only using old hardware because you're too poor to buy the latest tech, and in some cases this could be correct, but it's not really an argument rooted in anything other than that buyer's remorse. It's something I see spouted often, and it makes me laugh when people use it to shame me for still having a 2080, because I've spent tens of thousands of dollars on my Steam library, so it's literally a case of them being ill-informed in a way that makes them look like salty children who wasted their money.
@PineyJustice 18 days ago
@@NathanaelHunter I don't think anyone with a current-gen midrange or better GPU is regretting anything lol. My 7900 XTX is pulling more than triple your fps at 1440p in Cyberpunk with the settings cranked, as it's been doing for nearly 2 years now. An HD 7970 from 12+ years ago will give a pretty decent experience in Cyberpunk with the right settings; once you start turning everything down, your whole point goes out the window.
@bankruptjester1 18 days ago
I had a 2080 till I upgraded my CPU, then my GPU. With an upgraded CPU the 2080 runs 3440x1440 pretty well; it ran GTA at 130fps and CoD MW3 at 120fps. Then I got a 4070, but the 2080 is an amazing card still. A lil slow, you just need to run some stuff on medium. I'm gonna be using it in a mini PC for travel. Best card I've ever owned for durability: newer games running at a bit too high a resolution, and it still runs well. I just switched it out, and it would probably last me another 2-3 years easily.
@NathanaelHunter 17 days ago
I personally use a gaming laptop and a Steam Deck for travel, but I do hear good things from people who use mini PCs. I've dabbled with the Steam streaming functionality, and I must say, if you have the internet bandwidth and don't mind the minor problems of streamed games, it's fantastic.
@kepler_45 18 days ago
I play at 4k with my 3060 just fine
@mmremugamesmm 18 days ago
Also, good video. I like tech talk, and I have always said: put your settings at a realistic level and enjoy the games.
@NathanaelHunter 18 days ago
Glad you liked that. People really should just buy the hardware that meets their needs for the games they play and set things up in a way that maximizes both performance and visual fidelity.
@mmremugamesmm 18 days ago
@@NathanaelHunter absolutely correct
@DeathStriker88 18 days ago
Finally, it feels really good to meet another person who has the same mindset as me. Doki Doki Literature Club, Super Mecha Champions, The World Next Door, Limbo, Inside, Little Nightmares, Trials Fusion, Trials Rising, and a few other games are what I love or want to play. I don't see any valid point that proves these games need an RTX 4090 just to open. All of these games can easily be played with a 1650, RX570, RX580, or any budget or super-ultra-budget card. The latest and greatest products are just a waste of money. Try to save your money and later invest it somewhere trustworthy and reliable, so that you'll earn even more than what you invested. But who's gonna listen to me? I am glad at least you understand that it's nothing but money wastage.
@NathanaelHunter 18 days ago
If only more people could understand that. For the price of some of these newer cards, people could buy 2 or more older GPUs and run a dual-card setup instead and outperform any single-card setup on the market.
@DeathStriker88 18 days ago
@@NathanaelHunter Undoubtedly. In my country a decent car can be bought, directly from the showroom at that, for the price of a 2080 or 2080 Super. You could buy a home, start your own business, and what not. Even I can't count the list of possibilities well. But still, I am glad at least you understand.
@DeathStriker88 18 days ago
I had played all of these with an Intel HD 4000 with tons of settings set to lowest or disabled, and they never gave me any issues. It is hard to believe, but I have personally seen games like Trials Rising and Trials Fusion, which have good enough graphical fidelity, run really well on an iGPU alone. Now even the 1650, RX570, and RX580 feel like a waste of money to me.
@NathanaelHunter 18 days ago
The funny thing is that a lot of people try to poor-shame me for using a 2080 without realizing my Steam library costs about the same as a Maserati. I made my hardware choices very deliberately; it's not that I couldn't or can't buy a better GPU, it's that I don't need one, and I'm not easily pressured into buying one just because it's more powerful. I have a lifetime of games available, and not one requires a more powerful GPU than my 2080, and it's unlikely any game coming out anytime soon will either.
@NathanaelHunter 18 days ago
I played for a great many years in the early 2010s on integrated graphics. It's the people who have experienced the lack of a GPU in their system who, I've found, can fully appreciate and properly assess the value of a GPU. It's easy to say somebody should use whatever high-performance piece of hardware if you've never experienced anything else.
@UTFapollomarine7409 14 days ago
I don't know, dude, I'm trying to see those 1% lows and those highs, my guy. Let's see the video again with MSI Afterburner or an FPS counter on the overlay and truly see you do 4K 60fps ultra in Cyberpunk, dude! You may want to upgrade that 2080 Ti to at least an RX 6800 or a 3090.
@theonerm2 20 days ago
Why do a video trying to show performance without FPS numbers or the MSI Afterburner overlay? Either way, I know the RTX 2080 isn't so bad. I had an RTX 2080 Super, but I ended up selling it and getting an RTX 4070 Ti Super. I like having the latest generation of hardware.
@NathanaelHunter 19 days ago
The FPS counter from Steam is built in at the top left corner. It is small, but it's there. I will not be installing Afterburner; I don't need bloat on my computer. I'm not saying people shouldn't upgrade to newer hardware if that's what they want to do, just that the 2080 isn't the low-settings 1080p GPU that Matt Miller PCs was trying to pass it off as.
@ItsJord 19 days ago
@@NathanaelHunter doesn't even show on MW3 or CP 2077
@NathanaelHunter 19 days ago
Pause and look at the top left, it is visible, I assure you.
@theonerm2 19 days ago
@@NathanaelHunter Not in all of the games you showed. I only saw it shown for one game in the video. I might be blind if it's actually in the others. I am starting to see a weird flashing thing in my peripheral vision sometimes and I have no idea why. It kind of scares me a little.
@ItsJord 19 days ago
@@NathanaelHunter I think you need to pause and look at the top left. It ISN'T there
@PineyJustice 18 days ago
I don't get how someone who paid out for an upper-tier card 4-5 years ago still wants to cling to that card and run medium settings when it's handily outperformed by even the lowest-end current-gen cards. The only regret you should have is not picking up a 6950 XT for $500 last year, or a 6700 XT when they dropped well below $300. You can limp anything along and play games for quite a while, but you're giving things up to do so, like higher settings, better framerates, and efficiency.
@NathanaelHunter 18 days ago
The RTX 2080 outperforms the RTX 4060. I'm not even going to address the fact that I have a base-model 2080 and not one of the more powerful variants that were available. On top of that, when I bought it the 30 series cards were already about to release, so it wasn't even top of the line at that point. The audacity to call a 2080 limping along is disgusting; it's elitist fucking shitspewing. I have no respect and only my sincerest get-fucked to offer people who act like you do. I'm done trying to be even a little nice about it, fuck the hell off. You all make me sick.
@mixelplxsdad2705 18 days ago
Everybody gets bent up on needing that new GPU. I have a 4070 Super Aorus Master overclocked close to a 4070 Ti. I will not need ANYTHING for 3-6 years; there really is nothing I won't be able to play. I can play almost any game at 4K ultra 60+, and 1440p is child's play. You don't NEED the best to play games really well. The 4070 Super, I feel, is in a reallllly good spot.
@matthewmuller8117 18 days ago
bro talking like a 4070 super isn't a high end card marketed as being capable of all those feats
@Kudo716 18 days ago
Now do it with ray tracing enabled. Video games are MUCH more optimized than they were, and every card from the 20 series onward is capable of playing at higher resolutions if you tweak the settings. Which, I mean, yeah, has quite literally always been the case; I've been PC gaming since 2006, and it was a feature even back then. If you're looking to play modern games at 1440p 60, or 4K, sure, it's possible, but you're not gaining any of the benefits modern cards can offer, especially in terms of graphical fidelity and lighting. Yes, Cyberpunk looks great with or without ray tracing, but RT on is sooooooooo much nicer. If you don't care about graphics and only care about performance, that's fine. Also, just because the human eye can't see over 60 FPS does not mean there aren't any benefits to going over 60Hz. In fact, in a game like MW3 it's almost necessary just to keep up sometimes. Do any online refresh rate test and you'll quite literally SEE the difference. Do you HAVE to get a new card? I mean, no, not now, but I highly doubt a 2080 is going to be of much use in even a few years.
@NathanaelHunter 18 days ago
Alright, two things; I'm not addressing the middle parts, I'm hitting top and bottom, as those are the biggest issues. No, I'm not messing with ray tracing. There's no reason to; the performance hit for fancy high-tech lighting/reflection effects is pointless and doesn't offer any real benefit, though it's not like I couldn't just turn it on, make some minor tweaks, and get a stable 60FPS with DLSS and ray tracing. Modern titles have never been less optimized, and I do not know where you get the notion that they're in some massively great state. Almost every huge release in the last few years has had severe issues on PC. They're often a complete mess, and even now updates sometimes break these blockbuster titles horribly.

The human eye cannot see over 60Hz, and on top of that, very few people without extensive training or practice can take advantage of fraction-of-a-second input delays. There is simply no basis to claim that input delay that's often less than 1/5 of a second is actually making massive differences. On top of that, many titles are server-side anyway, so it's far more likely that you'll lose a fight to the response time of the server than to the delay from HID input.
@Kudo716 18 days ago
@@NathanaelHunter "Modern titles have never been less optimized, and I do not know where you get the notion that they're in some massively great state. Almost every huge release in the last few years has had severe issues on PC." Are we talking about server and connectivity issues or actual in-game performance? I thought it was the latter.

"The human eye cannot see over 60Hz, and on top of that, very few people without extensive training or practice can take advantage of fraction-of-a-second input delays. There is simply no basis to claim that input delay that's often less than 1/5 of a second is actually making massive differences." 120+Hz monitors have been the norm for years at this point. I cannot think of a single human being who plays games on their PC that doesn't have at LEAST a 70(something)Hz display. Before the Series X, going from playing Overwatch on a 120Hz display back to 60 on the XBone was extremely jarring. There's absolutely a lack of smoothness. The vast majority of competitive online games benefit from high refresh rates. You want to talk responsiveness? Your mouse's polling rate is at least 10 times higher than 60Hz, and that's if you have a shitty mouse. I have a Razer Orochi V2, a pretty middle-of-the-road gaming mouse, and my maximum polling rate was 1010Hz. Having a higher refresh rate quite literally increases the number of responses your in-game character has per second. Instead of only getting mouse responses on 60 of those 1010, you're getting several times more every second. That can easily be the difference between life and death in a game like Overwatch, with Widowmaker.
@NathanaelHunter 16 days ago
caseguard.com/articles/how-many-frames-per-second-can-the-human-eye-see/
www.sighthound.com/blog/human-eye-fps-vs-ai-why-ai-is-better#:~:text=But%20our%20eyes%20can%20only,and%2060%20frames%20per%20second.
www.healthline.com/health/human-eye-fps

I can keep linking these articles all day. No average person is going to take advantage of extra frames beyond the capacity of the human eye. There is more lag from the average connection to servers than the input delay created by 60FPS over 120FPS. The percentage of the population in positions that would train their eyes and bodies to respond to tenth-of-a-second stimuli is incredibly small, and there are a billion-to-one articles saying as much. There's no case to be made for going beyond 100FPS that holds serious water; it's a biological limitation of the human body. I cannot stress enough that even during testing by the US military, they weren't getting results from people who lived or died LITERALLY on those tenth-of-a-second times. I don't care if you THINK that people are winning engagements by those lag times; it's NOT that. Maybe at the highest possible levels of play, with people who train a dozen hours every day to be at the apex of their skills in whatever twitch-shooter title they play, but 99% of people are just not biologically capable of acting on the fraction-of-a-second response-time differences that higher frame rates create. It's an incredibly well-researched subject, particularly by the military. If there were any value in higher frame rates, the military would be the first to look for ways to extract value from it. F1 drivers win by these kinds of reaction times, and they don't have server lag to deal with, and they train constantly for that split second. I'm completely baffled by how this decades-old propaganda, born from the 30FPS era of gaming, is still being tossed around without the proper background information that disagrees with the premise.
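
The millisecond figures cited in these articles convert to frame rates directly; a quick worked check (the 13 ms value is taken from the quotes in this thread):

```python
# Converting a per-image processing time into an implied rate: rate = 1000 / ms.
for ms in (13.0, 16.7, 33.3):
    print(f"{ms:>5} ms per image -> ~{1000.0 / ms:.0f} images per second")

# 13 ms per image works out to ~77 images per second, which is roughly where
# the ~72 FPS figure quoted from the first article comes from.
```
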
@AGENTEN-ry6lr 19 days ago
57-60 fps in CoD? Man, stop it, that's fucking horrible. I had a 2080 Super over 2 years ago, and that thing could definitely not handle 1440p high settings in triple-A games. Sounds more like bro is struggling to cope with his 2080 being obsolete.
@NathanaelHunter 19 days ago
You exemplify the issues with hardware elitism: you think that because old hardware isn't as powerful as new hardware, it's unacceptable, but you are wrong. You don't seem to get the importance of having older hardware, or the value of that hardware, and it's something that people who only use and see computers for their power output seem to have a particularly bad issue with. There are many older titles locked off from some modern hardware simply due to when they were made and what was needed for them to run. For starters, I would find it incredibly unlikely that you have any older versions of Windows available to you without a downloaded copy, let alone a CD or floppy drive to install that OS from. I would doubt you have any games on physical media, and you would likely scoff at anyone who does. Thinking that modern hardware is simply the only option and anything else is obsolete is the worst possible sin somebody who claims to enjoy technology can commit, in my eyes. I've lived through multiple eras of technology and have a deep love and appreciation for each and every distinct era that's come to pass, so I'm not so easily convinced that the latest and greatest tech is necessary, let alone that the value-to-cost ratio is acceptable. People who think like you disgust me, and I have no issue saying that you and anyone who thinks like you are a blight on the future of technology.

My video shows that the 2080 is still capable of performing in line with modern standard expectations for games, and contrary to what you believe, 60FPS is still the standard, not whatever you seem to think it is. Game companies aim for 60FPS in their titles, and only the most high-end monitors have the capacity to even handle higher frame rates properly. On top of that, these standard expectations are in place because the vast majority of players aren't on the latest and greatest hardware, and they never will be. The Steam Hardware Survey consistently shows that the expected PC for a game to run on is fairly likely to be about on par with, if not slightly better than, modern consoles, not the leaps and bounds beyond that you seem to think a PC should be.

You are misguided, and in that you have become arrogant and incapable of understanding reality. You incorrectly think that the 2080 isn't powerful, and that because it isn't the best GPU on the market, my stating its capabilities to be within acceptable modern expectations for performance is 'coping'. You, however, are the one who is coping. You've been shown the reality that the RTX 2080 is more than capable of performing rather well in modern and demanding titles, and that defies your notion of what should be allowed. I would not be shocked if you think that games should only be made for the most powerful machines and that everyone who doesn't have them shouldn't be able to play. You are a disgrace and you disgust me.
@ohnobro3770 18 days ago
Lmao, you liked your own comment. Also, 60 fps has been the norm for decades, and it will continue to be if the next consoles need updates and whatnot to run at the advertised 120. If the 1080 Ti is still a capable card, the 2080 is damn capable.
@NathanaelHunter 18 days ago
In his defense, it's possible one of the other incredibly shriveled-brained mongs gave him a like. That said, so many people don't understand how the human body actually handles frame rates and reaction to stimuli; they just say that 60fps isn't good enough without any understanding of the variance in ability to react to 120fps information over 60fps, or of the share of the market that higher-framerate-capable monitors hold. If a card can run a game at a stable frame rate, that's entirely fine. Skill in a game is infinitely more important than the frame rate the game is running at. These people were likely saying the same thing back when games were running at 30fps, except back then the difference between 30 and 60 was actually noteworthy, while the leap from 60 to 120 has diminishing returns and a hardware adoption problem to overcome before it's the standard.
@AGENTEN-ry6lr 17 days ago
@@ohnobro3770 i like my own comment? no n*gga, now i did.
@NathanaelHunter 15 days ago
What are you, a fucking shit-licking child? Swear or don't, but you sure as shit will not act like a fucking pathetic schoolchild and censor yourself if you are going to fucking swear.
@matthewmuller8117 18 days ago
Bruh, I have never had a card that could handle a demanding title on anything above low settings at 60 fps. I'm currently rocking a 6500 XT (a terrible card) and have a history of using crap Nvidia cards like the GT 210. Scrolling down this comment section and seeing people debating whether cards with 8GB+ VRAM and 2500+ CUDA cores are "keeping up with new gen" or "falling off" is the most infuriating shit I've ever witnessed. Either get your 4090 so you can play Fap Queen at 4K 120fps, or manifest some brain cells and get a decent 1440p-for-cheap card like the 6700 XT, and stop complaining about useless features you don't even notice while playing the fucking game.
@NathanaelHunter 17 days ago
I feel so bad that you spent time reading the comments here. Interacting with the 'well aktually' crowd has rotted my brain.
@BlueLightning 18 days ago
Why would you WANT only 60 fps in a competitive shooter? I totally get it, the 2080 can definitely game at 1440p and 4K with the proper settings in a very optimised game like CoD, and even Cyberpunk, but trying to say that 60 fps in CoD is good is a joke. You want a minimum of 120 fps in any online shooter, and that isn't just elitism talking; I would never trade my 240Hz monitor for a 60Hz one, even if it were 8K resolution. And the fact is that you can get a 1440p 240Hz-capable monitor for the same price as a 4K 60Hz display, pair a 7800 XT or a 4070 Super with it, and get almost double the performance of a 2080. All you are sounding like is someone who has never experienced high-refresh-rate gaming before. Which is fine, but also realize 60Hz isn't the benchmark most gamers shoot for anymore, especially with so many cheap yet good options out there in terms of monitors. For casual, story-driven games? Sure, 60 fps is fine. CoD/CS2/Battlefield/Valorant? 120fps is the MINIMUM, and even that feels meh.
@NathanaelHunter 18 days ago
It's hard to explain how you're wrong here, but suffice it to say, time and time again many studies show that people just don't get the massive benefit in game performance at 120fps over 60fps that they did with the jump from 30fps to 60fps, and it has a lot to do biologically with how the human eye works. I don't have time to get into all of it, but the summation is that the leap from 30 to 60 is massive, while 60 to 120, although still twice as many frames per second, has benefits that diminish at a higher rate due to how the human body handles visual input. Some people do get more benefit from 120 over 60, but the vast majority of people see diminished returns in their ability to process 120 versus 60. What's more, it's not even easy for most people to distinguish 2K and 4K on a monitor; the number of pixels is just too high, and on a small screen or at a distance the added fidelity just doesn't make any noticeable change for people to get value from.

I know this is very dumbed down, but it's effectively a statement that people who claim 120 is the best way to play competitive shooters are saying it without comprehending the greater scope of information around it. They either don't know or don't care about diminishing returns, and either way they're both correct and wrong depending on their individual ability. That said, higher FPS does not inherently make somebody better in a shooter. An experienced player with solid fundamentals playing at 30fps is more than capable of outperforming a mediocre player who has the advantage of 120fps. Framerate cannot compensate for ability, and simply saying the advantage of higher FPS is outright better doesn't hold true either. Skill plays a massive factor, and individual capacity to process the higher frame rates and a developed reaction time do much of the work. Saying higher FPS is better is a broad generalization that simply isn't founded in objective reality. It's true, but only to a very limited extent and in a very specific way, so it's not as if 120fps being better than 60fps can be stated as an absolute the way you do.

In short, 60FPS is still the industry standard, and for a massive number of reasons, from biology to the availability of monitors and TVs that can make use of higher frame rates. The long and short of it is that no, 120 is not outright better, and 60FPS is perfectly fine for a competitive shooter, broadly speaking.
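
The diminishing-returns claim above has a simple arithmetic core: each doubling of frame rate halves the frame time, so the absolute gain shrinks with every step. A quick check:

```python
# Absolute frame-time gain from each doubling of the frame rate.
for low, high in ((30, 60), (60, 120), (120, 240)):
    gain_ms = 1000.0 / low - 1000.0 / high
    print(f"{low:>3} -> {high:<3} FPS: frame time shrinks by {gain_ms:5.2f} ms")

# 30 -> 60 saves 16.67 ms per frame, 60 -> 120 only 8.33 ms, and 120 -> 240
# just 4.17 ms: each doubling buys half the absolute improvement of the last.
```
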
@BlueLightning 18 days ago
@@NathanaelHunter I never said it makes you inherently better. I'm saying it just feels better. I play Dota 2 most of the time, and I have it maxed out at 240 fps, but if I cap it at 60 it literally feels unplayable. Hell, I can even perceive drops to the 100-120 fps mark. Like I said, if all you want is to hit that 60 fps mark, then fine, you do you, but don't come in here telling everyone else to limit themselves that way when there are so many better options out there. I will never recommend someone buy a used 2080 for 4K or even 1440p when they could get a 6700 XT or 6750 XT used for not much more and have 4 more GB of VRAM and roughly 20% more performance across the board. And if you want to argue that 8 GB is still fine (which for many games it still is), then the 3070 and 3070 Ti are both better options. Like, get real, nobody wants to game at 60Hz no matter the resolution. Why not go for 1080p 240-360Hz? It literally just feels infinitely better to play, and a 2080 can hit those numbers, or close to them, at 1080p. Now, if you want to bring DLSS into the equation for the higher resolutions, you can, but there are tradeoffs there as well. I'm not saying it can't work for you; I'm just saying it isn't optimal, and you making it sound like your way is THE WAY just rubbed me, and clearly several others who watched this, the wrong way.
@NathanaelHunter 18 days ago
"...trying to say that 60 fps in COD is good is a joke. you want a minimum of 120 fps in any online shooter..." I'm not playing this fucking game with people anymore; if people want to pull this nonsense, I'm going to shut it the fuck down. There are factors at play that people are either intentionally or unknowingly ignoring with frame rates and gaming. Higher FPS is not even SUBJECTIVELY better. There is no fucking data that certifiably backs up that claim; there's no conclusive evidence that it's true. There's not even a valid reason to say that higher FPS is worth going after for a competitive game because, as I fucking explained, it's hard enough to biologically handle 120FPS, let alone frame rates well beyond that.

This is not a conversation I'm having anymore. So many of you shit-licking mongs want to have this conversation, and it's always in bad faith. I'm sick of the elitism; it's why I called Matt Miller out and had to make this video to showcase the settings I was using to base my judgment on in the first place. The absolutely disgraceful FPS-whoring and hardware elitism is driving me up a godsdamned wall. The incessant, incorrect nitpicking over what 2K refers to is fucking retarded. The arrogance that some of the people here have put on display is beyond insane. So no, there's no justifying higher frame rates as competitively advantageous; it's literally not founded in reality. The small kernel of truth in it is outshined by all the caveats and niche situations that make the statement true at all. I'm not telling ANYONE to do anything, but all of you fucking disgusting elitist pricks keep coming in and repeating these decade-old fucking talking points without even the slightest hint that you understand where they come from. I feel like I'm talking to a bunch of fucking animals, because the absolute lack of understanding I'm seeing is just inhuman.

FOR THE FINAL FUCKING TIME: THE HUMAN EYE IS ONLY CAPABLE OF HANDLING 60 FRAMES PER FUCKING SECOND. YOU ARE NOT FUCKING SPECIAL. YOU ARE NOT EXPERIENCING SOME INSANE COMPETITIVE ADVANTAGE WITH THESE MASSIVE FRAME RATES. YOU ARE ALL FUCKING DUNCES AND SHOULD BE ASHAMED.
@BlueLightning
@BlueLightning 18 days ago
@@NathanaelHunter I mean, you are just wrong, but if you want to die on this hill for no reason, go ahead. The only one angry here is you. The only toxic one here is you. You made a simply incorrect statement, we called you out, and you dug your heels in and refused to listen. You don't need to SEE the frames to FEEL the difference; it affects input lag and overall responsiveness. I never once said I can see each individual frame; I said I can feel the difference between 240 and 120 and 60, and trust me, 240 is just plain amazing. I guarantee if I tried 360 or higher I would still feel a slight increase in overall smoothness. And yes, I will take that over a slightly crisper image when, let's be honest, 1440p is plenty crisp as is, especially on 27-inch gaming monitors.
@NathanaelHunter
@NathanaelHunter 16 days ago
For the last fucking time, you're not getting any value beyond that 60-90 FPS range, and here are a bunch of quotes and the articles they come from. "...at 13 milliseconds per image, it demonstrates that we can see and discern data from images at up to 72 frames per second." "Although experts find it difficult to agree on a precise number, the general consensus is that the human eye FPS for most individuals is between 30 and 60 frames per second." "Research suggests that the human brain can process visual stimuli in as little as 13 milliseconds. This is the time it takes for the brain to process basic visual information, such as detecting simple shapes or colors." "However, for more complex visual information, such as facial recognition, it takes the brain longer to process the information. Studies have shown that the brain can recognize a face in as little as 100 milliseconds, but it may take up to 170 milliseconds to fully process facial features and emotions." "That’s what the researchers in the 2014 study did to determine that the brain can process an image that your eye only saw for 13 milliseconds." Nobody has been able to cite anything so far that refutes this. The one person who tried did such a bad job claiming 500 Hz that I could literally recite the sentences just before and after, where the source outright said this was not possible in any real-world circumstances and was only possible under testing conditions. The only places that say ANYTHING about higher FPS being usable are forums and conversations. No study ever conducted has found evidence that the human eye can extract value beyond the 60-90 FPS range. You want to tell me I'm not listening and then cite numbers at me? Give me the articles and sources that back your claims. Here are mine, the top three on Google, and there are dozens more saying the same things. I'll even toss on the NASA study that says basically the same stuff, but for light. caseguard.com/articles/how-many-frames-per-second-can-the-human-eye-see/ www.sighthound.com/blog/human-eye-fps-vs-ai-why-ai-is-better#:~:text=But%20our%20eyes%20can%20only,and%2060%20frames%20per%20second. www.healthline.com/health/human-eye-fps ntrs.nasa.gov/api/citations/20140013442/downloads/20140013442.pdf So yes, I've dug in my heels against this nonsense asspull bullshit that you and a bunch of other elitist fucktards keep repeating. It's not even an argument that has any backing. Sure, you can LITERALLY see the higher frames, but the human eye cannot produce usable information from that stimuli. The context around the word 'see' in my comments is where the connotation of SEE, RECOGNIZE, AND PRODUCE A RESPONSE exists. Deciding to take the word 'see' literally is where this all gets hung up. I cannot even say something in a shorthand way because you fuckers decide to use the literal definitions of words like connotation and context are nonexistent. Seeing and comprehending are not the same. You can see without being able to do anything with what your eyes receive. Don't try and play this game with me. You're objectively wrong and you're never going to provide sources to back what you're claiming. Saying 'but I can see and feel the difference' is not a source.
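A quick sanity check on where figures like "72 frames per second" come from: they are just the reciprocal of the quoted per-image time. A minimal sketch of that conversion (Python; the caseguard article evidently rounds down from the ~77 FPS this yields):

    # Convert a per-image visual processing time into an equivalent frame rate.
    per_image_ms = 13.0  # figure from the 2014 MIT study quoted above
    equivalent_fps = 1000.0 / per_image_ms
    print(f"{per_image_ms:.0f} ms/image ~= {equivalent_fps:.1f} FPS")  # ~76.9 FPS
    # Either rounding lands the implied ceiling in the 60-90 FPS band under debate.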
@yelnatsu
@yelnatsu 18 days ago
I stopped caring about these debates when I said I'd buy a 20-series GPU last year (2060 or 2070, idk) & got shamed for it. Made me delete my comment 💀 I got recommended a 1080 Ti. The GOAT, but still...not what _I wanted_. Btw, I got a 2060 Super instead. I see it lasting me a good while.
@NathanaelHunter
@NathanaelHunter 18 days ago
I got my 2080 back in 2018 and I likely will not be buying a new card until about 2028. These RTX cards are all about as capable as the current generation of consoles, and if games are on both PC and console there's no reason to drastically overpay for better hardware when I don't get any actual benefit. Slightly better graphics, sure, but in a single-player game or a twitch shooter, decent enough graphics at a decent frame rate is good enough, and a lot of people can't understand that. Your 2060 Super should hold out a few years easily without any problems.
@ugurinanc5177
@ugurinanc5177 18 days ago
But 2K is 1080p 😢 1K is 540p and 4K is 2160p 👉👈
@NathanaelHunter
@NathanaelHunter 18 days ago
Incorrect, and I'm rather tired of people trying to have this discussion, because not only are they almost always wrong, it's always a moronic attempt to pull a gotcha like it's even a conversation at all. 2K and 1440p are not literally the same, but 2K and QHD are the same thing, and the industry does not care about the LITERAL definition of 2K; it's a generic label and it's not even a debatable topic. But if we want a real gotcha moment here, you're not even right in saying 2K is 1080p, because it's not 1080p either. 1K resolution is NOT 540p; it's literally 1024x768, while HD refers directly to 1280x720. 1280x720 is 720p and 1920x1080 is 1080p. 2048x1080 is TRUE 2K resolution: not 1080p, but what cinema projectors call 2K. 1440p is what the general display industry means when it says 2K. There are two forms of 4K, 4096x2160 and 3840x2160. Very few consumer TVs are actually 4K, because the 3840 version is the general display standard while the 4096 version is true 4K and is used in the cinema industry. So no, don't say 2K is 1080p, because 1. it isn't 1080p, and 2. in pretty much all marketing and consumer-facing regards, 2K is interchangeable with QHD or 1440p in the same way that 4096x2160 and 3840x2160 are both called 4K. You weren't even just wrong; you were so far from correct that you didn't even bother with the math to check what you were saying. You just assumed you could take 1080, divide by 2, and call that 1K resolution. If you're going to try and have this conversation, the least you can do is realize I said over and over DO NOT DO IT for a good reason; you didn't get it right, and thus far nobody else who tried this has gotten it right either.
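For reference, the pixel dimensions behind the labels being argued over can be laid out side by side; a minimal sketch (Python; the dimensions are the published standards, while the "consumer 2K" note reflects marketing usage rather than a formal spec):

    # Common display/cinema resolutions and the labels attached to them.
    resolutions = [
        ("720p / HD",        1280,  720),
        ("1080p / FHD",      1920, 1080),
        ("DCI 2K (cinema)",  2048, 1080),
        ("1440p / QHD",      2560, 1440),  # what consumer marketing calls "2K"
        ("UHD 4K (TV)",      3840, 2160),
        ("DCI 4K (cinema)",  4096, 2160),
    ]
    for label, w, h in resolutions:
        print(f"{label:<17} {w}x{h} ({w * h / 1e6:.2f} megapixels)")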
@i3l4ckskillzz79
@i3l4ckskillzz79 18 days ago
Don't say 2K, it's just wrong. It's also not right to say 4K, because true 4K is a standard only found in cinemas. Just say 1440p or UHD.
@NathanaelHunter
@NathanaelHunter 18 days ago
2K is QHD and 4K is UHD. I don't know where you got those wires crossed, but 2K and 1440p are the same thing: 2K = QHD and QHD = 1440p.
@i3l4ckskillzz79
@i3l4ckskillzz79 18 days ago
@@NathanaelHunter I know "2K" is 1440p, but it's still wrong. Get your shit together and inform yourself. To make things clear, 2K and 4K are numbers created by idiots, and if you want to say it correctly, it's 1440p or QHD, not 2K!
@NathanaelHunter
@NathanaelHunter 17 days ago
You literally said QHD and ignored the fact that QHD is synonymous with 2K, you fucking twat.
@femz4952
@femz4952 19 days ago
A few years late there, pal.
@NathanaelHunter
@NathanaelHunter 19 days ago
Late to what? Proving the RTX 2080 is still capable of handling modern titles? I would think that isn't exactly something you can ever be late in doing, but sure, whatever I suppose.
@femz4952
@femz4952 19 days ago
@@NathanaelHunter It could do 1440p just fine when it came out.
@NathanaelHunter
@NathanaelHunter 19 days ago
You realize that none of the games I showed existed in 2018, though, right? I'm not too late in saying it can run modern games, because those games weren't even out yet. Did you not realize this video is a response to something else, and that I was all but forced to prove the card can do what it's supposed to do? Yes, of course the RTX 2080 can handle 2K and 4K gaming; it's on par with the current generation of consoles. Unfortunately, there are a lot of people who do not understand what an RTX 2080 can handle, and thus this video needed to be made to show that reality.
@vasili9524
@vasili9524 18 days ago
Real "GaMEuRsHs" say 1440p instead of 2K
@NathanaelHunter
@NathanaelHunter 18 days ago
Not a single person says 1440p when saying 2K gets the information across faster.
@vasili9524
@vasili9524 18 days ago
@@NathanaelHunter I was joking
@NathanaelHunter
@NathanaelHunter 18 days ago
I couldn't tell, and with the sheer volume of people saying that unironically, it's safer to assume it's not meant as a joke.
@arnoldtheturtle
@arnoldtheturtle 20 days ago
Dam these liberals be hating u in da comments unc
@NathanaelHunter
@NathanaelHunter 20 days ago
They're all wrong, though on a very moronic technicality they're right. They lack comprehension of nuance, and that's why they are wrong here, not because of the actual reality.
@thedarkcr0w
@thedarkcr0w 17 days ago
Yes, the human eye can see more than 60 FPS.
@NathanaelHunter
@NathanaelHunter 17 days ago
Yes, it can, though the ability to see doesn't mean it can handle that information. This is why I get so heated when people use higher FPS as an explicit reason to buy a better GPU. The average person cannot produce value from a tenth of a second. You're not going to win a fight in a shooter because of a tenth of a second better response time, and you're likely not capable of even reacting that quickly anyway. F1 drivers are trained to respond to stimuli like that, military personnel are trained to react to split-second stimuli, and professional athletes are trained to act in a fraction of a second. The average person is not an esports sweat with the capability to act in the literal fraction of a second that higher FPS grants for input delay, or to react better to 120 FPS than to 60-90 FPS. There has been extensive testing, and while humans can see higher FPS, they can do nothing with it. It's effectively useless beyond the 60-90 range. If you think you're special and you can, then live that lie if you want to.
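The "tenth of a second" framing can be sanity-checked against the latency a higher cap actually saves. A minimal sketch (Python, using the common rule of thumb that a frame cap adds about half a frame time of display latency on average, and a ~250 ms median human reaction time; both are assumed round figures, not measurements from this video):

    # Average extra display latency contributed by a frame cap (~half a frame time).
    def avg_added_latency_ms(fps):
        return 0.5 * 1000.0 / fps

    saved_ms = avg_added_latency_ms(60) - avg_added_latency_ms(240)
    reaction_ms = 250.0  # assumed median human reaction time
    print(f"60 -> 240 FPS saves ~{saved_ms:.1f} ms "
          f"({saved_ms / reaction_ms:.1%} of a typical reaction)")
    # Prints roughly 6.3 ms, i.e. about 2.5% of a reaction: real, but far
    # short of the "tenth of a second" both sides keep invoking.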
@thedarkcr0w
@thedarkcr0w 16 days ago
@@NathanaelHunter dont think im special, you can defo see over 60 fps so can everyone else. Im ok with you being happy with 60, or 90 but dont make yourself a fool saying all this sht in comments xd
@NathanaelHunter
@NathanaelHunter 16 days ago
You are literally a brainlet. Definitionally moronic. Every single article and research study says you are wrong, and you cannot handle that. I don't know how you can't, but you can't. It's not even like there is a single study in disagreement; it's literally the end result of every study on the subject. This ain't a 9-out-of-10-doctors kind of thing; this is literally 10 out of 10 studies showing you're a fucking F-minus dunce who isn't listening. caseguard.com/articles/how-many-frames-per-second-can-the-human-eye-see/ www.sighthound.com/blog/human-eye-fps-vs-ai-why-ai-is-better#:~:text=But%20our%20eyes%20can%20only,and%2060%20frames%20per%20second. www.healthline.com/health/human-eye-fps These are the first three articles on Google, and there is page after page stating this information. If you're just gonna say I'm making a fool of myself and then say shit like higher FPS is better, then eat my fucking white-as-cream-cheese hairy ass.
@thedarkcr0w
@thedarkcr0w 16 days ago
@@NathanaelHunter How can you tell me that when I didn't insult you at all and all you can use to defend yourself is insults 🤣 There isn't a single article that says the human eye is limited and can only see 60 FPS. You even edited your original comment on your video to include the word "functionally" because you saw that you were wrong when you did 10 minutes of research. Like I said before, I don't need to insult you to say this. I'm glad you are happy with the FPS you are getting on your PC, but please don't make yourself a fool like this in the comments; you'll think about this and regret it a few years from now. Thank god delete and edit buttons exist, right? 😂
@NathanaelHunter
@NathanaelHunter 16 days ago
Damn fucking right I've decided insults are easier, and yes, I added the word 'functionally' so people will stop pulling this technicality bullshit: being able to see does not mean that it's discernible. On top of that, don't you dare claim you 'didn't insult me at all' when your comment outright says "dont make yourself a fool saying all this sht in comments". That's an insult, made worse by that pansy-ass choice to censor what you said. If you're going to swear, you'd damn well better do it in full. Nothing gets me heated like candy-ass little shits wanting to sound like big people and then pulling out letters so it's this PG playground fuckery. Cut the shit and say it with your chest. Did you read the articles? They may not explicitly state 60, but they DO note the 60-90 range, and while everyone admits that, yes, you can SEE higher FPS, you cannot actually produce value from it. Here are the literal quotes where this is stated. "...at 13 milliseconds per image, it demonstrates that we can see and discern data from images at up to 72 frames per second." "Although experts find it difficult to agree on a precise number, the general consensus is that the human eye FPS for most individuals is between 30 and 60 frames per second." "Research suggests that the human brain can process visual stimuli in as little as 13 milliseconds. This is the time it takes for the brain to process basic visual information, such as detecting simple shapes or colors." "However, for more complex visual information, such as facial recognition, it takes the brain longer to process the information. Studies have shown that the brain can recognize a face in as little as 100 milliseconds, but it may take up to 170 milliseconds to fully process facial features and emotions." "That’s what the researchers in the 2014 study did to determine that the brain can process an image that your eye only saw for 13 milliseconds." If you want to read the particular study that found that 13-millisecond number, it is here: mollylab-1.mit.edu/sites/default/files/documents/FastDetect2014withFigures.pdf NASA did testing in a similar field; that can be found here: ntrs.nasa.gov/api/citations/20140013442/downloads/20140013442.pdf Every single article, EVERY SINGLE ARTICLE, A L L O F T H E M, states this rather clearly. Even when an article doesn't want to give an exact number, it cites the sources that ARE giving the exact range. 60-90 is the range; it's not even up for debate. I just picked the first three on Google, and the only places that say ANYTHING to the contrary are forums about the subject. It's almost like the people who did the testing are telling all of you, over and over, that you're objectively wrong about this. Note that you're NEVER going to find anything on the subject stating that FPS or Hz numbers over 100 are capable of general application by people. It's a biological limitation. Your brain can SEE the information but cannot make usable reactions based on it. Seeing literally and seeing functionally are not the same, and I cannot believe I needed to make this distinction. You may see it, but you will not be able to do anything with it, nor retain useful information beyond basic shapes and colors, let alone react in a useful way to that information.
So again: I did in fact change my comment to add the word 'functionally' because the literal meaning of 'see' has been taken over its connotation of the capability to recognize. It was easier to say 'functionally' than the word salad that is 'cannot produce functional, applicable value from higher than 60 frames per second'. Clearly this distinction was needed, because people like you have taken the word 'see' in its literal meaning rather than what it means in the context of the conversation, so yeah, I decided insults were an easier way to express my disgust at the dishonesty behind that intentional choice to not have reading comprehension. I know the next argument you're going to pull is that input delay goes down, and while, yes, it does get smaller, the change is in the tenth-of-a-second-or-less range that the human eye still cannot make value from, so those microscopic amounts of lag are functionally unusable for gaining the edge in a gunfight. You and 99% of people are not training to work in those kinds of split-second moments. You just aren't, and there's nothing wrong with that. It is what it is. So don't keep pressing me on this like you're magically going to be right just because you tell me no articles say this and that I only did ten minutes of research. You did zero minutes of research and clearly didn't bother reading the articles I sent, nor checking whether I was even wrong. You just decided to talk like the articles either didn't exist or didn't say what I claimed. They DO exist and they DO say what I'm telling you, as the quotes I posted show.
@allenrachal
@allenrachal 18 days ago
Copium
@NathanaelHunter
@NathanaelHunter 18 days ago
It's not.
@allenrachal
@allenrachal 18 days ago
@@NathanaelHunter In all seriousness, it's a great card if you don't expect too much out of it. But I'm a guy who likes to crank graphics up to ultra at 3440x1440 and still have high framerates (my monitor is rated at 165 Hz). My wife has the 2080 and a 2K 90 Hz monitor, and it does well, reaching 90 FPS in most games with little tweaking.
@NathanaelHunter
@NathanaelHunter 18 days ago
That's entirely fair. I think many people buy these higher-end cards and then don't realize they aren't getting performance benefits because they have a monitor incapable of actually displaying the framerates they're after. If people bought hardware for what they're actually doing rather than just whatever is newest, and made sure everything matched properly, they would generally have a much better experience overall. If people want higher FPS, they need a display capable of handling it, and it's concerning how often I see people missing that when talking about hardware and frame rates. Even though my monitor can handle higher frame rates, I make sure to lock games at 60 FPS because I've had such bad experiences with frame rates and displays. There are a lot of older titles I play that break horribly at high FPS, and locking games to 60 has become so ingrained in my default settings that I don't really consider anything higher worth it; 60 is so consistently achievable and smooth that I just don't need more.
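Locking a game to 60 FPS as described here is, at bottom, a frame limiter. A minimal sketch of the underlying loop (Python, purely illustrative; real games do this in-engine or through the GPU driver, and render_frame is a hypothetical placeholder):

    import time

    TARGET_FPS = 60
    FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame

    def run_capped(render_frame, frames=300):
        # Render a frame, then sleep off whatever remains of its time budget.
        for _ in range(frames):
            start = time.perf_counter()
            render_frame()
            leftover = FRAME_BUDGET - (time.perf_counter() - start)
            if leftover > 0:
                time.sleep(leftover)

    run_capped(lambda: None)  # stand-in for a real renderer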
@JessicaFEREM
@JessicaFEREM 20 days ago
2K is 1080p, not 1440p.
@NathanaelHunter
@NathanaelHunter 20 days ago
For starters, as I said over and over, on a literal level 2K is not 1440p. It technically isn't 1080p either, if we're really going to keep having this conversation; it's literally 2048x1080, and that's not the 1920x1080 that 1080p refers to. Yes, calling 1440p 2K is technically loose; however, the common understanding, and what the average consumer knows as 2K, refers directly to 1440p. 2K is treated as a general term for 1440p. It's not literally 1440p, but it is treated as though it is in conversation. I said this every single time I noted the resolution and said I wasn't having this conversation, but again, in general, 2K all but means 1440p, in the same way that saying 'Google it' means 'search for it on the internet'.