
Matt Miller PCs is (Potentially) Ruining PC Gaming for Newcomers 

The Long Long Mann
Subscribe 351
Views: 2.2K

• Is the RTX 2080 Still ...
I'm not sorry. This guy is a blight on PC gaming. His benchmarks, if they're all like this RTX 2080 one, are potentially causing issues for new PC buyers. If they believe his videos actually show LOW SETTINGS and that an i5 is a decent powerhouse CPU in 2024, they may buy things they otherwise wouldn't or shouldn't. Bad information like what Matt Miller PCs puts out in this video is without a doubt harmful. This video only got a response from me because I wanted to find a video talking about 2080s in 2024 and it was top of the YouTube search. It's nonstop lies and underhanded choices for no good reason that I can see. Do not listen to Matt Miller PCs. He is SHOWING himself lying in his own video here and is potentially causing real damage to the PC gaming space for newcomers looking for good advice on what to buy.
The RTX 2080 is more than capable of 1440p, 1080p, and 4K gaming at 60 FPS. You do not need 240 FPS. You should not target 240 FPS without very good reason. The RTX 2080 can do medium settings with raytracing. This is a very capable GPU and it's priced rather decently on the second-hand market.

Published: 15 Oct 2024

Comments: 196
@NathanaelHunter
@NathanaelHunter 3 months ago
ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-RukOeP3Mkt4.html This video shows that the RTX 2080 is no slouch and is more than capable of doing everything I'm claiming it can do. Because a solid number of comments keep saying 'the RTX 2080 cannot handle 4K 60FPS', I feel the need to list what's in the system that's able to do this. I don't think it's anything crazy; I think it's pretty standard stuff for a desktop that's about 4-6 years old depending on the part. CPU - Ryzen 7 5800 (chosen because it was better than the i7 I had before; it does what it needs to do and has been decent enough). GPU - RTX 2080 (8GB of VRAM is more than enough if you know how to work with the settings in games). RAM - 32GB DDR4 (at 3200MHz, because I don't need higher-speed RAM, I need a lot of RAM in total to handle the dozens of Chrome tabs I leave open). I still run Windows 10 and don't plan to use 11 at any point. I have something like 80TB of storage; some is HDD, some SSD, and a few external HDDs. It's not the highest-spec hardware on the market, and it wasn't even the highest-spec back in 2022 when I built this desktop. It gets the job done and handles modern titles at very respectable FPS and graphics settings.
@TriPBOOMER
@TriPBOOMER 3 months ago
@NathanaelHunter So the 12600k basically performs identically to your 5800, so I'm guessing your CPU is too weak to run that 2080!? And bottlenecks it a lot, since you think the 12600k is too weak to push the 2080 properly. The 12600k is faster than the 11900k!!
@archoridlolarchorid4477
@archoridlolarchorid4477 3 months ago
Bro, the i5 is a decent processor, and he paired it with decent DDR5 RAM (and you say DDR3, wtf). There are still i3s out there, and they're from the new 14000 line. His setup will push the 2080 to the max. The guy pushed Warzone and other multiplayer games on the lowest settings because that's how any competitive player plays them. The 3060 Ti beats the 2080 (by not much), and the plain 3060 is beaten by the 2080 by 20% (just noted). He said multiple times the card performed awesome. Texture filtering barely costs any performance. He wasn't fair with Starfield; this card can run that game at medium settings at a decent 2K, and no other card except the 4090 can really do better in that game without DLSS. So he made the card look bad in single-player games?! He drew bad conclusions, because we can say the RTX 2080 is still a very decent card if you can buy it cheaply on the used market, but it's not future-proof, because today's games use way more VRAM than they should.
@TheIvercon
@TheIvercon 3 months ago
What's ruining PC gaming is the hardware hype cycle brainrot. Eight years of not caring to keep up with this stuff and I come back to crazy town. People act like their CPU and GPU choices are life and death lmao
@NathanaelHunter
@NathanaelHunter 3 months ago
They have always been around and they're just as obnoxious now as they ever were. The mongo PC people are always incapable of hearing that anything older than the latest tech could possibly be decent. It's insane how fast they go to tossing specs around rather than actually caring about real-world application. I work in the IT sector (networking mainly, but I do hardware repair too) and you almost never run into these kinds of people in the wild. It's an internet thing. They feel safe behind a screen, because if anyone started acting like that in real life they'd be laughed at and mocked for it.
@nusession
@nusession 3 months ago
This is the first time that I heard the first 5 seconds of a video and knew I needed to jump to the comment section. It seems like a lot of people in the comment section are giving out logical answers, and all I am seeing is the host of the video telling someone that their reply is "retarded". Not a good look to gain traction on here (I know your response will be "I don't care about what people say or comment, I don't care about the traffic or numbers"). I'd say just get an up-to-date rig, and maybe do a build video, or a review of it after the first 30 days? That would be interesting.
@NathanaelHunter
@NathanaelHunter 3 months ago
I gained my initial traction on social media through being aggressively unflinching in completely honest speech. I don't hold back in what I say or how I say it, and I expect the same of anyone near me. If people feel a certain way, I don't just ask it of them, I demand it of them, that they hold nothing back and say what they feel without hesitation. It's how I have led my teams at work and it's how I go through life. I would never place that standard on somebody else's comment section or in their video creations, but I have that standard here. I know it's not seen by many as respectful and that it comes off as confrontational, and I say damn right. Will I grow faster falling in line? Probably. But I don't do what I do online for anyone but myself and my own enjoyment, and because I can. That means people who disagree with me will come and lock horns with me over it, and that's just the way I like things. I've lived with a very literal fight-for-your-beliefs mindset, and many of the people who I've literally traded punches with eventually became really good friends. It's been my experience and it's held well enough in my life. I don't expect everyone to understand, but I won't tolerate people thinking they need to be weak-willed in my presence. Say what you feel and say it with your chest. Say what you mean and mean what you say and stand behind it. Around here, this is a do-unto-me-as-I-do-unto-you kind of place. I'm gonna go at it full-on with no hesitation in calling something retarded, and I don't want anyone to feel that they need to police what they say either.
@kush2023
@kush2023 3 months ago
The 2080 cannot run 4K 60fps on any recent title. Maybe in low-graphics comp games like Rainbow Six Siege and stuff, but you are not running a single AAA, or probably even AA, game at 4K 60fps with a 2080.
@peik_haikyuu2265
@peik_haikyuu2265 3 months ago
I have a 6650 XT that I got for $270 brand new, and it either outperforms the 2080 or is relatively even with it, and even at 1080p some AAA games are below 60fps for me. And I have a 12600KF, so I'm definitely not CPU bottlenecked.
@iitzfizz
@iitzfizz 3 months ago
@peik_haikyuu2265 I have the 6750 XT, going for $300 brand new now, and it outperforms the 2080 by a decent margin. The only game where I had to turn a couple of things down was Alan Wake 2 (at 1440p), but that's known for being hard to run, and I can still do it on a mix of med-high. Just flick on FSR, which ain't too bad at 1440p and 4K.
@NathanaelHunter
@NathanaelHunter 3 months ago
As I said, the footage on my channel is filmed on a base model 2080. Everything from 2022 on is in 4K, and the FPS counter built into Steam in the top left shows 60. That's 4K 60 on Modern Warfare 3. You either don't know what you're saying or you're a liar.
@peik_haikyuu2265
@peik_haikyuu2265 3 months ago
@NathanaelHunter you are using the worst possible game to prove that your GPU can run 4K lol. With DLSS enabled at 4K you can easily get 60fps; I get 120 at 1440p with a 6650 XT. Try running Cyberpunk at 4K, no DLSS, high settings, RT on, and then try telling me you have a 4K GPU🤣
@itsNightShine
@itsNightShine 3 months ago
@NathanaelHunter you could easily show your footage during the video; why do we have to go look at the footage on your other videos? Seems pretty low-effort work to me.
@itsNightShine
@itsNightShine 3 months ago
The higher the resolution you go, the less CPU-demanding the game becomes (usually). The only exceptions might be competitive shooters, where you are often CPU-bound even with high-end processors like the 7800X3D or a 7950X3D with 8 cores disabled. I'm sorry, but you don't even know what's in your rig. How can someone believe you when you don't know your own system? It seems like you have good intentions, but you don't understand basic concepts. You mentioned the RAM speed, saying it could be DDR3, but the dude clearly stated that XMP was enabled and it was running at 5600MHz, which means DDR5. In his Warzone benchmark he is GPU-bound, and his CPU usage is barely hitting 60%. Yes, there might be an additional 1-2% performance gain with the best CPUs today, but he's GPU-bound in almost every game he showed. What are you arguing? No, the 2080 cannot handle Cyberpunk at native 4K on ultra settings. I had a 3070, and even it couldn't do 4K. A 3070 is on par with a 2080 Super, which you correctly noted. Anisotropic filtering at 16x does not significantly affect performance, perhaps only by 1-2%. I think Digital Foundry touched on this, but don't quote me on that. You clearly don't know what that setting does. Apex Legends is a game where I cannot get a stable 300 FPS regardless of the resolution; it's just how the game engine is. Textures also do not affect performance if you have enough VRAM. I stopped watching at 32:40 when you told him to get off the internet. You should be the one to get off the internet, because you have no clue what you're talking about. Clearly, you either are not interested in benchmarking and have no will to learn, or you are just starting to learn and don't realize you are wrong. I hope it's the latter rather than the former.
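The resolution point above comes down to a simple bottleneck model: per-frame CPU work is roughly resolution-independent, while GPU work grows with pixel count, so raising the resolution shifts the limit toward the GPU. A minimal sketch in Python, with made-up per-frame costs (CPU_MS and GPU_MS_PER_MPIX are illustrative assumptions, not measurements of any system in this thread):

```python
# Toy frame-time model: a frame takes as long as the slower of the CPU's
# simulation work (roughly resolution-independent) and the GPU's rendering
# work (scales with pixel count). All costs are invented for illustration.
CPU_MS = 5.0           # hypothetical per-frame CPU cost, any resolution
GPU_MS_PER_MPIX = 2.0  # hypothetical GPU cost per megapixel rendered

RESOLUTIONS = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}

for name, pixels in RESOLUTIONS.items():
    gpu_ms = GPU_MS_PER_MPIX * pixels / 1e6
    frame_ms = max(CPU_MS, gpu_ms)  # the bottleneck sets the frame time
    bound = "CPU" if CPU_MS >= gpu_ms else "GPU"
    print(f"{name}: ~{1000 / frame_ms:.0f} fps ({bound}-bound)")
```

With these toy numbers the rig is CPU-bound at 1080p (~200 fps) but GPU-bound at 1440p and 4K, which is why a mid-range CPU can still be "enough" for a GPU test run at higher resolutions.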
@NathanaelHunter
@NathanaelHunter 3 months ago
Alright, I guess I have to go through your shit and point out everything you said that was retarded. I knew it was an AMD something or other. The point was that I have a decent enough CPU and RAM that I can game at 4K 60. My statement about him only saying the speed was that he made a point of explicitly naming his CPU, GPU, and motherboard but said nothing but the speed for his RAM. The implication was that it could be ANYTHING. Yes, only DDR5 can handle what he said it was set to, but that also doesn't say HOW MUCH he had in there. It's not normal to state all the hardware by name and then leave one part out. I was highlighting that it's very odd behaviour to leave out what and how much RAM he had in there.
What he is doing cannot be classified as a benchmark, so please don't treat it like one. Benchmarking aims to measure and assess the maximum performance possible at a given task or function. He put the settings at low. This is fundamentally a showcase, not a benchmark. He was targeting very high FPS. That's no longer a benchmark; that's a showcase. There is a very important difference and it's critical to make the distinction. This is why I have such a problem with his video. Correct, the 2080 cannot handle Cyberpunk at ultra settings 4K with raytracing. By all means, timestamp the exact moment where I said it could. You can't, because I never said that. I said it can play Cyberpunk at 4K with raytracing at 60fps. I pretty clearly stated that my settings of choice are in the mid/high range and that I usually turn raytracing off and raise the standard graphics settings, because I don't particularly care for raytracing over higher fidelity.
As for your claim that anisotropic filtering at 16x doesn't impact performance, I have to ask: what are you fucking talking about? If the graphics settings are set to anything but potato low, 16X absolutely will drastically impact performance, more and more heavily the higher you go with the anisotropic filtering. I don't know why you think this isn't the case, but it unquestionably is so. I have no idea why you are saying this about Apex; it isn't even important. As I said, shooting for high-as-shit FPS isn't rational and has no genuine benefit. If somebody is just OK at a twitch shooter, a higher FPS isn't going to make them any better. The general term is 'diminishing returns', and going beyond what your monitor can handle provides no benefit to visuals. Saying 'textures don't impact performance' and then putting that IF in there means you understand why I said what I said about anisotropic filtering being at 16X with the graphics set to low being a very deceptive move. The 2080 can handle higher graphics settings and maintain a solid FPS. It will struggle if the anisotropic filtering is set that high, because that directly impacts performance. You clearly just believe I don't have any experience and that I'm flat-out wrong. The reality is that this video was filmed at 1am with zero effort in one take. I work in IT and have for nearly a decade. I fully understand what is considered standard practice for benchmarking and the limitations of hardware like the 2080. I phrase things the way I do for the everyman. That's why I say I play at 2K when I'm referring to 1440p. I'm not taking the time to script shit and I never will, because the general concepts of what I say are almost always correct.
You are claiming that I'm wrong and that I don't know what I'm saying, while you should be more than aware that I outright said I use an RTX 2080 and can state first-hand that the card can handle 4K gaming. If I'm as wrong as you seem to think, it's in the very nitpicky aspects of how I phrased things and the tiny details like word choice, not the overarching statements themselves. I like to use the term 'nuance is dead' for things like this, because while everything I said is very much true, people seeking to make me look like a liar will use the fine details that I was even slightly wrong or misleading about as proof that everything I said was wrong, regardless of its validity.
@WD_Unieles
@WD_Unieles 3 months ago
@NathanaelHunter Bro "name-drops" that he works in IT and thinks that argument is gonna convince everyone after saying some dumb shit like 16X anisotropic filtering having a crazy performance impact.
@NathanaelHunter
@NathanaelHunter 3 months ago
FOR THE LAST TIME, GUYS: IF YOU SET YOUR GRAPHICS TO LOW, THE FILTERING WILL NOT HAVE A LARGE IMPACT. IF YOU HAVE THE GRAPHICS SETTINGS AT HIGHER LEVELS, 16X ANISOTROPIC FILTERING WILL HAVE A DRASTIC IMPACT ON PERFORMANCE. What anisotropic filtering does is take the textures on objects and warp them to pretty them up at viewing angles other than head-on. I would think that the concept of 16X being more impactful to performance on higher graphics settings than on low would be obvious, but apparently we're still at the point where this isn't well known. If people think that a 5-10 frame performance difference with higher AF levels on higher graphics settings is a small impact, that's fine, but it's definitely not a completely unnoticeable performance impact. This is something literally anyone can test for themselves, and they will get a consistent result of an FPS drop with every step up in AF they go.
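For what it's worth, the cost of anisotropic filtering is angle-dependent by design: the sampler only takes many taps where a surface's texture footprint is stretched by an oblique viewing angle, clamped at the selected AF level. A toy sketch of that relationship (a simplified model of the general technique, not how any particular GPU actually schedules its texture taps):

```python
import math

def af_taps(view_angle_deg, max_af=16):
    """Rough model: a screen pixel's texture-space footprint stretches by
    about 1/cos(angle) at oblique angles, and anisotropic filtering takes
    proportionally more texture samples, clamped at the chosen AF cap."""
    angle = math.radians(view_angle_deg)
    anisotropy = 1.0 / max(math.cos(angle), 1e-6)  # footprint stretch factor
    return min(max(round(anisotropy), 1), max_af)

for angle in (0, 45, 75, 85, 89):
    print(f"{angle:2d} deg -> ~{af_taps(angle):2d} taps (cap 16)")
```

Head-on surfaces stay at one tap even with 16x enabled; only steeply angled surfaces (floors stretching into the distance, walls seen edge-on) approach the cap, which is part of why measured costs vary so much from scene to scene.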
@jacobfritkin4121
@jacobfritkin4121 3 months ago
I haven't watched the video. I just know that when someone on the internet has an opinion, it always goes bad. I am here for the comment section...
@leechair6256
@leechair6256 3 months ago
If the garbo 4060 Ti can do 4K, then a 2080 Ti can do 4K at higher performance. That low bus bandwidth really kills the low-end Nvidia GPUs. Nvidia is so tone-deaf, ngl. I can't believe they dropped the same 4060 Ti with 16GB of VRAM instead. The thing is a waste; they should have made it 12GB or 10GB of VRAM with a beefier bus width like the 192- or 256-bit the RTX 3060 Ti had. Honestly that would have been better for the GPU, but they will never learn.
@NathanaelHunter
@NathanaelHunter 3 months ago
A lot of people in this comment section would probably call you a liar and discredit you entirely, but you're 100% correct. I don't know half the time if it's NVIDIA bootlickers or just hardware elitists, but they're all incredibly persistent and annoying about what the 2080 can and can't do, all while being wrong.
@auggieaxiom5726
@auggieaxiom5726 3 months ago
this video is stuck at 0 fps
@NathanaelHunter
@NathanaelHunter 3 months ago
Click the play button.
@davidbae5615
@davidbae5615 3 months ago
OK, show me your i7 with a 2080 running Cyberpunk at 4K with raytracing, and show the 60fps that you claim it can do, please... I will wait.
@itsNightShine
@itsNightShine 3 months ago
I have a 4090 and I get 60-80fps with raytracing at 4K ultra native; no way the good old 2080 is doing that. (Just tested in Dogtown btw.)
@NathanaelHunter
@NathanaelHunter 3 months ago
I used to have an i7. I had to check, but I'm currently using a Ryzen 7 5800X. So while I cannot show that, I am more than happy to prove the 2080 can handle this. Mind you, I also said repeatedly that I don't run Cyberpunk or anything at 4K anymore; I play at 2K on a 4K screen. I also outright said that while I play with the graphics turned up, I don't turn raytracing on, because I prefer the higher-resolution textures and don't think raytracing is worth it over better overall settings. Since it's become such a common trend for everyone in the comments section to say 'it's not possible', I guess I need to get the footage and prove that it's 110% not just possible but what I actually do. Not that I didn't already say to just go look at the footage on my channel; it's all 2K 60FPS mid/high settings.
@peik_haikyuu2265
@peik_haikyuu2265 3 months ago
@NathanaelHunter 🤣 you are on here trying to flex that you have a 2080 that can run 1440p medium settings and get 60fps with RT off🤣 a GTX 1070 can do that no problem😂
@NathanaelHunter
@NathanaelHunter 3 months ago
Right. I must be a troll. Not you, saying that a GTX 1070 can do everything I'm making an RTX 2080 do. It literally cannot. It's not even capable of raytracing. Fucking graphics whores, I swear you're all a plague on the population of people who want to have rational conversations in tech. Even funnier that you really don't see fucking mongs like this in the IT space; it's only ever online. Funny, that. I like to believe it's that you jackwads never go outside, and frankly it definitely seems to be the case.
@TriPBOOMER
@TriPBOOMER 3 months ago
@peik_haikyuu2265 the 2080 is 'slightly' faster than the 3060 12GB and slightly slower than the 3060 Ti; all 3 cards are within 10-15fps of each other.
@iitzfizz
@iitzfizz 3 months ago
gotta be trolling
@thomasburchsted3287
@thomasburchsted3287 3 months ago
100%
@NathanaelHunter
@NathanaelHunter 3 months ago
Me? No. Matt Miller? Also probably not. All the footage on my channel is games running on a 2080 at 4K or 2K. There is no trolling here, just reality.
@myamaha62
@myamaha62 3 months ago
Play Alan Wake II with a 2080 and see what happens. It won't be pretty. The 12600K is faster than your 5800X in gaming. If anything is holding back the 2080, it's your system.
@lordstorm8555
@lordstorm8555 3 months ago
8GB is limiting; my 3070 Ti hits the max or near it almost all the time. If my 3070 Ti had at least 12GB it would be OK for like 1-2 years, but 16GB would be where it needs to be. When it maxes out, FPS tanks. If you're thinking about BF2042, you can't use that for GPU comparison...
@NathanaelHunter
@NathanaelHunter 3 months ago
I play 2042 somewhat often on my RTX 2080 and I have no issues getting a stable 60FPS at 2K medium/high settings. I think you have a bottleneck somewhere beyond the GPU if you're having issues with a 3070 Ti; I would consider seeing what settings you have going on and try making some changes.
@waynetuttle6872
@waynetuttle6872 3 months ago
Hate to tell a LOT of people out there, but pretty much every reviewer skews results and, for the most part, they're BS. I can watch the 3060 Ti under one major reviewer go from a decent 60fps 1440p gamer to barely holding onto 1080p with DLSS upscaling in performance mode in the EXACT same games. I ran the exact same benchmarks with as identical a system as this Greatly Named computer reviewer, and I, on average, got a 15% higher frame rate with the 3070 Ti. Don't even get me started on the major payouts that must be coming from big water cooling, cuz how else do you explain the Noctua NH-D15 getting 70C with a 5900X in 2021 but barely being able to keep the exact same CPU under 90C in 2023.
@NathanaelHunter
@NathanaelHunter 3 months ago
There's skewing results, and then there's what Matt Miller is doing here. Yes, hardware can have some variation in the production line that can alter performance, and yes, a different CPU and RAM choice can make some games run better or worse with any GPU. Matt Miller is, for whatever reason, intentionally putting a very low-end CPU into this rig. That's going to hard-limit the performance of any game he plays right off the bat. Then he doesn't bother saying how much or what RAM he's using, just that it's got XMP on and the clock speed is high. It could be garbage RAM, making the results even worse. On top of all of that, he literally says he's setting the games to low and has them set to 16X filtering. It's one thing to have a paid deal to make something look better or worse. It's another to do a bad job because you have a bias. It's entirely unacceptable to be as deceptive and as bad at your entire stated objective as Matt Miller is. This is, for reasons I cannot even begin to understand, lying. There are zero acceptable excuses for intentionally lying to your viewers. There are zero reasons to put out a video and say with such confidence that X graphics card is only capable of Y output when it's blatantly false. It's not something I can just chalk up to being 'skewed'. Matt Miller is, as I said, outright lying and deliberately misleading potential consumers with this video. I can only assume he has done this with ALL of his content. He's an FPS elitist who's also completely moronic and incompetent. He said, with what he believes to be valid authority, that 240 FPS is the golden sweet spot for gaming, as if that's based in any possible reality. Yeah, go ahead and turn VSYNC off to get that 240 FPS and call it 'competitive settings', like that's in any way going to make it true. He's a mong who doesn't even have the brain capacity to comprehend just how stupid he actually is. He disgusts me.
@cmeier360
@cmeier360 3 months ago
There are many factors for why your CPU is running hotter: it could be the games you are playing, or a new GPU creating extra heat in your case, dead fans, worn-out thermal paste, the Windows scheduler, older/bad drivers, or failing CPU fans. You might want to check your voltage settings; if you updated your BIOS, some settings can change. Perhaps clean out your case of any dust that might have built up. I have an NH-U12A which has been stable for 3+ years now. You could always reach out to Noctua; they have great support and might be willing to replace the cooler.
@demonpandaz8246
@demonpandaz8246 3 months ago
LTT and Nexus DON'T do this (Nexus called out LTT, and then LTT corrected themselves). If you aren't getting the same performance, that is a you problem. Try finding the programs that are causing your problems.
@peik_haikyuu2265
@peik_haikyuu2265 3 months ago
@cmeier360 and on top of all of that, AMD has pushed out constant chipset updates, so more than likely it's now providing more power to the CPU for better performance than it was getting 3 years ago.
@NathanaelHunter
@NathanaelHunter 3 months ago
This. Literally this. Showcasing and benchmarking are different. If you aren't going to do what is considered proper benchmarking, then you should not call it a benchmark. I'm not a benchmarking kind of guy. I don't care much for tossing out numbers and hardware specs when I can more easily just show what hardware is actually capable of doing. LTT and Nexus are generally trustworthy people for laying out the fine details of what hardware can do. I do not think that the way Matt Miller was showing the RTX 2080 is anything CLOSE to what is expected of a benchmark, and that was the main thing I was trying to get across, though clearly the small details seem to have become the focus rather than the severe issues with the way Matt Miller was handling the demonstration.
@leechair6256
@leechair6256 3 months ago
Someone give this guy a 7800X3D as a baseline gaming CPU to use in GPU benchmarks.
@Skitzotech
@Skitzotech 3 months ago
You have 1 good point and I'll give you that... He could have tested max graphics alongside lowest graphics... With that said...
1. 60 fps should never be your "it's a good card" decider. I get motion sickness anywhere under 80 fps, and I start losing accuracy in my tracking and flicking under 100.
2. I have an 8-gig 3070, and with Chrome open in the background I can't run BF2042... It hits 7.8 gigs and starts stuttering itself into oblivion.
3. Textures and anisotropic filtering have almost zero effect on FPS once you have enough VRAM to fit said textures. If you were this mad about antialiasing I'd totally understand, but not anisotropic.
4. VSYNC reduces your FPS and adds substantial input delay. (Also, all settings that add things to the image AFTER the image is made each add their own amount of input delay. Examples of these settings: post-processing, antialiasing, effects quality, ambient occlusion, VSYNC, etc. Turn these off and you will have a snappier, more connected feel.)
5. RAM... RAM brand does not affect its performance, only its looks, quality, and longevity. What dictates RAM's performance is speed, latency, DIMM configuration, and channels.
@itsNightShine
@itsNightShine 3 months ago
Everything you said was dang right!
@NathanaelHunter
@NathanaelHunter 3 months ago
Alright, let's go through this in order:
1. 4K 60 is what consoles are shooting for. This has been a debate for years now, and most people say either 120 or 60. I believe in 60 fps and 2K resolution. The average untrained eye can't really tell 4K 60 from 2K 60, let alone get real value from 120FPS. You personally getting sick viewing 60fps is a personal issue, not an issue with standards.
2. I usually have around 30-50 tabs of Chrome open at a time. I admittedly am usually hitting 40% of my 32GB of RAM, but I can (though I rarely do) play Battlefield 2042 at the same time. I think that's a deeper issue to look into with your RAM, as Chrome isn't really doing anything to your VRAM. Sounds like your bottleneck is there.
3. 16X requires a massive amount of VRAM to handle. This is the same deal with settings like texture resolution and other VFX-heavy settings. Him dropping the normal settings but cranking the filtering all the way to 16X is, without question, lying about his settings being at low. The 2080 can easily handle mid/high settings on all the games he showed with 4X filtering, if not 8X. There was no reason he should be dropping the video settings that low and cranking the filtering.
4. It CAN add input delay. The amount of delay isn't worth the screen tearing he's got going on. I would think it's impossible to argue that somebody should play with screen tearing rather than a very small handful of milliseconds of input delay. I've been playing twitch shooters for years and I have no clue how anyone could argue that turning VSYNC off is a good idea if it's causing screen tearing. It isn't.
5. Note that when I said he didn't tell us the RAM in his build, he didn't say ANYTHING about it, only that it has XMP and the speed. If I said brand, it's because it was 1AM when I filmed this, but the statement isn't any less valid. He didn't say how much he had in there. That could have been 2 sticks of 4GB RAM for all we know.
@itsNightShine
@itsNightShine 3 months ago
@NathanaelHunter
1. Consoles are not a good standard; they cost $500 for a reason. If Sony and Microsoft could, don't you think they would offer 8K at 240fps natively? Most of my friends can easily tell the difference between 1080p, 1440p, and 4K, as well as between 30fps, 60fps, 120fps, and 240fps. They aren't superhuman, but in my experience the people around me can differentiate. However, that's my reality, and it can vary from person to person.
2. I'm not judging, but I don't understand how you can be productive with so many tabs open. Chrome can use a lot of VRAM depending on what you are doing.
3. Anisotropic filtering does not use much VRAM at all; texture quality does. I invite you to read the wiki on how anisotropic filtering works. If 100% of the screen were at an oblique angle, then I could understand the VRAM and bandwidth impact. Keep in mind that filtering only happens when needed, and most textures do not need 16x constantly. So even if you set it to 16x, only the few portions of the texture that are at an oblique angle will use 16x filtering.
4. It absolutely does add input delay; you are correct about that. I personally agree with you that it's smart to use G-Sync, FreeSync, and other sync technologies to negate screen tearing. I never noticed the difference in latency, but some people swear by it, and I can understand that they might be more sensitive to latency and less to tearing.
5. A DDR5 module has a minimum of 8GB per stick. He had two, which means a minimum of 16GB.
@NathanaelHunter
@NathanaelHunter 3 months ago
1. If you want to use PC as the standard, then I have even worse news for you: per the Steam Hardware Survey, the vast majority of PCs are about on par with consoles. Just because better hardware exists does not mean it is the standard, and it is not what game companies target. Just because you don't like it doesn't mean it is not the truth.
2. Chrome uses regular RAM; it doesn't really tap into VRAM. That's why I have 32GB, and should I ever find I have too many tabs open, I can close them or simply get more RAM and go to 64GB or, heaven forbid, 128GB. I use my computer for a lot of things other than gaming, and many of them, such as going through technical documentation on things like the Cisco switches I work with, mean that I can't simply close tabs all the time. I have a very non-standard use for my computer, and because of that I'm well aware of what kind of impact dozens of Chrome tabs can have on performance.
3. All textures are impacted by AF at all times when it's enabled, because there will always be a texture at an angle to the player's perspective. The impact of 16X AF on performance is most notable with higher graphics settings, most notably the textures. Higher-resolution textures need more power to handle 16X AF than lower-resolution textures at 16X AF. Anyone who thinks AF has basically no impact likely doesn't notice it because they have hardware that can handle the frankly insane levels of detail modern western AAA games have. They do look pretty, but the hardware needed to run things like 16X AF, raytracing, and ultra settings while also getting stable frame rates is pointlessly expensive, and for almost anyone it's not even worth it.
4. For most people, that input delay is a non-issue. They likely are not winning or losing gunfights in shooters because of a few milliseconds of delay. Screen tearing absolutely can impact an engagement's outcome. Yes, very bad input delay can decide a fight, but the delay from VSYNC being on is only going to be noticeable if the screen and FPS locking are wrong. If VSYNC is set up correctly, the delay is going to be so small as to not be a genuine factor. It's added to games because screen tearing is THAT bad of an issue. I do not, nor will I ever, say that screen tearing is acceptable over lower, more stable frame rates. You only even get screen tearing if your monitor isn't able to handle that high of a frame rate anyway, so most people would be better off with VSYNC on anyway.
5. I hate that I need to keep saying this, but while yes, he was using DDR5, his video claims to be a benchmark (it isn't; it's a showcase by every possible metric and definition), and as such my point in bringing up things like 'it could be DDR3' was to highlight that specific, notable lack of an outright statement of what he was using. If this is to be treated like a benchmark, as he wants it to be, then giving only the speed of the RAM and no explicit statement of what was in there is unacceptable. That said, 16GB of RAM is rather low for a benchmarking computer; that's a concerning choice if that was how much he had. When doing a GPU benchmark, you need to far surpass the power of that GPU in all other hardware, to make explicitly sure no other component is impacting the GPU's performance. 16GB is considered just about the minimum for modern gaming, and I would hope he isn't using only 16GB in what he claims to be a benchmark.
That is also why I was so harsh on the choice of CPU, even if I ended up being wrong about the i3 still being sold (though with the way Windows 11 isn't even allowing i3s as an option, and many other factors that led to my incorrect thought about them, they will likely cease to be available for sale soon).
@benthehuman8503
@benthehuman8503 3 months ago
Word of advice: if you're going to make an argument against someone's opinion, don't start by attacking their appearance. You lost any little credibility you had in the first 50 seconds.
@NathanaelHunter
@NathanaelHunter 3 months ago
It's a good thing you're not in any position to validate me then, because I'm not interested in looking good in the eyes of anyone. I say what I mean and I mean what I say; if I insult somebody, you can damn well bet I felt pretty fucking strongly. I don't give a rat's fucking plague-riddled ass about whether you or anyone else wants to take me seriously. Insulting him doesn't invalidate the rest of the things I say.
@benthehuman8503
@benthehuman8503 3 months ago
@NathanaelHunter no one takes you seriously. You're openly being mocked by everyone in the comments, including the guy you tried to insult.
@ZA_Scorch
@ZA_Scorch 3 months ago
I would much rather stick to buying a new GPU. I don't want someone else's worn-down card. Besides, in some countries, like South Africa, you can only find new 30-series and 6000-series cards.
@robotsix6268
@robotsix6268 3 months ago
DDR3 with a modern CPU? Bro, you've lost it. Bear might be wrong, but you don't even know a thing about what you're talking about.
@NathanaelHunter
@NathanaelHunter 3 months ago
You are either intentionally ignoring or just didn't understand the meaning behind me saying DDR3 there. He made an explicit choice to name all the other hardware down to the model and NOT the RAM. He only said XMP and the speed. While yes, it is obviously DDR5 RAM and not DDR3, the point of saying 'it could be DDR3 for all we know' was to highlight the fact that he did not want to say what RAM he had in there. People have chosen to take my words for what I literally said rather than their deeper context. I was showing the deceptive nature of what he was doing and saying, not explicitly claiming it was DDR3, though I can understand if this was not particularly easy for people to grasp. I did rather swiftly move on after saying it.
@DanielCardei
@DanielCardei 3 months ago
2080 Ti though... 🤔 That 11GB of VRAM is still flippin' a lot of cards. Do remember you will barely have the chance to fill up 11GB of VRAM. The only game that I managed to see use that much at 4K at max details was The Last of Us Part I, at 9.9GB of VRAM. But the majority of them use 7-8GB of VRAM. If you max out the VRAM on an 8GB RTX 20-series card, you can activate DLSS and you are good.
@NathanaelHunter
@NathanaelHunter 3 months ago
People genuinely don't believe older cards are worth buying, and you hit the nail on the head: it takes a LOT to actually max out some of these GPUs, and many modern titles aren't even doing it. With things like DLSS available, older cards can still have a lot of life in them, if only people weren't so quick to make sweeping, inaccurate claims about the real-world performance they can reach.
@JackPCBuilds
@JackPCBuilds 3 months ago
Before any of this, I want to say that I am not claiming the RTX 2080 is some unusable piece of garbage because it is not the latest generation. In fact, I quite like the 2080 (so long as it's the right price, obviously). However, I just want to give you some honest criticism. You're calling out another creator for lying, spreading false information, and ruining PC gaming; those are strong statements. You're also claiming his benchmarks are flawed. All of this would be okay, and actually positive for the community, if you had a means of backing it up. But that's where the concern arises. It's perfectly fine not to be a full-time hardware nerd, but what I am saying is: if you're going to make a video calling out another creator for misinformation, then you'd better bring accurate information to support that claim and prove that what the other creator is doing/saying is actually misinformation. Does that make sense? What I am saying is, if you are going to do a full 50-minute video criticizing another youtuber, then your own information had better be accurate. I'm not saying the 2080 is bad and anything less than the RTX 4090 is unusable. But only 8 minutes into the video it's already apparent that you're critically uninformed when it comes to CPUs. You claimed that the 12600K isn't good enough to benchmark the 2080 with, and that using it is going to heavily skew the benchmarks. But you are using a Ryzen 7 5800 yourself, which performs on par with the i5-12600K. You mentioned that you used to run the 2080 with an older i7 but upgraded to this Ryzen 7. Then you said that the i5-12600K is not good enough to benchmark the 2080. Except it is... Everything you said there contradicts itself. Then you said they don't make i3s anymore, which is just false. That right there calls into question the credibility of your entire video for anyone watching. Do you see what I am saying? This is not hate, just honest criticism. It's okay not to spend a ton of time researching this stuff, because for most people it's just not interesting. When it's not okay is when you're making a video criticizing someone else for spreading misinformation. That's when you need to do the research first and ensure that your own information is actually accurate. Like I mentioned earlier, making a video calling out another creator comes with high standards by the nature of things, because videos like this have the potential to do serious reputation damage. I would have no problem with this video if all you said was that you believe his video misrepresents the RTX 2080, and that the reasons you think so are X, Y, and Z, backed up with accurate information that shows you really put time into researching this and really did it for the good of the community. But some of your responses to comments have just been immature, and honest advice, man: I'd advise you to drop the ego and stop insulting everyone in the comments for correcting you on things like what you said about the CPU. Especially when it comes to things like this, insulting people when they disagree with you is just not the right way to go about things. Even if people insult you first, returning it is still a bad move, because if your information were accurate you wouldn't need to resort to that to "win" the comment war. And honestly, we're all just wasting our time arguing about this. To the tech nerds like myself, this video is going to (and did) come across badly simply because of inaccurate information.
And to the less-informed viewers, this video is going to come across badly because of the way you're going about things. I'd really just advise that you at the very least go about this a different way. I hope you can take this to heart. Have a great day.
@NathanaelHunter
@NathanaelHunter 3 months ago
This is gonna be another long comment... You and a lot of people seem to have chosen to take my words as literally what I said rather than the meaning behind them. For a start, you will note that I lumped all i-series CPUs into the number and nature of the line rather than naming anything more specific. This was to state, rather generally, that in each release the i5 will always be weaker than an i7, and that will be weaker than an i9. This is how Intel does things. While I was wrong about the existence of the i3 and don't have time to restate all that I said to Matt when I explained what led to my incorrect assumption that Intel had actually stopped selling the i3 line, suffice it to say: the i5, as a generalization, is objectively weaker than the i7 and i9. I was not interested, and am still not interested, in hashing out petty debates over whether Y-model i5 is better than X-model i7. That's a non-issue and has no bearing on the reason I brought attention to his CPU choice. Picking an i5 for a benchmark is already a very concerning choice, as it goes against standard practice. You can't just pick whatever part for no reason; there are specific reasons to go with specific parts in a machine that's to be used for benchmarking, and he picked that specific model of i5 and said he was going to use it ALL YEAR. That's not a normal choice and it's not in line with what is acceptable for benchmarking. Generalization is acceptable for making a point: you can lump the i5 in as worse than the i7 and i9 and say that it's not normal to pick that CPU for a benchmarking PC. It's not wrong, it's not inaccurate, and it's not even really rational to say it's a problem or to bring up specific models in the conversation here when I didn't do that myself. I said it was a bad choice for benchmarking the 2080 and I stand by that; it's not good practice and it's a shameful decision.
Anyways, skipping over all the other things you noted, as I've addressed those points over and over many times now: I will always say that people who have a financial incentive to mislead, doing anything to mislead, are disgusting, and I will always call that out when I see it. I didn't say anything about it until the end and I didn't explain in any detail, but PC flippers disgust me most of all. They have a direct financial incentive to be as misleading and underhanded as possible to maximize profit. The reality is that PC hardware is a razor-thin margin for profits, and it is NOT a market that the average person can thrive in without doing very shady and frankly manipulative things. Flipping as a concept is not inherently bad, but the way it almost always ends up going IS bad. Used car dealerships are a perfect example. It's in their best interest to put as little money in and do as little work as possible and mark the price up as high as they can, almost exclusively to the customer's loss. There is a LOT that can be done to scam PC buyers, and things like saying higher FPS is better are an easy way to con less-than-intelligent buyers. I have no respect for people in the PC flipping market, due to the way it almost always goes, and I have no issue in saying they are potentially all nothing more than scammers, because it can very often be found that they engage in the same tactics as scammers.
Moving on from the flipping aspect of why I hate misleading people: Matt is trying to pass off this showcase as a benchmark, and he rather intentionally made the choice to go with strange hardware selections and put all the games at their lowest settings despite the incredibly deceptive nature of those choices. I have nothing to gain from saying what he was doing is disgraceful, nothing to gain at all; calling it out was done in an act of rage and disgust, and I stand firmly behind what I said and how I chose to say it, as I always do. I do nothing and say nothing that I don't stand behind; people who don't say what they believe and stand firm in their convictions disgust me just as much as people who aim to deceive. I admit when I say things that are wrong, as I did with the i3 many times, both here and in his comment section. I will not say that I was wrong to say what he was doing was disgraceful and deceptive. It was, without a doubt, deceptive, and it potentially had real impact on people who were looking for information on the RTX 2080. His showcase being treated like a method of displaying unbiased information was shameful and disgusting. I was enraged to see such behavior, especially when I was looking for an unbiased showcase of the RTX 2080 for just that reason, and it's something that I could not allow to stand.
The way I act in my comment section is how I act in all places at all times: without any filter and without any desire to protect people from harsh statements and insults. I hold one thing above all others as my rule for how people should and can act around me, and that is to hold nothing back, to speak your mind, and to do so without hesitation. If what you think and feel is that somebody is an arrogant, retarded fuck-up, you should say it. It goes back to what I said: say what you mean and stand behind it. I live by the age-old saying of treat others how you wish to be treated. You can damned well expect that I would be more insulted by somebody being polite while feeling otherwise than by them saying how they really feel. I believe strongly in unfiltered, raw conversation. If people feel they need to walk on eggshells in a conversation, they will not speak nearly as often or with the passion and emotion that is needed. I don't blame you for thinking it's boorish behaviour and that it could be off-putting to people. It's just how I've lived my life and how my experiences guide my actions. I don't hold others to that standard in their comment sections; if they want theirs to be a place of non-aggressive conversation, then I have no issue with them having it be that way. But in my videos and in my comment section, I don't want people to feel like they have limitations. They are held to the same standards I hold; if I were to chain them to standards I don't hold myself, I would be a hypocritical bastard, and that is not the kind of man I am. So while I can agree that my way of speaking, and my decision to not hold back and go directly for what is objectively a conflict-seeking conversation style, may come off as seeking a 'win', as you say, just know that it's not about wanting a 'win' in any way. I simply speak as I believe, in a manner that holds nothing back.
@FilthyWeeb69
@FilthyWeeb69 3 months ago
He is so obviously trolling. "Can't be held accountable for words taken literally." If you didn't want to be taken literally, why did you make a 52-minute video stating that he is skewing results, and why are you commenting on people's comments? Words you say and type are meant to be taken literally; you're just trying to use this as a cop-out. The 12600k is plenty fast enough to fully saturate the RTX 2080 at 1440p, 4K, and 1080p (1080p being more CPU-bound, obviously). "Picking an i5 for a Benchmark is already a very concerning choice as it goes against standard practice" - this makes absolutely no sense. The reason it makes zero sense is that the CPU is not the limiting factor; the RTX 2080 is. The 12600k can handle a 2080 no problem. He could have paired it with the newest CPUs and they would still be bottlenecked by that card at every resolution. He picked that part because it made sense; clearly you lack that.
@peik_haikyuu2265
@peik_haikyuu2265 3 months ago
bro, your 2080 is not a 4K GPU, for one. And for two, an i5 12600k is a 10-core, 16-thread CPU that can get 20k points in Cinebench. The i7 9700k, which came out the same year as the 2080, scored around 9k points in the same benchmark. The i9 9900k scored around 12k in the same benchmark. Those CPUs are slow and are a bottleneck in any modern title with any modern GPU. A modern-day i5 will double the performance of those CPUs and use much less power while doing it. You can pair a 4070 Ti with an i5 12600k and not be bottlenecked. And where are you hearing that you can't buy an i3? Did you do ANY research before trying to shit on another creator? You can buy an i3 14100f for about $100 and outperform a 9900k, which is still selling for around $400 and has double the cores.
@peik_haikyuu2265
@peik_haikyuu2265 3 months ago
and you obviously don't know much about PCs, but XMP being disabled can make you lose up to 10% of your performance. And once again, an i5 12600k is capable of running a 4070 Ti perfectly fine, so nothing about this is rigged; the 2080 in his system is the slowest part by far and is a bottleneck to the rest of that system. How can his system be rigged when the GPU is literally the slowest part in it? At 16:04 you can literally see that his CPU is only at 50% usage while the GPU is maxed out, and then you immediately say that he gimped the CPU. Are you, like, trolling or something? And you say that "the everyman isn't going to own a 200fps-capable monitor" when you can buy a 1080p 240Hz 1ms curved monitor brand new off Amazon for $130; that is quite literally entry-level pricing. You are stuck in the past. Tech has upgraded so much in the last few years that old hardware is pretty much null and is insanely cheap. When you can buy a budget CPU that doubles the FPS of the highest-end CPU from 5 years ago, that hardware is no longer useful.
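The XMP point has a simple arithmetic backbone: peak DDR bandwidth is transfer rate times 8 bytes per transfer times the number of channels, so running at a JEDEC base clock instead of the XMP profile directly cuts bandwidth. A rough illustration (the 4800 MT/s fallback is an assumed JEDEC base for a DDR5-5600 kit; actual defaults vary by kit and board):

```python
def peak_bandwidth_gb_s(mt_per_s, channels=2, bytes_per_transfer=8):
    """Theoretical peak DDR bandwidth in GB/s."""
    return mt_per_s * 1e6 * bytes_per_transfer * channels / 1e9

xmp = peak_bandwidth_gb_s(5600)   # XMP profile from the video: DDR5-5600
base = peak_bandwidth_gb_s(4800)  # assumed JEDEC fallback with XMP off
print(f"XMP on:  {xmp:.1f} GB/s")
print(f"XMP off: {base:.1f} GB/s ({(xmp - base) / xmp:.0%} less bandwidth)")
```

That's bandwidth, not FPS; how much of it shows up as frame rate depends on how memory-bound the game is, which is why "up to 10%" is a ballpark rather than a fixed number.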
@iitzfizz
@iitzfizz 3 months ago
"can't buy worse than an i5" what is this guy smoking
@peik_haikyuu2265
@peik_haikyuu2265 3 months ago
@iitzfizz exactly lol, he's trying to say the 12600k is why his 2080 isn't a good card for 4K🤣 when that CPU can handle a GPU 4x as powerful as a 2080
@iitzfizz
@iitzfizz 3 months ago
@peik_haikyuu2265 I thought it was legit, but when he started talking about CPUs, deffo seems like a troll, or he's a lost cause. Also 50TB of storage, wtf 😂😂
@peik_haikyuu2265
@peik_haikyuu2265 3 months ago
@iitzfizz yeah, 50TB of HDDs flopping around in his case🤣 I just can't believe he thinks a 12600k is holding back a 2080😂
@TheOne214
@TheOne214 1 month ago
The RTX 2080 can mostly handle native 4K 60 fps in older AAA titles, but not in the latest AAA titles from 2021 onwards. It can do that at lower graphical presets, just not at ultra, high, or even medium settings. As for raytracing, that will just not happen in the latest titles: even on low RT settings the RTX 2080 won't be anywhere close to 30 fps at 4K. In slightly older AAA titles like Shadow of the Tomb Raider from 2018, you get late 40s to a little above 50 fps on the Very High preset at native 4K with raytraced shadows disabled on the RTX 2080. So I guess it depends on what kind of game you are playing and how well optimized it is. I myself play AAA titles at a locked 60 fps, because I can't tell the difference between 60 and 120 fps, since I play these games with a controller. Only in competitive FPS games like COD, Battlefield, etc. do you need that higher refresh rate, especially if you are playing with a keyboard and mouse; then it does matter. I have an RTX 4080 Super, an i7 14700k, and an LG UltraGear 27-inch 4K 144Hz G-Sync panel, and I play AAA third-person and story-driven titles at 4K 60 fps; it's a buttery smooth experience with a controller and even with a mouse and keyboard.
@MMillerPCs
@MMillerPCs 3 months ago
Aww, just in time for my birthday 🎉 🧡
@JustinDorff
@JustinDorff 3 months ago
Lol happy birthday Matt!
@iitzfizz
@iitzfizz 3 months ago
I want an i3 for my bday, but apparently they don't make them anymore
@itsNightShine
@itsNightShine 3 months ago
Happy birthday!
@MMillerPCs
@MMillerPCs 3 months ago
@itsNightShine thank you 🧡
@itsNightShine
@itsNightShine 3 months ago
@MMillerPCs watched some of your videos, keep up the work, I enjoyed them! Don't listen to this guy
@myamaha62
@myamaha62 3 months ago
The 12600k is faster than what the 2080 can handle. i5s aren't what they used to be. The i3 12100 exists. You can't be a tech tuber if you don't know this. A 13th-gen i3 outperforms a 9th-generation i7 or i9. IPC improvements are just as important as core count.
@NathanaelHunter
@NathanaelHunter 3 months ago
1. I'm not a tech youtuber and I never claimed to be.
2. I didn't bother adding anything about it here, but when I said they don't make the i3 anymore, there were several things in my head at the time about the i series that meshed into 'the i3 isn't for sale anymore'. It wasn't true, but given that Windows 11 not supporting i3 CPUs is a thing, I would start to believe that the i3 is likely going to be phased out. That, and Intel outright said the i-series naming is going away, so in a way the i3 dies with that anyway.
3. I was clearly generalizing every i-series CPU into its section. That may not be the tech-tuber thing to do, but it is the 1am, borderline-asleep, let-me-slam-this-video-out-with-layman-terms-and-generally-correct-statements thing to do. So while yes, I am more than happy to say that the i5 is not objectively worse than the i7 in all cases, if you consider the most up-to-date hardware as the metric, saying the i5 is worse than the i7 and the i7 is worse than the i9 in a very general way is entirely true. It's not per-spec-and-model true, but it is generally a factual statement.
@TriPBOOMER
@TriPBOOMER 3 months ago
@NathanaelHunter WIN 11 is supported on the 10100, 11100, 12100, 13100, and 14100; that's 5 generations (4 years!) of "i3" processors, all officially WIN 11 ready, so again you're wrong! And I think you need to learn about PC hardware before complaining about someone else's selections.
@JessicaFEREM
@JessicaFEREM 3 months ago
My i7-6700 in real-world gaming can still just about keep up with a 3050, so I mean, you don't really need a massive CPU compared to the GPU unless you're streaming or multitasking. But if you're testing the GPU, I get that you want to make the GPU the point of the bottleneck, not the CPU.
@niksi01
@niksi01 3 months ago
@NathanaelHunter What are you on? Intel is making i3s; you have 14th-gen i3s and you can buy them. Please educate yourself before posting stuff like this.
@s0-s08
@s0-s08 3 months ago
A 2080 Ti is still worth getting imo if you can get one for around $170-200; anything over that and you can get a newer card on the second-hand market that will perform better. But you do have to know they did have memory problems, with some vendors running the Hynix memory chips, so you do have to watch out for that.
@NathanaelHunter
@NathanaelHunter 3 months ago
I say it rather often when talking about hardware, but if the price is good and you don't have an explicit reason to buy a more powerful piece of hardware, buying something weaker at a good price is always a good idea. If people would just buy what they need rather than what's newest, and take strong consideration of what they want to get out of their hardware, they would probably save a lot of money and barely notice the difference.
@Ligby
@Ligby 3 months ago
Just picked up a 3080 for $250. I really didn't even need to upgrade from my 1070, but I couldn't pass that deal up.
@NathanaelHunter
@NathanaelHunter 3 months ago
This is how PC gaming used to be and IMO should still be: buying upgrades when the price is right and there is a sizable increase in performance. Going from a 2080 to a 30- or 40-series card doesn't provide a worthwhile performance jump for the price people would pay. Going from a 1070 to being able to take advantage of things like DLSS and raytracing for a good price is great. If only more people would think like you did and buy what makes sense, rather than spend time saying anything worse than the latest tech is bad and shouldn't be bought.
@cmeier360
@cmeier360 3 months ago
i5s can be really solid; look at the i5-13600k, that thing is a beast for the price. However, the i5-12600k is way worse. Even though it's only one gen behind, it's massively different in performance for the price. No one should buy a 12600k in 2024, since it uses the same motherboard as the 13600k, so you won't be saving much by buying it instead (maybe $30 at the max). I understand he wanted to pair the 2080 with something you might realistically pair it with, but this is just a sick joke; who tf is watching that guy for advice. I would also like to note that not all i5s, i7s, and i9s are the same; for example, an i7-11700k is worse than an i5-13600k. The newest Intel is 14th gen, which isn't a huge improvement over 13th gen. I would argue an i9-9900k is better than an i5-12600k but worse than an i5-13600k. 12th-gen Intel came out in 2021, so using it over 13th gen, which is about $10-20 more while they use the exact same motherboards, in 2024, while saying "it's decent", is completely evil.
@itsNightShine 3 months ago
No, an i9-9900K is not better than or equal to an i5-12600K. The 12600K is at minimum 10% faster.
@peik_haikyuu2265 3 months ago
@@itsNightShine The 9900K has a score of 13k in Cinebench R23; a 12600K gets about 18k in the same test. That makes the 12600K roughly 38% faster (or, against the other baseline, the 9900K scores about 28% lower).
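For reference, the percentage math behind those scores is easy to check. A minimal Python sketch using the commenter's figures (13k and 18k, not independently verified); it also shows why different percentages float around this thread: the answer depends on which chip you treat as the baseline.

```python
# Percentage difference between two benchmark scores, relative to a baseline.
# The R23 scores below are the commenter's claims, used purely for illustration.

def percent_faster(score_a: float, score_b: float) -> float:
    """How much faster score_a is than score_b, as a percentage of score_b."""
    return (score_a - score_b) / score_b * 100

r23_9900k = 13_000   # commenter's figure for the i9-9900K
r23_12600k = 18_000  # commenter's figure for the i5-12600K

print(f"12600K vs 9900K: +{percent_faster(r23_12600k, r23_9900k):.0f}%")  # ~+38%
print(f"9900K vs 12600K: {percent_faster(r23_9900k, r23_12600k):.0f}%")   # ~-28%
```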
@peik_haikyuu2265 3 months ago
A 12600KF can be found for $150 brand new on Amazon, when a 13600K is almost double that. The 12600KF scores 20k in R23 with just an undervolt, and people are overclocking the 13600K and only getting 26k. So is double the price worth about 30% more performance?
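A rough score-per-dollar comparison built on the prices and scores claimed in that comment; both are the commenter's figures and will drift with the market, so treat this as an illustration of the method rather than current pricing.

```python
# Value comparison: Cinebench R23 points per dollar, using the commenter's
# claimed prices and (tuned) scores. All numbers are assumptions for illustration.

chips = {
    "i5-12600KF": {"price_usd": 150, "r23_score": 20_000},  # undervolted, per the comment
    "i5-13600K":  {"price_usd": 300, "r23_score": 26_000},  # overclocked, per the comment
}

for name, chip in chips.items():
    print(f"{name}: {chip['r23_score'] / chip['price_usd']:.0f} R23 points per dollar")
# i5-12600KF: ~133 points/$; i5-13600K: ~87 points/$
# Doubling the price buys roughly 30% more score in this particular comparison.
```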
@itsNightShine 3 months ago
@@peik_haikyuu2265 Yeah, I just checked a quick bench and the first title was Overwatch; that's where I saw the 10%. That's why I put "minimum 10%", because I was sure it was much faster than that. Thanks for the info!
@itsNightShine 3 months ago
@@peik_haikyuu2265 I don't really expect most people to undervolt, though. I do it because I enjoy the process, but none of my friends do it.
@demonpandaz8246 3 months ago
This has to be the most atrocious thumbnail I have ever seen. I'm not good at them either, but you can easily do better than just a headshot. Best of luck to you on your journey, tho.
@TriPBOOMER 3 months ago
I agree with not hating on the 2080... 👀😏😎😂... But don't be hating on the 12600K (10-core i5) like it's a bad chip. I wouldn't recommend it as a test bench, but you are also very wrong about it and misleading people about this chip. As for the "lowest" available recent-ish Intel CPUs, those are the 12100/13100/14100 (+F) "i3"s, all readily available, plus the 12400 (+T/F, 6-core i5) and 12500 (+T/F, 6-core i5), all currently available for that socket and all lesser CPUs than the 10-core 12600K. Also, the 12600K "i5" is stronger than an 11900K "i9". So I agree that as a test bench for 2024 the 12600K is not a good choice, but to push an RTX 2080 to its limits? Yes, this CPU, being stronger than the 11900K, is more than enough for this GPU.
@itsNightShine 3 months ago
I don’t see how you agree with him on many points… this whole video is almost completely wrong
@TriPBOOMER 3 months ago
@@itsNightShine OK... this is true, but firstly, I just didn't want to put any hate on the 2080; it's still a viable card. And also, I started typing that as soon as he made his outrageous comment about the 12600K, so I was obviously only a couple of minutes in, then half switched off after that. My bad. 🤣 The boy obviously has no idea about Intel CPUs, dismissing it as just some "i5", even saying he should get a previous-gen "i7" instead... 😂😂 This is obviously dumb, as the 12600K outperforms all previous-gen "i" CPUs including the 11900K, with the possible exception of the 10900K. Also, all the benchmarks shown above have the GPU at 100%, so why would he think the 12600K is bottlenecking the 2080 at 4K if it can max it out at 1080p?! Does he not know 4K is more GPU-bound than CPU-bound? The 12600K can max out a 3080 Ti at 4K and 1440p, and only shows a 10-15% bottleneck with a 3080 Ti 12GB at 1080p!... but it's going to struggle at 4K on a 2080 8GB somehow?!... 🤣🤣🤣 That being said, the 12600K still shouldn't be a test-bench CPU in 2024 if it bottlenecks a 3080 Ti at 1080p, as basically anything over a 4070S at 1080p will be bottlenecked by this CPU. But it's plenty for an RTX 2080 benchmark, and there's plenty of room left in that CPU socket as well, so the platform is OK.
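The bottleneck argument running through this thread reduces to a simple model: the delivered frame rate is capped by whichever of the CPU or GPU is slower at a given resolution, and raising the resolution mostly loads the GPU. A toy sketch with made-up numbers, purely to illustrate the shape of the claim:

```python
# Toy CPU/GPU bottleneck model. The frame rate you see is the lower of the two
# component ceilings. All figures below are invented for illustration only.

def effective_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """The system delivers whichever cap is lower; that component is the bottleneck."""
    return min(cpu_fps_cap, gpu_fps_cap)

cpu_cap = 160  # hypothetical CPU frame-rate ceiling, roughly resolution-independent
gpu_caps = {"1080p": 144, "1440p": 95, "4K": 48}  # hypothetical GPU ceilings per resolution

for res, gpu_cap in gpu_caps.items():
    limiter = "GPU" if gpu_cap < cpu_cap else "CPU"
    print(f"{res}: {effective_fps(cpu_cap, gpu_cap):.0f} FPS ({limiter}-bound)")
# Raising the resolution loads the GPU, not the CPU, so a CPU that keeps up
# at 1080p does not suddenly become the bottleneck at 4K.
```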
@peik_haikyuu2265 3 months ago
@@TriPBOOMER He probably had an i7-9700K, which came out the same year as the 2080; it only gets a score of 9000 in Cinebench R23, while the 12600K gets 18000 with a good cooler. He's just under the assumption that any and all i7s have to be better than an i5.
@NathanaelHunter 3 months ago
I hate that this was understood that way. I meant it as in it isn't the top of the line of what Intel sells. Matt Miller was treating this as if it were a Benchmark when it's not; it's a showcase. He didn't follow what is generally considered standard practice and even explicitly commented that prioritizing high FPS over graphics is a personal choice. Personal taste has no place in showing the capabilities of a piece of hardware, i.e., a Benchmark. So yes, I generalized very heavily that an i5 is weak, because compared to the other modern Intel CPUs it is weak. People are very mad that I condensed all the CPUs like this, and while they are not wrong in saying that the i5 he picked isn't garbage, the sentiment that it's very far from the top is not incorrect.

I also don't understand how proper benchmark procedure not being followed could be a non-issue. Sure, the i5 he did the video with is solid enough. That does NOT mean it would be considered, by any means, a good choice for a BENCHMARK. This is why I have been so up in arms about the video. It's a SHOWCASE, not a Benchmark. Matt Miller could easily have done what other groups do: make sure the GPU is in no way impacted by bottlenecks elsewhere in the system and raise the graphics to a reasonable level. I have a suspicion that if he had pushed the RTX 2080 the way a proper Benchmark would, the CPU would have started taking on more load, as would the RAM, and that could have impacted the results.

I am not a hardware benchmark expert. I don't bother with benchmarking tools. I go off my actual experience and what I have first-hand seen hardware do. That is why, when people say "Benchmark", I hold them to a high standard. If you are potentially influencing consumers' purchases and the information you put out is bad, that's not cool. Matt Miller's 'benchmarking' of the 2080 is so far from what any other reputable group would have done that it simply HAD to be called out.

I'm rambly, I lump things together, and I don't bother with the very fine details. I'm a big-picture guy. The fine details are a right-time, right-place kind of deal, and a video saying that a showcase of the 2080 is shit is not the right place for getting down into specific CPU models rather than just lumping i7s and i9s together as more powerful than an i5. So, to be clear: yes, there is finer detail I did not go into. It isn't important that the i5 he is using is more powerful than some higher-numbered i-model CPUs, because the point of lumping i models together like I did was to generalize. Fine details do not make my point any less valid, no matter how much some of the people here want to claim otherwise.
@TriPBOOMER 3 months ago
@NathanaelHunter Regardless of how much you ramble on about it, that 12600K would NEVER, under any circumstances, bottleneck an RTX 2080. So I agree that as a standardized test bench for all hardware the 12600K is a bad choice, but to benchmark an RTX 2080 there is nothing wrong with it, as there is NOTHING the 2080 can do to outperform the 12600K. Also, you say it's far from the top? There are currently 8 Intel CPUs higher than this one: only 8! Two of them are newer 600Ks, so that's 6 non-i5 CPUs. And for gaming, the 12600K outperformed the 12700K, so... My point is that you don't have a clue what you're on about with CPUs, and again, for benchmarking just a 2080, the 12600K is plenty; it's more CPU than that GPU can handle, so it would never bottleneck a 2080 in any situation, ever. I think you should go learn about hardware and its capabilities before complaining about hardware that is more than enough for the job. The ONLY way it would cause a bottleneck is if somehow the RTX 2080 were more powerful than an RTX 3080 Ti, so unless the 2080 becomes at least 60% faster, stop complaining about a CPU that is being bottlenecked by a 2080! You talk about him lying etc., and you start your video lying that the 12600K is too weak and that the 2080 is stronger than it is. Again, there is NOTHING the 2080 can do to create a bottleneck on that 12600K. You are spitting bubbles, lad!
@7lool125 3 months ago
Me when someone trash-talks my specs: (no, but I totally agree with u)
@NathanaelHunter 3 months ago
A lot of people online shit-talk hardware. I have come to realize it's buyer's remorse. They bought the latest tech, and when the price-to-performance isn't super amazing, they feel the need to tell everyone with older hardware how much worse their stuff is. It's often not even a real argument. There is nothing wrong with older hardware, and the Steam Hardware Survey makes them look like idiots, because LOTS of people have chosen not to buy 40-series cards and are rocking older GPUs. I've gamed on bad hardware and super good hardware, and played on PC and console. I know what garbage graphics and frame rates are like, and I know what good graphics and frame rates are like. I think people deserve a voice saying that older hardware is good and, in some cases, better than people are claiming. The RTX 2080 is amazing for what it is, and people rush out to say it isn't because the numbers say blah blah blah. Game how you enjoy and ignore the clowns running around the internet saying your hardware isn't good enough. They usually don't know what they're saying anyway, because they don't have the hardware in question.
@jamesgriggs6432 3 months ago
8GB is low for a lot of modern games; I use up 15GB on average in most games. There are so many things wrong, and so much false information, in this video. I think you should do more research before giving out false information. There is so much wrong in this video that I'm not going to sit here and type it all out, because I'd write a comment that would take an hour to read. I really don't want to bash you, but you have no clue what you're talking about, and you say it a lot.
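For anyone who would rather measure VRAM usage than argue about it: on an NVIDIA card you can poll nvidia-smi (which ships with the driver) while a game runs. A minimal sketch, assuming nvidia-smi is on your PATH; figures like the 15GB quoted above depend entirely on the game and settings.

```python
# Poll nvidia-smi for VRAM in use while a game is running. Assumes an NVIDIA
# GPU with the driver installed and nvidia-smi available on PATH.
import subprocess
import time

def vram_used_mib() -> int:
    """Return VRAM currently in use (MiB) on the first GPU, via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])  # first GPU only

for _ in range(5):
    print(f"VRAM in use: {vram_used_mib()} MiB")
    time.sleep(2)  # sample every two seconds while the game runs
```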
@marz367 3 months ago
10:10 bro doesn't know i3s and Ryzen 3s exist
@NathanaelHunter 3 months ago
From a conversation I had explaining this on Matt's video: I had to go check, and I will admit I was wrong about the sale of the Core i3, and I figured out why I had thought it wasn't available anymore. For a start, Intel stated they were moving away from the 'i' naming scheme. There was an article I can no longer find that, as I recall, mentioned the i3 was no longer being sold as an option by some manufacturer (I think it was Dell or HP). There is also the recent information that some i3s are apparently not supported on Windows 11. I don't know the scope of that, but I now know that they are still around, and that a multitude of things led me to believe they were no longer available. Regardless, the i5 is Intel's lower end of modern CPUs, and even a powerful i5 is questionable next to the other available options, particularly given that the i9 has been available for years now and current i7s and i9s outperform all i5s. I was generalizing in my statement by lumping every i series in with just the most recent ones; I can see how this was unclear.
@kanavyre 3 months ago
2K is 1080p. 2.5K is 1440p. Even if his video is garbage, to call garbage out you have to get your facts correct yourself.
@FiveMissiles 3 months ago
Always thought that 2K was 1440p. Why would anyone call 1080p 2K? Genuinely??? Especially when it's been called 1080p for years and years.
@kanavyre 3 months ago
@@FiveMissiles The K system is the first number in the resolution (rounded): 3840x2160 is 4K, 2560x1440 is 2.5K, 1920x1080 is 2K. Of course, other weird resolutions can fit into 2K/2.5K/4K, but those are the most common ones. And people call it 1080p and not 2K, just like how people call it 1440p and not 2.5K; the only one people actually say is 4K. It's better to be specific and say 1440p, but if he's going to say 2K, he should get it right with 2.5K.
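A sketch of the rounding rule that comment describes: divide the horizontal pixel count by 1000 and round to the nearest half. This is one convention among several (DCI 2K, for instance, is 2048x1080), so treat the mapping as the commenter's scheme rather than a settled standard.

```python
# Map a horizontal pixel count to a "K" label by rounding width/1000 to the
# nearest 0.5. This implements the commenter's convention, not an official one.

def k_label(width_px: int) -> str:
    k = round(width_px / 1000 * 2) / 2  # round to the nearest half
    return f"{k:g}K"

for name, width in [("1080p", 1920), ("1440p", 2560), ("2160p", 3840)]:
    print(f"{name} ({width} px wide) -> {k_label(width)}")
# Under this convention: 1080p -> 2K, 1440p -> 2.5K, 2160p -> 4K
```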
@GODFADED 3 months ago
Nobody calls 1080p 2K, wtf? lol, 1440p is 2K.
@kanavyre 3 months ago
@@GODFADED You clearly didn't read my reply above explaining how it works. Calling 1440p 2K is factually incorrect. If your friend jumped off a bridge, would you jump too? Just because some people are wrong doesn't make it right.
@GODFADED 3 months ago
@@kanavyre And calling 1080p 2K is factually incorrect as well; your point doesn't hold up. Wtf are you yapping about? If 1440p is 2.5K, right, and it has 78% more pixels than 1080p, then how does 1080p add up to 2K when 1440p is 78% more pixel-dense than 1080p? Ur math ain't mathing.
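The 78% figure in that reply does check out; a few lines of Python confirm the pixel counts.

```python
# Verifying the "78% more pixels" claim by comparing total pixel counts.
pixels_1080p = 1920 * 1080  # 2,073,600
pixels_1440p = 2560 * 1440  # 3,686,400
print(f"1440p has {(pixels_1440p / pixels_1080p - 1) * 100:.1f}% more pixels")  # ~77.8%
```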
@kush2023 3 months ago
A 4070 isn't even good for 4K gaming, and bro thinks his 2080 performs well, lmao.
@NathanaelHunter 3 months ago
It does perform well. I have now posted proof that it does. If the PS5 and Series X are considered 4K-capable consoles, then what the 2080 can do is also fair to consider 4K-capable. Shut up, you filthy basement-dwelling graphics whore.
@NimVim 3 months ago
@@NathanaelHunter Grow up, no need to trash-talk, you clown. He's just giving his point of view. Don't be a bitch.
@arnoldtheturtle 3 months ago
Take a deep breath in 🧘
@NathanaelHunter 3 months ago
Naw
@zxqu2622 3 months ago
cringe
@arnoldtheturtle 3 months ago
@@NathanaelHunter but but but like- damn 🦫
@NathanaelHunter 3 months ago
Ah yes, I'm cringe. That means a lot coming from somebody with a Roblox PFP who posted a bunch of Fortnite clips. I swear children should be banned from internet access.
@arnoldtheturtle 3 months ago
@@NathanaelHunter Nah bro, you're not cringe, the only Chad member of the anime fan club Discord group.
@sensi7593 3 months ago
I have to break it to you, bozo... 8GB of GDDR6 isn't enough anymore. Boohoo. GTA 6 is around the corner, so better upgrade.
@itsNightShine 3 months ago
Facts
@NathanaelHunter 3 months ago
Oh, I cannot wait to hear how everyone is going to rant and scream this in the coming months. The RTX 2080 is pretty much capable of everything the current-gen consoles can do, so if GTA 6 runs on them, the RTX 2080 is going to be more than capable of handling it too. Also, you really want to call me a bozo and then not even manage to spell the word "around" correctly or punctuate that sentence? REALLY? I'm the bozo here?
@Jutepest 3 months ago
GTA 6 is not releasing on PC in 2025; more like 2026/2027.
@sasquatchcrew 3 months ago
Most cards can "handle" 4K. It's THE GAMES that can't be played as smoothly/nicely. Like, go play XCOM 2 in 4K and tell me most cards, an RX 580/1070 on up, couldn't probably do it. But then go try BODYCAM (an FPS) in 4K, and my 7900 XTX can barely keep up with it.
@Justanotherbeautifulday 3 months ago
Listen to this guy at 34:16 to 35:00+. Man, this video is not good, man, and you have got to do your research. Use Google and RU-vid; they will help you out so you can learn. So many of the things you've been saying are absolutely, crazily wrong.
@NathanaelHunter 3 months ago
You are not about to try and say to my fucking face that VSYNC is bad and screen tearing is good. I know damned well you aren't going to try and make an argument for screen tearing being GOOD. Of all the brain-dead, shit-for-brains, nonexistent-neuron-activity statements I have heard, the ones trying to DEFEND SCREEN TEARING are the ones I am NOT about to give even the smallest amount of respect. That is just outright fucking bullshit. Don't even try it.
@Chazenn9 3 months ago
The meme pronouns you put in your Twitter bio accurately represent your thoughts on this topic (and all the other things you constantly try to hate on on social media). No wonder you have no subs.
@jessepatrick7471 3 months ago
I use a regular 3070 and I run stuff fine with a 14th-gen i7. I'm actually about to test Lords of the Fallen in 2K. I play stuff in 2K and I average 90 FPS even on heavy games on ultra, and up to 120-144 FPS in lighter ones.
@itsNightShine 3 months ago
Do you also average 120-144 FPS on Cyberpunk ultra? Native 2K?
@NathanaelHunter 3 months ago
You are the kind of person I made this video for. Ignore the hardware whores screaming that you're not using the latest stuff, that you must therefore be lying about the FPS you're getting, and that anything lower than 120 FPS or whatever is just unacceptable. It's nonsense. Most of them say you're poor for not buying the newest GPU or whatever, but they're just having buyer's remorse. They're mad you can get decent FPS and graphics settings without paying a billion bucks for it. They get even more mad when I say that my Steam library is valued at about 35 grand; I can easily buy a newer GPU, I just don't, because I don't need to. I never lied about what my GPU can handle, nor did I lie about the settings, but note how mad they get at me for saying it.
@peik_haikyuu2265 3 months ago
@@NathanaelHunter You are lying 🤣 Your 2080 is worth about $200 used; a 6700 XT gets anywhere from 15-30% more performance than a 2080 and can be bought brand new for $299. Your GPU is not a 4K GPU. A 4K GPU can run something like Cyberpunk at 4K native, high preset, RT off, at 60 FPS, which a 2080 absolutely cannot do; the 4K high preset in Cyberpunk gets about 25 FPS average on it.
@NathanaelHunter 3 months ago
You are the person the 'well aktually' joke is making fun of, dude. You keep saying Cyberpunk 2077 at 4K ultra with ray tracing like it's something I said the 2080 can do. I never said that. You can't timestamp me having said it. It's not even something anyone is claiming. I never even brought up the price of an RTX 2080, because that's not even an argument to be had. It literally sounds like you bought a 40-series card for way more than it's really worth, and you feel the need to say anyone on older hardware sucks because it's not the newest tech.