Same, and it always looks fantastic and I love it... Having shadows set to high, very high, or ultra looks great, but medium shadows look good too, and you don't notice the difference much when the rest of your settings are a mixture of high, very high, and ultra textures etc. at 60fps in 4K.
I set shadows and motion blur to off most of the time and only focus on textures and anti-aliasing, because I can see my enemies more easily with these settings. But sometimes I need to see the shadows, and then I set them to the lowest possible. I even play Burnout Paradise without shadows.
And then there's Doom (2016), where you can jack up the settings and still rock 120-140fps in heavy combat. I would advise, though, putting sharpening at about 20%-25% to avoid sharpening artifacts that actually degrade the image when set high (it forms white, noisy outlines on details, ruining the color gradations in many cases).
It's been proven in the past that you don't need a beast of a system to play FPS games. Remember how awesome Half Life 2 looked? The specs for that game were minimal and it still looked incredible. Even running Crysis on its release at the lowest settings still looked impressive.
@@aiden_macleod As someone who played Crysis in 2007, I can assure you the game looked like utter dogshit on Low. Medium was where it was at. Anything beyond that, you needed to either get a loan or consider selling a kidney on the black market to get a computer that could run it lol
A part 2 covering various anti-aliasing filters at various resolutions (for example: how much AA do you need when rendering at 1440p on a native 1080p screen) would be very appreciated.
That's a generally true answer, but every game acts differently. Playing a game like Borderlands 1 with its comic-style art is way better at 1440p. But the performance hit is another question: whether 1080p at high settings with 16xAA, or 1440p with moderate settings and lower AA, is worth it.
Depends on the game and/or engine. At 1440p I can play Wolfenstein 2 without AA, but GTA 5 absolutely needs at least 2x MSAA or FXAA to get rid of the jaggies.
Lee R Definitely agree. Mid range is the sweet spot. With low end you end up having to sacrifice too much; with high end you end up paying too much for something you can barely tell apart unless you're carefully comparing two things side by side. Unless we're talking 4K on high/ultra, but then that's almost like comparing apples to oranges, and the price at that point just isn't worth it for most people.
adm It's true, graphics in video games just aren't improving at a rate where the difference is that obvious anymore. It's a very different situation from 10 or 20 years ago. Hardware makers have just managed to convince people they need all the tiny little details cranked up to the max to enjoy a game, and at a resolution far too high for the size of their screen. I mean, PC gamers today seem to think they need as many pixels in the 21" monitor they view from one foot away as in the 70" television in their living room that they watch from across the room.
He was being sarcastic, satirizing certain people who don't play games, but brag about how much money they have or spent on a PC and its components. Some do play games and brag though too. It is sad, but to each their own.
I am kinda shocked that nobody is pointing out that names like "low", "medium", "high" and "ultra" are just arbitrary names chosen by developers. I can add my own graphics setting above ultra and call it "deez nuts". It changes the depth of field sample count to 10'000 samples per pixel. Afterwards people will complain about how unoptimized my game is because they can't max it out on STATE OF THE ART HARDWARE. This is a constant grievance of mine as a developer. However, in defense of all the users: most gamedevs do a *TERRIBLE*, ABSOLUTELY GARBAGE SHITJOB of the following three points:
- Conveying what the graphics option you're changing actually does.
- Conveying the visual difference between said graphics option's quality levels.
- Conveying the performance hit of said graphics option's quality levels.
If they actually got their shit together for once, maybe we wouldn't need these damn videos and tweak guides. How many more times do I have to read something like "SSAO: Adjusts the SSAO quality." or "Use compute shaders"? But seeing the code said developers write and the comments they put in, I have little hope. They're exactly the same: "GetOcclusionFactor(); // Gets the occlusion factor"
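To illustrate that last point, here is a minimal sketch of what attaching real documentation to a graphics option could look like, instead of "SSAO: Adjusts the SSAO quality." The struct names and the cost numbers are hypothetical, made up for illustration, not any real engine's API.

#include <vector>

// Hypothetical settings metadata: each quality level carries a plain-language
// note on what changes visually plus a rough measured frame-time cost, so the
// options menu can surface both instead of a meaningless one-line tooltip.
struct QualityLevel {
    const char* label;       // "Off", "Low", ... "deez nuts"
    const char* visualNote;  // what the player will actually see change
    float gpuCostMs;         // illustrative cost at 1080p on a reference GPU
};

struct GraphicsOption {
    const char* name;
    const char* description; // what the effect does, in plain language
    std::vector<QualityLevel> levels;
};

// Example entry the options menu could expose as a tooltip:
const GraphicsOption ambientOcclusion {
    "Ambient Occlusion",
    "Darkens creases and contact points where objects meet, so they look grounded in the scene.",
    {
        { "Off",   "Flat contact areas; objects can look slightly 'floaty'.",        0.0f },
        { "SSAO",  "Screen-space approximation; soft darkening near geometry.",       1.2f },
        { "HBAO+", "Higher-quality sampling; more accurate corners, costs more GPU.", 2.1f },
    },
};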
Damn straight dude. I've been doing this stuff a long time, and when I was younger it was fun keeping up with all the new terms and technologies; but generally we just had simple stuff like AA, AF, dynamic lights and bump mapping to worry about. Then there came a point a few years ago when I realised I just had no idea what the hell all those sliders were doing any more, and what's more, I wasn't sure I cared enough to find out. Every new game I installed would take a term I recognised and throw a random new letter into the acronym. As I get older, I want to screw around with the settings less and less. Don't get me wrong, I still love to tinker and find that sweet spot. That's WHY I'm a PC gamer. But I hate spending hours flipping every setting on and off again in that trial-and-error ritual, when instead the devs could just tell me what in the goddamn fuck the "post processing" box actually, specifically, does.
It might seem silly given what you take for granted as a dev, but you actually gave me quite the epiphany with this post: it made me see how much of my ticking those menu options, despite understanding less and less of what they even do as the years go by, is ego. So, thanks for that. :)
That was never the intention of this video. He acknowledges that Ultra settings are great for future proofing a game, but for a mid range card they're a waste of valuable GPU power.
I am using a GTX 1070 AMP Extreme with an i5 6600K on dual 1080p monitors. I normally play on high settings, and I have personally not seen any big difference between high and ultra, while the fps difference between the two is massive, so high is OK for me.
This is still one of my favorite DF videos. So sensible, backed up with ample visual evidence, and it really helps most PC gamers understand how to tweak their games.
I really like the look and feel of classic PC games; there's something particularly enchanting about retro graphics. But after more than 20 years playing games, I've come to the conclusion that what keeps me going is the gameplay. I have to admit that for some people ultra settings become an addiction once you get used to being able to run them on your hardware.
I’ve always advocated for this. My cousin always looks for the best story but can't stand playing some games because they're so boring to play. Gameplay trumps every aspect. The story and graphics could be amazing, but if I'm not having fun, I won't touch the game.
The game at ultra in this video isn't even remotely close to how it looks in reality; this is due to the low bitrate of the uploaded video, enforced by YouTube's heavy compression. For reference, The Witcher at ultra in this video looks like medium settings. If you want an actual representation, search YouTube for "The Witcher Ultra" and look for videos in 4K, which eases the compression and increases the bitrate.
Publishers/developers now focus more on trends... unlike 10-20 years ago, when we actually had more developers experimenting with various types of games (nowadays most of them have been acquired by bigger companies)...
The problem is that many PC gamers have in fact been brainwashed into this "Ultra" or nothing mentality. They think they're somehow "hardcore", but the reality is that they've been duped into buying into something that offers almost no return on investment. The marketing campaign has worked on these poor souls. They erroneously think, for example, that a GTX 1060 or RX 580 is not a "legit" 1440p/60FPS card because it can't run EVERY game on "Ultra" at that resolution. It's the same story for the GTX 1070, 1070 Ti, 1080, 1080 Ti. I know with my previous card, a GTX 1080, I could play the vast majority of games at 4K/60FPS, just not at "Ultra" settings. I now have an EVGA GTX 1080 Ti FTW3 @ 2025MHz paired with an 8700K @ 4.8GHz OC'd - H100i V2 - 16GB DDR4 RAM @ 3600MHz and still can't play EVERY game at 4K/60FPS on "Ultra" settings, but I would still consider the 1080 Ti a legit 4K/60FPS GPU.
The difference between 1440p and 1080p isn't so great that you should sacrifice the 144fps experience just to have the higher resolution. 1080p @ 144fps just gives a far better gaming experience than 1440p @ 60fps, so that part of your comparison is just wrong in my opinion. In general, yeah, I admit there are many people "brainwashed" into thinking they "must have ultra", but I don't really care as it doesn't affect me. :D Me and my friends have decently high-end PCs, but we are not obsessed with ultra settings so much as with not having fps drops (at least this applies to me, if not to all my friends lol). I play every game at 1440p @ 144fps, and if I see it dropping much below 144fps I turn down a few settings to get a stable 144fps. My next upgrade will be when 4K @ 144fps becomes doable with a decent investment, so I think I will keep my current setup (updated last year) for a good 3-5 years. If games start to require much more, then I'll just keep adjusting settings lower so that I can still get the 144fps experience.
What bothers me more than anything is unoptimized games. I typically turn settings up to max, but that's just because I can. If I'm trying to hit 1440p at 144fps I'll turn settings down to get there. But when I have to turn settings down to low or medium to come close to that while only seeing 50% GPU usage and 30% CPU, I get annoyed, because my $1500 setup shouldn't be forced to run at such low settings even for 60fps at 1440p.
"but the reality is that they've been doped into buying into something that offers almost no return on investment. " You're wrong on that, I've bought cards with the idea of achieving ultra settings when in reality I could just achieve it. My investment was returned about 6 years later when more demanding games came out and I could play them at 60 FPS on medium high. Now I have a gtx 1080 and use Nvsurround. Basically good for it for the next 6 years.
Not really, I just hate aliasing and like to get the most texture and model resolution and best lighting out of everything while maintaining at least 60 fps. You're way overcomplicating this.
Good to see that this channel isn't just about making performance vids; it actually explains things and doesn't leave the average viewer guessing, so you can actually learn a thing or two about computers.
Ultra = use it if you can max out everything
High = beauty and FPS
Medium = more FPS but still looks good
Low = not that great, but it gives you FPS
15% resolution scale, ultra low, 15FPS = well, it runs
Eh, even if I CAN technically "Ultra" a setting, if I can knock it down to "high" and get 10+ more FPS with a barely noticeable visual hit, I will take that extra smoothness/responsiveness over subtleties in the image that will be lost in motion ANY day, but that's me.
This sums up everything without needing such a huge video!!!! :P :P :P A few things aren't presented correctly in the video, where there is a difference and they're trying to say it's similar. Well, "Ultra" is Ultra, and your comment sums up everything.
Ultra isn't always enough; in some games you have to go manual, put everything to the maximum, then go into the Nvidia Control Panel and set texture filtering to High Quality and negative LOD bias to Clamp.
A bit disappointed that this didn't even mention AA. I built a Ryzen machine with a 1070 to use with a 4K TV in my living room rather than buy a console. A lot of people told me this would be "unplayable". It's not. At 4K I usually turn AA off completely. It affects performance and at this resolution, a few ft away from the TV, aliasing is not noticeable. As mentioned in this video I do play with graphics options too. Most of the time 40 FPS is perfectly fine too. It doesn't have to be 60 FPS or go home.
Thank you. So many retards think you need 2 gtx 1080tis for 4k, while you can VERY well play any game at 4k with only a 1070 or even 1060 if you use your brain and use proper settings.
Hey, just for the record, once you're playing in 4K you don't actually need anti-aliasing at all, and therefore you should always have it off when you're playing in 4K, because it just hogs up massive amounts of your GPU and PC resources while giving you no returns.
@@anotherfan2870 Not that I don't believe you or anything, my dude, but I find that extremely hard to believe in general... The 1060 series is at most a 1080p card, or maybe 1440p at extremely low settings and frame rates... Not saying you haven't been able to accomplish this, but if you did, you're barely running at 30 frames per second, if even that, with almost all of your settings turned down to medium or low; otherwise there's no way you could be running 4K anything on a 1060... And for the record, you guys, if you're playing in 4K you absolutely do not need anti-aliasing of any kind; this is a fact, check it out online, just Google it... When you're playing at 4K resolution the only thing anti-aliasing is going to do for you is hog more resources and put even greater stress on your GPU and your entire system as a whole, which is completely unnecessary... So turn off anti-aliasing at 4K and use the VRAM you just saved to turn up a couple of other settings, like shadows from medium to high, and so on and so forth, etcetera.
@@bobmarl6722 You're not going to be able to play any game at 4K and 30fps on a 1060 unless you have all your settings turned down to medium or low, including anti-aliasing and everything else; but then again you don't need anti-aliasing at 4K, so that doesn't matter. My son has a 1060 in his laptop and he can't do anything beyond 1080p ultra, because the 1060 series is a 1080p card.
My personal feeling on this is that tweaking game settings for your most beloved games for that best balance between graphical fidelity and frame rates is just as fun as tuning hardware for that optimal efficiency point on the frequency-voltage/power scale.
Ultra and high settings usually don't look that different from medium nowadays, but back when Crysis released, the difference could be night and day. Baseline visual fidelity is so friggin' high nowadays, dialling down a couple of settings doesn't feel like giving up key parts of the experience anymore.
For me it should at least be high to ultra settings. I'm only gonna do medium settings if the game is really demanding and doesn't get 50-60 FPS average.
I thought so too, because his German pronunciation was so perfect at the beginning where he said "Angst", and also in the Wolfenstein part. But then I hesitated again, since his English also sounds so good. Since I'm not a native English speaker I might not hear it as well as German, but it sounds great to me. And certainly, great work in general! Awesome video!
I think developers should focus more on scene composition (and the manual crafting part) rather than pure resolution and detail. Things like lighting, shadows, contrast and colors are much more important during gameplay (which is 99% of the time) than 4K textures, which you can only appreciate if you stand still and stare at them.
That's subjective. For me, it's worth it. For you, it may not be. A Rolls-Royce will get you to the same places as a Hyundai, but some want the extra comfort and trimmings.
At around 10:30 you used the word "poignant" to mean something like "clear", "noteworthy", or "salient", but the meaning of "poignant" is tied specifically to how emotionally affecting or sentimentally charged something is, not simply how salient it is. Just a tip in case you use that word regularly in your videos. BTW I found the content in this video on the whole well-written and very informative so don't take this as any sort of blanket criticism of the script.
These settings are resource intensive, but you don't lose much visual fidelity by dropping down a peg and you gain a lot of performance. >Leaves motion blur enabled, which looks worse and costs performance.
NovaPrima listening to the "und auf Wiedersehen" at the very end of the video, I'm 99% sure that he is either a native German or has lived for many many years in the country at least. It's just way too perfect :D
Personally, even if I can run a game, I like turning the settings down so my fans stay quiet. I COULD run Overwatch with high texture quality but my GPU would be really noisy.
Ultra settings really are beyond pointless. The increase in visual fidelity is marginal, yet the performance difference is insane. In this video we had to stop and analyze a scene carefully just to see the difference. If your hardware can run ultra then great, run it, but if it can only do high then there is no reason to upgrade. The vast majority of people won't even notice, or care to notice, even a mix of medium and high. Once you are immersed in a game and are having fun, the last thing you really care about is whether you can see a higher-resolution shadow five miles away on some obscure tree in the background.
And yet, if you really are immersed and don't just run around pew-pewing people, you start to notice every little detail. I've lost count of how many times I was in the air in Far Cry 5, looking at the horizon and thinking, wow, incredible foliage. Maxed, of course.
That is because most games are designed around medium-high settings due to consoles. Which is a good thing, since it increases the life of our PC components before we absolutely have to upgrade. Usually the best performance-to-fidelity ratio is achieved at the base console settings, because that is what the game, levels, models and set pieces are designed to run on. There are very few games, like Battlefield, The Witcher 3 or Doom, that actually take advantage of your PC hardware. I have a GTX 1060, and most of the time I just lower the settings from ultra to all high to get a stable 60, or just use GeForce Experience's recommended settings for a smooth 60. Ultra settings are usually just overkill features like MSAA, or experimental features like VXAO or HairWorks, that tank performance hard.
On older games I use ultra settings; on newer ones I usually have to blend between high and medium, just due to not having the super gaming PC required for ultra or high all the way through. Mine plays Skyrim Special Edition on ultra and it looks great, and the FPS never dips below 50 frames, which is good. I usually get concerned if the FPS dips drastically while playing, like when one area gives you 60fps in any situation and then all of a sudden you're in combat or whatever and it dips below 30; that usually tells me a setting is too high.
@Transistor Jump No, I agree with him. I've seen people saying "I'm switching to console gaming" or "PC gaming is a waste of money" all because they can't run games at 4K max settings lol.
I expect to be able to play any game I own at ultra and get 60fps, but that's because I have one of the best GPUs on the market and my system should be able to handle it. However, I do have a 1440p 144Hz monitor, so I do turn settings down to be able to hit 144fps. But really, what gets me is unoptimized games. When I have to play at medium settings to get 60fps with 50% GPU usage and 20% CPU, it just bothers me.
Really good descriptions and sound advice in this video, great job. What bugs me the most is when I see people using "high" or "ultra" as reference points, like "can machine X play games at ultra?" or something to that effect. Or admiring games that run slowly on "ultra" because that means the graphics are amazing, right? All those labels are completely arbitrary. They literally have no meaning outside what the developers assigned to them in that one specific case in that one specific game. And those "ultra" settings can really get unreasonable, but that's kinda their point. So don't feel bad if you can't run the game well at those settings. It's really not hard to make a game more CPU and GPU intensive by just setting a couple of values to unnecessarily high numbers. Like, you'd probably only need to change a number or two in a reasonable modern game to make every individual blade of grass always fully render in a 10-mile radius and call it "super-mega-ultra vegetation" or something. Does that mean the game has amazing graphics just for that? No. Should you feel bad that your PC can't run it, nor possibly any PC that will be made during your lifetime? Absolutely not. Unless it's a very old or undemanding game, dial back the things you can hardly even notice; it will be more worthwhile to ease the strain on your hardware, making it last longer and consume a bit less power, especially if it's a laptop.
This was incredibly informative, now I won't have to worry as much about missing out on graphical details on ultra settings when my rig can only support smooth performance on high or lower. Thanks for alleviating that.
What I've discovered through getting sucked into chasing frame rates and having the highest-end motherboard/CPU/GPU is that it's great if you're the type who's into building systems for bragging rights and has the money. Watching all the benchmarking and overclocking videos started taking me down that path. What is hard to find is videos stating what makes a great rig for real-world playing. With the RX 500 series finally coming back down to a more reasonable price, it's possible to get a 580 with 8 gigs of high-bandwidth memory. Take that and pair it with a 144Hz 1080p FreeSync monitor; buying the two together is close to the cost of one mid-high-end Nvidia GPU. From my experience, I wish I had figured this out sooner. How much money and trouble I would have saved myself.
@@nexxusty Past games do support it, some with excellent scaling, while others may require an alternative SLI profile. However, over the last 2 years far fewer games have supported it, and it's been dwindling.
I've found that the difference between graphic presets has become less and less drastic, especially after the current console generation launched. Do you think that there will be more of a difference in the future, say around the end of this console gen?
Differences are becoming minor as time goes on. You can only make GPU cores so small before they break easily. We need a new method of making processors if we're going to see a big leap again.
We're reaching a point of diminishing returns in terms of graphics, in my opinion. I think we'll see more of a push for CPU development in the next generation; games over the last year or so have become increasingly dependent on CPU resources. I hope the Ryzen/PS5 thing is true, because that is a fantastic chipset.
I clearly see the difference in how games use the CPU before 2015 and from then on. I have an i3 6100, and in almost every game (even open worlds) from before 2015 my GPU (RX 470 Nitro+ 8GB) hits 100% and the CPU isn't the bottleneck, but in games after that it's really rare for me to play one where the CPU isn't the bottleneck. (I know it's only a dual core with HT and it's weak, but I think I can only see that difference because I've been playing with a low-end CPU.)
Depends on the game/setting, but in general you see the most noticeable difference going from "low" to "medium", with the variance becoming increasingly subtle after that.
Yeah, agree with this. I realized it with The Witcher 3: on my GTX 660 I set mostly medium and a few high settings. When upgrading to a 1060 6GB I set mostly very high and a few high. And despite the fps difference (30 vs 60), visually I didn't mind at all with The Witcher 3 on my GTX 660. It looked good enough (at least on my 1080p monitor) and the difference didn't feel that big.
A little tip: in Rise of the Tomb Raider, just set everything to very high (beyond very high in some settings) and set anti-aliasing to SMAA... solid 60 all the way on an i5 6400 and a 1060 6GB with 12GB RAM. You can even push some settings beyond what the very high preset uses, like (I think) Pure Hair, which is only "on" under very high. It even works in other games: anti-aliasing hits everything harder, and just setting it to SMAA (not SMAA x4 or more) will increase frames from an average of 45fps to 60, even reaching 70 in some areas. Tested on AC Origins, ROTTR, Kingdom Come: Deliverance, DXMD, and Crysis 3. (Edit: SMAA doesn't look much different from x2 but is a bit noticeable against SMAA x4.)
For me, very high textures led to a VRAM shortage on my GTX 1060, which should not be the case on a 6GB GPU. There was definitely a memory leak: I ran MSI Afterburner and saw VRAM usage slowly climbing even as I entered new areas. This disappeared when I set the textures to high. Maybe they've patched it now; I should go back and check.
@@ColdieHU Did that before, but it was on the 1366 socket with the i7 920, and that was a triple-channel platform. Nowadays it's usually either dual or quad channel, so yeah, 12GB seems stupid.
Sometimes the expensive settings are worth it. Like VXAO in RotTR. It looks so, so much better than HBAO+ or even SSAO. You notice it in every scene, because the lighting and shadowing seem much more accurate and realistic.
But the massive hit to frame rate means it is worth it only for those with expensive hardware to match. Plus you can kiss DX12 goodbye, since it doesn't support it.
I hope all future games will have a more user-friendly graphics settings UI, which clearly shows you the visual differences that each setting will bring and instantly shows you the impact on framerate. I think it will be easier for us to find a balance without testing over and over again by ourselves.
Depends on what settings are available. Geometry and textures and lighting are the most important factors and do make a difference in some cases, I've found textures look better on the highest setting when you go beyond 1080p.
Very insightful video, Alex. Thank you for this. When it comes to PC Gaming, I usually want the best performance and visual fidelity, and find it a real hard battle to balance those two factors. Sometimes going full Ultra really doesn't do anything, we're talking about a really small, marginal gain in visuals, almost to the point where there's a diminishing return. Lately I've been on the quest of trying to reach 144 fps in all games at max settings on a 1080p 144hz monitor. So far it's been impossible. I went from a 970, to a 1070 Ti, to a 1080 currently, and I'm thinking of going for a 1080Ti. I simply cannot max out the fps without sacrificing a lot of settings. It's funny when people say things like a 1080 is overkill for 1080p. Maybe for 60 fps it kind of is, but anything over that and you'll be struggling.
Remembering what your screen resolution actually is, is the biggest thing people forget to do. They run ultra textures, which are sometimes like 4K textures, on a 1080p monitor, when textures half the size, or even a quarter of the size, would look exactly the same in most circumstances. It eats up more GPU power for literally no visual improvement.
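As a rough back-of-the-envelope illustration of why that matters, halving a texture's resolution cuts its memory footprint to roughly a quarter. The sketch below uses uncompressed RGBA8 figures only (an assumption for simplicity; real games use block compression, which shrinks these several times over):

#include <cstdio>

// Approximate VRAM cost of one uncompressed RGBA8 texture, including the
// roughly one-third extra a full mip chain adds. Illustrative upper bound only;
// block-compressed textures in real games are 4-8x smaller.
static double textureMiB(int width, int height) {
    const double baseBytes = static_cast<double>(width) * height * 4.0; // 4 bytes per texel
    return baseBytes * (4.0 / 3.0) / (1024.0 * 1024.0);                 // + mips, in MiB
}

int main() {
    std::printf("4096 x 4096: %6.1f MiB\n", textureMiB(4096, 4096)); // ~85.3 MiB
    std::printf("2048 x 2048: %6.1f MiB\n", textureMiB(2048, 2048)); // ~21.3 MiB
    std::printf("1024 x 1024: %6.1f MiB\n", textureMiB(1024, 1024)); // ~5.3 MiB
    return 0;
}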
I cannot thank you enough, Alex. I was psychologically tormented by the new releases coming up and feared whether my hardware could meet their requirements. I have a Strix 980 and an i7 6700K with a 1080p 60Hz display, and at the moment, due to the rise in GPU prices and struggling a bit financially, I am unable to afford a new GPU for 1080p gaming. I was so shell-shocked at new titles being announced every day and the thought that my GPU might not be able to keep up with them. But fortunately, after watching your video I tweaked some graphics settings and now get better performance at no cost, plus better energy efficiency. I'll be truthful: I used to be so obsessed with MAXING out every game. Later I learnt my lesson that maxing out is not only not required every time, but also how taxing it can be on my hardware.
Gopal Chatterjee the 980 is still a damn fine card. It's pretty much a 1060 and will still play at 1080p 60fps no prob for years to come. Sure maybe not at ultra, but like you realized, lowering a few settings here and there to high, it'll be capable of 60 for a good while, at that res. Even lowering to medium after a year or so after that will extend its life even further. Med settings aren't too bad these days. Definitely still very playable and the differences aren't that drastic, night and day, like in the old days. It's a lot more subtle. Sure still noticeable but nothing huge. You'll be fine. I got a 1060 so I'm in the same boat. If I want higher fps or wanna up the res in the future, then yea, might wanna shell out the cash. But I'm satisfied for now. Oh and don't forget, there's also the used market if you wanna upgrade in the future for cheaper. Although I would be wary of it esp for this gen stuff, as a lot of them would have been used to mine. But next gen amd navi might be a lil comforting. It's rumored to be 1080 level for a midrange price of $250 next year, so maybe that'll be something to look forward to, given your budget. Same price as 1060 msrp for 65-75% more performance. Sounds pretty good to me. Idk about nvidia but I'm sure they'll also have something similar, for their midrange. As of now tho their next gen stuff disappointingly looks to be quite exp and not much of an upgrade, same price for the same performance e.g. 1170 = 1080 = $500, 1180 = 1080ti = $600 etc. We'll have to wait and see.
soopasoljah Appreciate your response, man. Makes me feel a lot better. Thanks a million. :) And you're right, this hunger for ultra settings is absolutely ridiculous unless you're on a 1440p or 4K monitor where it matters the most. I get a pretty good visual experience with high to medium settings and it's a lot less taxing on my GPU, so it's a win-win. You won't believe how obsessed I used to be. I didn't even think twice, even the MSAA settings... 8x... I know... I know... Then I'd start to hate my PC, then I'd delete the game, then another game releases... same vicious circle lol.
Also just remember, Maxwell (aka the GTX 900 series) was a silly good overclocker. Reference 980s from Nvidia run at about 1100MHz, but I have seen them get up to the 1500-1550 range with good cooling and some clever tweaking. Useful to know if you need to prolong the life of your card for just a bit longer!
@DigitalFoundry Ultra settings are great and all, but I think it's more than enough if you can get 1080p with locked 60 fps *at least* on PC gaming. You're not really going to actually notice much of these graphical settings when actually playing the game. What you will notice is frame rate and uneven frame pacing. So in short, I think that overall it's more worth keeping settings at Medium or High.
I agree with everything except I think no matter if the resolution is 1080p or 1440p, you should aim for locked 100 fps or so (and obviously invest that little bit for having a 144Hz monitor if you don't have one already). My mind was blown when I moved from 60Hz to 144Hz and I find higher framerate so much more satisfying than ultra settings. I mean I do like graphics quite a lot actually! But still..
@@65EKS65 I feel like it HEAVILY depends on the type of game. If I'm playing a more cinematic game then ultra settings are a lot bigger deal than high fps, however, if I'm playing a shooter then high fps trumps amazing graphics.
@@GaiaGoddessOfTheEarth Yeah I get your idea but I just don't know many "cinematic" games you mentioned. At least for me pretty much every game I play or have played have just been feeling better when it has high fps. Tho I don't have 4k monitor so I haven't invested that much into the image quality, for me 1440p is already good enough and I can play pretty much any game maxed too nowadays. I just love the smoothness of +100fps compared to ~60fps but I guess it can depend on the person too.
Not trying to be cocky, this channel is great and you actually learn a lot from his videos, but the fact that this channel is one of the few on YT to have metal music as background (besides metal music channels, obviously) was the main reason for me to subscribe. Yeah, I love metal. Kudos, my fellow metalheads!
I am mainly a PC gamer.. for many many years! And I learned something new here and there. So.. thank you for these kind of videos. Keep it up with your great work guys!
Most important to me is the ambient occlusion in most games where there is a lot of elements around you. Sometimes it can make some scenes more beautiful than they would be with realistic linear light. The way it fades the shadows is very pleasant. I always have it maxed out
I suspect Deus Ex is using the same textures for most objects, but increases texture size for details like dirt, newspapers and graffiti, very similar to how The Witcher 3 deals with textures, where medium uses high-quality textures for faces but low-quality textures for clutter.
For the Deus Ex example, I would like to point out that instead of checking the floor textures you should check the billboard textures (which look pathetic on very high). One may argue that they don't matter, but in a stealth/exploration playstyle you are forced to see them up close and they look gross. Otherwise a very good analysis, and you have really proven to be a good addition to the DF team (y)
Meanwhile I played the newest Deus Ex at 1440p, full ultra with 2x MSAA, and got 80-ish fps average :D GTX 1080 Ti FE + 6700K @ 4.6GHz (I have a vid on my channel)
Yea I'm not sure why he was focusing so much on the ground. Felt like he should be looking at walls or just objects in general, above the ground. I always thought that anisotropic filtering was the setting that had more control over ground textures and at different angles, not texture quality. That and tessellation. Whereas texture quality affected more the walls, doors, buildings, tree trunks, rocks/boulders, general objects, clothing, etc, you know, things that stand up vertically, up + down along the y-axis. I'm probably wrong tho. But that's what I noticed. Maybe it depends on the game too
This has been one of your best videos yet, since it caters to the majority of people with mid-range cards. You guys should really consider starting a series where you pick a mid-range card, e.g. a 1050 Ti/960 or somewhere along those lines, and try to optimize AAA games on it for 1080p gameplay with the best graphics-to-performance trade-off.
I always like to try Ultra settings on every new game I get just to see how my PC can handle it, but settings usually get turned down if I can't hit a consistent 60 FPS at my monitor's native resolution.
Nice, but one important feature was left out of this one : Resolution scaling and the visual/framerate impact. What resolution scale to choose & is dynamic resolution scaling good or too extreme ? Also what upscaling methods are best and which ones to stay away from ? Update: Also, is it worthwhile to invest in HDR displays and what impact does HDR have on gaming frame rates?
Ultra settings are typically made for future hardware. It’s one aspect I really like about PC gaming. Returning to an older game with new hardware is really fun.
I grew up playing in the lowest settings, low resolution and got 15-20 FPS. In some cases below 10 FPS. I got a GTX 1070 when it came out. You can bet your ass I want Ultra Settings.
This should honestly be required viewing for everyone that plays games on a PC. I've lost count of the times I've explained this concept of settings optimization to people. You take it to the next level with this video though. It was really great to learn about what is actually happening "under the hood" with certain settings, and why they can sometimes be such strains on performance. This is top notch content right here! :D
I don't care for ultra settings, but for my first rig I would want ultra settings to be playable at high fps so I don't have to worry about my PC not lasting a good while; basically I would rather spend more money on something that can last me a long time than buy budget parts that might drop in value.
Great video Alex. I run an i7 4790K and a GTX 970 and must admit I rarely go into the graphics options, as I don't really understand what most of them do. I kinda find the whole massive range of options a bit overwhelming. Thanks to your video I feel able to explore these settings a bit more :).
I currently have a pretty good rig (an Asus ROG VR72gs laptop), but I don't even play recent games on "ultra", for two reasons: fluidity and gameplay > beauty, and... I don't like making the hardware heat up for so few bonus details! The goal is not to push the rig to its limits, but to be able to push it while using it efficiently and not actually having to. Just like you want a LOT of audio power to get a good sound while under-using it. It's the key to stability; that's just my opinion, however.
I'm upgrading my pc for the first time in 10 years, after playing lots of console or new pc games at really low settings. This video really helped me to stop stressing about needing the newest and best graphics card to really 'experience' the games. Thank you, friend.
Having a PC that can run ultra settings is one thing, and enjoying the game is a whole other thing. Sometimes we are so distracted by these settings that we forget why we are playing in the first place.
I play for sweet visuals. I barely used fast travel in Odyssey or The Witcher 3; I always like to travel on horseback and enjoy the view, which I can't do with low/medium graphics.
Sometimes having settings on low can be a great advantage rather than setting them on high. Take PUBG for example. PS: players at a distance can be spotted more easily. For me, I guess.
Yep :3 there are multiplayer games where low/minimal settings mean fewer objects on screen. So our opponents can't hide behind thick tall grass, for example, because the grass becomes thin or even disappears on the lowest settings XD
Man, this is one of those videos everyone should look at before playing any game. Thanks a lot. Very informative. I am sharing everywhere, and I am subscribing.
Actually that's because you are used to seeing terrible motion blur. Good-quality motion blur actually looks good. There are some good videos that explain this very well. When it's done right, you don't even realize that's what it is; the game just looks good. Too many games, until very recently, just used techniques that made everything super blurry and muddy, and holy hell did it look like complete crap that should never be turned on. But with better processing and techniques, it can now look really good and is definitely something you want turned on. Again, it's good when done right, which is rare but getting more common, and it's crap otherwise, which is what most everyone has experience with.
First option that flies out the window if a game has it. I want a crisp and clear image when I play. Same with shadows; I set those to mid-low. Ambient occlusion, the most useless thing that eats a shit ton of processing power, gets turned off too. The change in quality is almost zero, but you get a good amount of extra frames. @Borisblade, if motion blur is so good that you don't notice it, then why should I have it turned on? That is the whole point of this video: if something doesn't give you a noticeable positive effect, then why waste processing power on it?
You'll never see "natural" motion/movement in a video game, regardless of the cost or performance of your rig and its individual parts. It's simply impossible to create something that has a natural motion without any motion blur added in a digital, virtual, pixel-based setting. There's a video I saw recently (though cannot remember the title) where this was explained pretty well. Motion blur is added to enhance the viewing experience and make it look and _feel_ more natural. But as BorisBlade7 said, it's got to be done well.
Every game: select ultra, lower shadows to medium, turn ambient occlusion off, reflections on medium. Framerate limited to 3 frames below your monitor's refresh rate (so frame delivery stays inside the G-Sync range). V-Sync off in game, V-Sync on in the Nvidia Control Panel, G-Sync on. YOU'RE WELCOME.
*And here I am, perfectly happy with a GTX 770* haha. The Witcher 3 (and other games) still look and run amazing to me, and I'm cool with silky smooth 30FPS. I used to chase top-end hardware and max settings, but it's just a money sink and nothing is ever quite good enough. It never ends. It's kinda weird and a little sinister... but yeh.
Yeh, I know it's old in the tech world, and I didn't say it was perfect or could do anything; I was just stating a little experience I tend to have when watching DF videos.
I also have a 770 and share your thoughts, especially with today's hardware prices. I bought an Xbox One X and haven't regretted it, much lower price, longer relevancy/lifespan and close enough graphics when compared to a good gaming PC.
Satyasya Satyasya That doesn't sound right. I have a GTX 960 (about as powerful as a 770) and The Witcher 3 easily runs at 60fps with a mixture of high and ultra settings on my rig.
Have a 2GB 770, but instead of 1080p at 30fps, I lowered the resolution to 900p in order to get a constant 60fps with medium/high settings, no HairWorks. Now that 1070 Ti prices have come back down closer to MSRP, I hope I'll get one soon.
The "post processing" bundle option annoys me tremendously. I'm with the late TotlaBiscuit on this one - give me separate options for separate effects. Don't bundle them together into misnamed sliders. I HATE depth of field, motion blur, chromatic aberration, film grain and all the other ugly effects passing as "ultra settings" but I still want to keep anti-aliasing, ambient occlusion and HDR. A "post processing" slider which has depth of field on its first setting and AO on the third or fourth is ANNOYING. The simple fact of the matter, though, is no - you don't need Ultra settings. As you've showcased, they're not always worth the performance decrease and some if not a lot of them end up making the game look worse. That's why options menus with individual settings matter.
You can dumb options down for the average consumer and still offer an "advanced" menu for the more technically savvy. A lot of games have done this, too: offer basic options like Low/Medium/High/Custom graphics, then offer either a separate menu or an extensive dropdown when Custom is selected. There's nothing wrong with having presets, but not at the expense of actual customisation.
I do tend to just push everything up to ultra/highest, as long as I can maintain 4k/60fps I'm usually happy. If not, the first thing I turn down is resolution and not settings.
Cmdr Flint Yup. 4K isn't worth it. 1440p isn't worth it either, especially on my laptop (which has a 1920x1080 screen). I max the settings out, and thanks to the i7 7700HQ, 16GB of DDR4 at 2.4GHz, a GTX 1070 and an SSD, I'm still comfortably running nearly all current-gen games above, or at worst very close to, a locked 60 FPS at 1080p. Why would you turn the visual quality down but increase the resolution? Seems just stupid to me. Still, rendering even at 512x512 using Cycles (Blender) can be painfully slow... 😭
Cmdr Flint Yeah man, 4K is not worth it if you can't have ultra settings. Who wants to see blurry textures in full-res 4K anyway? At least in 4K you don't need anti-aliasing.
I like that answer. I too value better settings over resolution. (4K at low settings is still awful, I don't care what anyone says.) Just play in windowed mode one resolution down for the best results imo.
Speaking as someone who went all out with a PC rig about a year ago on a 65" Samsung MU8000, there were times when I cranked up the resolution and the settings to try and get the best picture possible, but it didn't matter because I had already hit the point of diminishing returns. I found that at my viewing distance, 4K offered nothing over 1080 and that I couldn't tell between Ultra and High for some options. I've decided that performance matters more than graphical perfection, and as long as the game looks decent I'll have a blast.
What I do is turn everything on ultra and check the framerate. If the framerate is lower than 60 fps, it's time to lower something here and there (mostly anti-aliasing, HairWorks, shadows etc.). Once my game reaches something around 60 fps, I'm done.
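That trial-and-error loop can be thought of as a tiny greedy algorithm. Below is a toy sketch of the idea only, with hypothetical setting names and made-up per-setting costs; in reality the "benchmark" is just you replaying a demanding scene and reading the fps counter.

#include <algorithm>
#include <cstdio>
#include <map>
#include <string>

int main() {
    // Hypothetical frame-time cost in ms per quality notch (0 = off/low ... 3 = ultra).
    std::map<std::string, double> costPerLevelMs = { {"MSAA", 2.5}, {"HairWorks", 1.8}, {"Shadows", 1.0} };
    std::map<std::string, int> level = { {"MSAA", 3}, {"HairWorks", 3}, {"Shadows", 3} };

    const double baseFrameMs = 11.0;          // everything else left at ultra
    const double targetMs    = 1000.0 / 60.0; // 16.7 ms budget for 60 fps

    // Estimated frame time for the current mix of settings.
    auto frameMs = [&] {
        double ms = baseFrameMs;
        for (auto& [name, lvl] : level) ms += lvl * costPerLevelMs[name];
        return ms;
    };

    // Knock the currently most expensive setting down one notch at a time
    // until the frame fits the budget (or nothing is left to lower).
    while (frameMs() > targetMs) {
        auto worst = std::max_element(level.begin(), level.end(), [&](auto& a, auto& b) {
            return a.second * costPerLevelMs[a.first] < b.second * costPerLevelMs[b.first];
        });
        if (worst->second == 0) break;
        --worst->second;
    }

    std::printf("estimated frame time: %.1f ms\n", frameMs());
    for (auto& [name, lvl] : level) std::printf("%s -> level %d\n", name.c_str(), lvl);
    return 0;
}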
Same here. Usually lower post processing a tad and maybe drop down to SSAO from HBAO. I always keep the effects on ultra because I need meh explosions.
@@ColdieHU Literally all they said was how they handle graphics settings in games. They may have a $3000 setup with an 8700K at 5GHz and SLI 1080 Tis, and if they do, they should be able to play their games at ultra 4K at 60fps. You made such an assumption about them with no info; I think you're more of an issue than they are.
Ambient Occlusion is pretty much my worst enemy here. Most games run under 20fps at 720p when enabled. If there's no option to disable it I can still decrease the resolution to 480p. Some games let you do that in the ini files if it isn't possible in the game options. My target frame rate is always 20 - 30fps.
That usually works for most games. But some games (Yooka-Laylee in my case) won't let me do that. There is a mod that disables bloom, DOF and AA via F-keys, but so far no option to turn off AO. At least it runs just fine at high settings and 480p. Kinda feels like a Rare game for the original Xbox.
Depends on the game. If it’s a first person shooter, MMO, or a game like rocket league, I crank the FPS up to as high as possible. If it’s any other game, I crank the graphics to whatever settings give me consistent 60fps.