1 FPS per mile haha, bro I really do appreciate you showcasing the 4090 at 1080p resolution. I guess it's safe to jump to 1440p from here on for people who like that steady 100+ FPS at max settings. Plus the GPU didn't even show everything it's got under the hood.
It wasn't the best, but I think considering the main lighting method of the game is raytracing, that puts it into a whole other category where you can only compare it to OTHER games with raytracing, ya know?
You are right, I don't NEED more than 60, but I WANT more than 60 ;) I don't use high refresh rates for any "competitive advantages" like lower response time, I use it because it is much more immersive for me when it's smoother
Why use 1080p if you're not willing to go below Quality upscaling? You could very well get a 1440p monitor and achieve better image quality with Balanced upscaling at a similar performance cost. DLSS Quality at 1080p looks horrible and you're not really gaining anything there.
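To put rough numbers on that comparison: using the commonly cited per-axis DLSS scale factors (Quality ≈ 2/3, Balanced ≈ 0.58 — treat these as assumptions, since exact ratios can vary by DLSS version), a quick sketch of the internal render resolutions each mode actually draws:

```python
# Sketch: approximate internal (pre-upscale) render resolution for DLSS modes.
# Scale factors are the commonly cited per-axis values; they are assumptions
# here, not guaranteed for every DLSS version.
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def render_res(out_w, out_h, mode):
    """Return the approximate internal resolution the GPU renders before upscaling."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(render_res(1920, 1080, "Quality"))   # (1280, 720)
print(render_res(2560, 1440, "Balanced"))  # (1485, 835)
```

So 1440p Balanced renders from a noticeably higher internal resolution than 1080p Quality, which is where the image-quality win comes from, at the price of somewhat more pixels per frame.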
Such a beautiful game that's so underrated. Imo it's such an improvement over the first game. It's a shame we'll most likely never see a third installment in the franchise.
So far I've been able to crank almost every game I play to max settings at 1080p with just an RTX 3070 and a Ryzen 7600X, usually 120-144 fps. Games such as Helldivers 2, Horizon Zero Dawn, Horizon Forbidden West, Jedi Survivor, etc. have all run at max settings at 120+ fps.
The fixes actually helped a lot. It's so funny having a 4080 that runs Cyberpunk with RT just fine while this game doesn't work at all. I don't know what they were smoking, but you'd think a fix would happen.
Do you have to get above 100 or 150 in the specific season of the loot you want, or is it that once you hit that threshold in one season you'll just start getting Mastery 4 and 5 drops no matter what episode you play?
Yeah, you have to reach that Mastery tier in each episode to get the new or old loot drops at the different Mastery levels. It's not a one-time account-wide unlock, unfortunately.
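In other words, the check is per episode, not account-wide. A tiny sketch of that gating logic (the function name and the 100/150 thresholds are illustrative, taken from the numbers mentioned in the thread, not from the game's actual code):

```python
# Hypothetical sketch: Mastery drops are gated per episode/season,
# not by a single account-wide threshold.
MASTERY_4_THRESHOLD = 100  # illustrative values from the thread
MASTERY_5_THRESHOLD = 150

def drop_tier(mastery_by_episode, episode):
    """Highest Mastery loot tier available in the given episode."""
    level = mastery_by_episode.get(episode, 0)
    if level >= MASTERY_5_THRESHOLD:
        return 5
    if level >= MASTERY_4_THRESHOLD:
        return 4
    return 0  # below both thresholds: no Mastery 4/5 drops here

progress = {"Episode 1": 150, "Episode 2": 40}
print(drop_tier(progress, "Episode 1"))  # 5
print(drop_tier(progress, "Episode 2"))  # 0 -> grind that episode separately
```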
For a lot of games, I prefer to run at 1080p! Especially if I'm streaming from my gaming PC to my HTPC. Either way, 1080p in most games looks great to me!!
Thank god I only play older games (3-5 years old). I usually achieve the same with a 2080 Ti at 1080p, usually max settings (plus 1440p supersampling on): lows are usually over 90 fps, the average is about 120, and it maxes out at about 180 fps. The card cost me 250 euros last year :D
When you are playing games like Cyberpunk 2077, Star Wars Outlaws, and other big AAA games at max settings with raytracing enabled, yeah you need frame generation to get close to 100 fps
1080p is totally fine by me. I go after fps and performance just like you, so I too don't care about 4K or some other shit. Most games aren't even optimized well enough to run at 4K with max settings :D
Now if only it were paced like Doom, I would have liked it much more. Old Doom, not new. I enjoy the labyrinth key-and-switch hunt, not being locked in to kill waves of enemies every other room.
So, this was with a 5950X, right? I have a 5900X and am not hating at all. Love my CPU; got it and the RAM tweaked to the max. But a 5800X3D would put you even higher, around 7700X performance. And then there are the 7800X3D CPUs, which can push even more fps than the Zen 3 non-X3D CPUs. But I'm sure you know that.

I used to think I was going to sell my 5900X and get a 5800X3D, but then I saw that it really doesn't do any better when you game like I game, at high resolution and max settings. Plus, it's hard to give up 4 cores/8 threads. The 5800X3D is only better at gaming. It doesn't beat a 5800X at any work-related task, and it certainly can't compete with a 5900X/5950X at things like compressing/decompressing files, encoding, and editing. You can encode videos with your GPU 10+ times as fast, BUT you don't get the same quality and the file is bigger too, so it just depends how you wanna do it. Same with streaming/recording: the extra cores/threads let you use the CPU to stream and record at higher quality than your GPU, plus you don't lose any performance (not that it's much) from recording with your GPU. I like to use OBS and record with my 5900X because... why not? Gotta find something to put these cores/threads to use on.

The most CPU-intensive game I have is Battlefield 2042. In a 128-player multiplayer server, it'll push about 45% CPU usage. When I chose the 5900X, I knew it'd be overkill, but I had NO idea how overkill it was going to be. I should have gone with the 5800X (the X3D wasn't out at the time) and a 6800 or, preferably, a 6800 XT. But there was no way I was going with the 5600X; I just don't think 6c/12t is enough. I mean, BF2042 would practically push it to 90+% alongside just a 6700 XT 12GB @ 1440p Ultra. Plus I'd be waiting twice as long for shader compilation in the games that have it. AND there's the fact that the 5900X has 64MB of L3 cache while the 5800X and 5600X have 32MB. Some games perform the same, but some really benefit from that extra cache the 5900X has.

But who cares, my GPU is the issue now. I'm wanting to just get the 7900 GRE 16GB. I think that'd be a good-performing GPU, especially since I only have a 2560x1440 144Hz monitor and I like to max out the graphics + ray tracing.
Guys, I still have huge fps drops. With MSI Afterburner I notice that when these drops occur my GPU usage goes from 80% down to 0%. Also, all this time while my GPU usage is 80-90%, CPU usage is only 3-5%. Is that normal? I've also noticed the engine file is sometimes missing and the text doesn't get saved in it.
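One way to pin down those drops is to log utilization over time and flag the exact moments usage collapses. A minimal offline sketch (the sample list below is made up for illustration; in practice you'd feed in data exported from Afterburner's logging feature or polled via `nvidia-smi`):

```python
# Minimal sketch: flag sample indices where GPU usage collapses,
# which usually lines up with a stutter. Input is a list of
# (seconds, gpu_util_percent) pairs, e.g. from a monitoring log.
def find_drops(samples, floor=10, prior=60):
    """Indices where util falls to <= `floor` right after being >= `prior`."""
    drops = []
    for i in range(1, len(samples)):
        prev_util = samples[i - 1][1]
        util = samples[i][1]
        if prev_util >= prior and util <= floor:
            drops.append(i)
    return drops

log = [(0.0, 85), (0.5, 88), (1.0, 2), (1.5, 90), (2.0, 87)]
print(find_drops(log))  # [2]
```

Also worth noting: 80-90% GPU usage with very low *total* CPU usage can still hide a bottleneck, since one saturated CPU thread barely moves the overall percentage on a modern many-core chip, so per-core usage is worth checking too.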
I don't have a great PC, but I've always been able to run new games on medium/high quality without issue: 12th Gen Intel Core i5-12400F @ 2.50 GHz + GeForce RTX 3060.
We should all do this more; 1080p is normal. I mean, we all watch 1080p videos most of the time on the internet anyway. Or at least 1440p, but not 4K; it's great, but not necessary.