Thank you for this. I'm blown away by how many people still play at 1920x1080. 3440x1440 is possibly the most meaningful "upgrade I can feel" I've ever done in like 25 years of PC gaming.
Did you skip 1440p? There are numerous vids showing that 1440p and 4K can look almost identical in some modern games. If there aren't any 4K textures, it's the same image. LTT did the same with 4K vs 8K and there's no difference, since there are no 8K textures out in gaming right now. Upscaling 1080p textures to an 8K TV is not 8K. 4K gaming is relatively expensive on hardware, and it will probably be next gen before high-end 4K gaming really kicks off.
@@kenbladex It was so frustrating to see the people in that 4K vs 8K video look at stuff that obviously wouldn't look better in 8K (textures). Aliasing & moire patterns are where the real differences are.
@@nathancernovich5509 oops 😆 but yeah, 120Hz was needed for 3D gaming glasses, but 144 and 165 really help smooth and sharpen the image. I think for me OLED was the biggest thing that impressed me. I had a non-HDR 50-inch 4K LCD and I thought that was cool, but when I got a 65-inch HDR OLED... it's a new world. Every detail, the blacks, the color, it all pops. And most OLEDs are native 60Hz or 120Hz with some fancy 240Hz post-processing engine, so that says a lot. I'm just biding my time waiting on a good 1440p or 4K OLED monitor with HDR 1000. Samsung Odyssey at CES 2023?
I have an Alienware 3440x1440 120Hz and I originally ran it on a 2060, which was manageable, but I upgraded to a 3080 10GB and it's perfect for me. It lets me hit 120 fps or close to it in almost every game I play (mostly sim racing). I upgraded my R5 3600 to a 5700X recently and it made a huge improvement to my frame rates and overall experience.
THANK YOU. I am so annoyed at being ignored by all the so-called review sites that completely skip ultrawide monitors and always go 1080p. Ridiculous; I left that era 12 years ago.
@@johnboylan3832 I have two 3440x1440 monitors; the first one I bought in 2015. And also two 2048x1152 monitors I bought in 2008. Yet my gaming information is completely wasted in favour of those refusing to buy a decent monitor, or those who think they belong to the 0.01% of top gamers fast enough to see any difference over 240 fps. Why cater to people that don't buy a decent monitor? It's time someone actually did something to get people onto decent monitor resolutions that belong to the 2020s instead of the mid-2000s.
Getting an ultrawide 1440p is the best upgrade I've made. And it's not just for action games; if you play strategy games it's also really good. All that extra space for the main map is great in Paradox games, where the interface can feel a bit encroaching. The only games that don't support ultrawide are older console ports; it's pretty much supported as standard in all modern titles.
Same. I'm UW all the way. The only game I've ever not been able to play ultrawide is StarCraft 2, and all that happens is you end up at standard 1440p… which is fine!
4K is a joke. I have a 32:9 monitor and gaming on that thing is just incredible; it still gives me the wow effect after years. Stop listening to the 4K pixel competition, super ultrawide is the way to go. Just try it, you won't go back. It's also great for work; it's the end of multi-monitor setups, finally. StarCraft 2 doesn't support it to avoid competitive advantage, so I suffer the same problem 😓
Yup. I was an early adopter of UW, and 6 years ago not all games supported it, but now almost every game can go UW. Since I got my monitor a long time ago, I am limited to 60Hz, so a 4090 doesn't make sense for me. Having been on SLI in the early days, I never thought we would be at a CPU limit at this resolution, but we are now.
@@unpacker9521 Keep in mind that he's benching 5-year-old games, lmao. Why isn't he benchmarking new games like MW2? An RTX 2080 gets 25-30 fps in that game at 4K; a 3080 gets 65 fps in this NEW game. Just imagine playing Atomic Heart, Stalker 2, or Starfield next year. These reviews are unrealistic and don't show the full picture just yet. The 3080 only seems to be getting 80 fps at 1440p in MW2. This is all with DLSS off, FYI.
@@n9ne Can you please watch the video first? He tested the most demanding games to date. MW2 is a last-gen game, plus it's unoptimized at the highest settings, so it's not worth his time.
This video is what I've been looking for. I bought a 3440x1440 monitor with a 100Hz refresh rate. It is sometimes incredibly hard to find benchmarks for my resolution to decide on a GPU, but this definitely helps put things into perspective.
Yeah, I recently bought a UW monitor as well, but at 165Hz. I only have an RTX 2070 (waiting on the new AMD GPUs), so I have to lower the resolution in some games to get a stable 60 fps, but it's doable. I just hope more devs actually support it so we don't have to rely on third-party tools to fix their games.
Thanks for the video! I play at 3440x1440 and have a 1080 Ti currently. I've been eye-balling the 3080 12GB and the 3080 Ti and this video makes me feel certain either would be a good choice. But I'm also awaiting AMD's RDNA 3 announcement about 4 days from now!
I've got the 4090 and I'm using it with a 175Hz 3440x1440 monitor, and I don't regret buying it. I had a 3090 before, and in most games that I play the difference is very noticeable. I just wish there were more reviews covering the ultrawide resolutions.
Might be a while before that happens, if it even happens. People love copying each other trying to get views doing the same shit, because someone looking at one type of video will probably watch it again with a different face in the thumbnail these days.
@@jamesadkinsiii7925 Steady 167 FPS with all settings maxed except for motion blur, which I turned off. DLSS is set to Quality and FPS is capped at 175. Vsync is on. This was through the campaign; the in-game benchmark is no different. But I have only played one match in multiplayer so far, and I didn't notice any difference there. I'm playing on a Ryzen 9 5900X and a Gigabyte Gaming OC 4090 at stock settings. The BIOS is set to OC, which only affects the fan curve AFAIK.
I can't express strongly enough how insightful this video is. I have been using 3440x1440 for years now, but when it came to determining the performance demands, I would usually guesstimate between 1440p and 4K because most GPU performance reviews do not include that common ultrawide resolution. Adding the 4090 value/performance assessment was VERY helpful as well. Based on this, my next GPU will probably be a 6070. :) A world where you don't have to buy a power-hungry brick to max out demanding games? I can only hope.
Ty for this video. The last PC I built was 6 years ago. I don't upgrade often, so I always go with a top-end build and let it last a long time. The new build is a 13900K + 4090. Should last me another 5-7 years easy. 🎃
I do the exact same thing. I had a 6900K and a 1080 Ti, but the 6900K died literally a week before the 12900KS launched. I was planning on doing a full upgrade to the 13900K, but I couldn't wait that long without my PC, so I got the 12900KS. Still using my 1080 Ti as I am waiting for the EKWB water blocks for the 4090. Have you had any issues with the E-cores being prioritized over the P-cores when video encoding or transcoding?
@@allankelly-watt4633 Apologies, I will be no help here; I do not currently do video transcoding/encoding. The PC is primarily used for ultrawide/4K gaming at the highest settings, and so far no issues on that front. Congrats on your build.
Thank you for this! I've found a few videos, but they only did side-by-side comparisons without commentary. I love my 1440p ultrawide and really want to see which GPU to upgrade to.
3840x1600 is the best ultrawide resolution, IMO. It can be had in an IPS panel from LG or Alienware, or what I finally went with: a 43" QN90B mini-LED QLED, which can do 3840x1600 21:9 at 144Hz in VRR mode (it can also do super ultrawide 3840x1080). You lose the curve and some response time, but gain much deeper blacks and the ability to run a console at 4K 120Hz.
I upgraded from an LG 38" Ultrawide at 3840x1600 to a 43" Samsung QN90B and play most of my games at 3840x1600 and it's awesome. Much better blacks and color, clean, cinematic look at the same screen size. Miss the curve somewhat, but trade off is the full 3840x2160 resolution for games that don't support Ultrawide, or when I just want to play games at full screen.
I recently got a 4090 Aorus Master and play on a 3840x1600 LG UltraGear, and I can say that's a good resolution for it to stretch its legs on an ultrawide before going up to 4K. Great video, btw.
@@krisztiannagyszeder1095 Yeah, most def, I think the 4090 would be a great pairing with the Alienware 38". I don't think my case is big enough for any 4090 model other than the Founders Edition though, plus the issues with the power connector... It's probably gonna be a while till I get one. IF I get one.
@@KK-eg3em Yeah man, it's definitely the best GPU for this resolution. I used to get just under 50 fps in CP2077 with everything ultra and DLSS Quality on a 3090; now I get a solid 144 fps, the max my monitor can display, at the same settings. Plus it's paired with a 13900K, so there's that too.
This is me. Went from a 1440p 16:9 and a 3080 to the Alienware 21:9 QD-OLED with the 4090. I agree with the conclusion, but my thought here was also that the 4090 is way more future-proof than the 3080, and I love to max out every game with high fps. Therefore I'm not going for the 4K screens.
Ultrawide!!! Finally. No more doing math between 2160p and 1440p to guess about ultrawide. I'm really loving my 3080 12GB at 3440x1440; I'd not recommend it at 4K. A Plague Tale: Requiem was slaughtering the 3080 though, and the frame rate dips were painful. I had to manually set an 85% resolution scale because DLSS made walls and stuff jitter.
Thanks Daniel for also focusing on 1440p ultrawide, excellent video! Regarding "is the 4090 worth it?": I feel that many reviewers focus only on how the 4090 performs today, or this gen. What about my scenario, where I plan to skip the next gen of GPUs (I do a tick-tock approach: every 2 years I change either the CPU platform or the GPU)? So I don't really care if the 4090 today is limited by SOME game engines or by my 5800X3D, because over the 4 years I plan to use the 4090, game engines will surely evolve (also, the CPU I will change to in 2024 will use this GPU better). I do think that if you go for a top-of-the-line GPU like the 4090, it is reasonable to think you may skip the next GPU generation, so the limitations of game engines now are not the main point anymore.
Great point, I am doing the same thing. Spending more money now but not changing every year will still save me money. I got a 1080 Ti and was able to use it for the last 5 years, slowly lowering settings. Now I need to play on the lowest, so it's time to upgrade.
Really, planning for 5-10 years of use is pretty practical. A beastly spec now just means mid or low tier in a couple of years, while game requirements only continue to increase and get insane.
It still blows my mind that 1080p is the common res to game on. I will say I play at 1440p UW, and coming from an RTX 3080, my enjoyment has definitely increased. My monitor is 120Hz, so I have so much room for smooth gameplay now, with no more fps dips.
Good one, Daniel. I'm getting slightly better results in Cyberpunk at 3440x1440. I use RT Ultra with DLSS Quality on my i9-10900/3090 (both liquid cooled, not OC'd). FPS is locked to 60, and I've not seen any dips below that. I am also running my monitor at 144Hz and it's all very smooth. I do have a very clean OS, specifically for gaming... aside from my drivers, I am only running Steam and Afterburner.
Thank you, I thought 3440x1440 was more popular. I've had this monitor for three years now, used it with a 1080 Ti, and now two years with a 3080... I didn't plan to upgrade and still don't... However, the day before the 4090 release, my RTX 3080 10GB started artifacting just on the desktop, doing basic browsing :/ A year and a half ago I changed the thermal pads, because 110C on the memory ain't right... and a 30C drop was a big improvement after I used Thermalright Odyssey pads and Thermal Grizzly thermal paste. But even though I never mined, and kept the GPU undervolted while gaming, it still has issues. The problem is, since I opened the GPU myself, my warranty is gone :/ And it's even more sad that I overpaid for this GPU. I planned to use it for at least another 4 years... but artifacts keep appearing from time to time... I will probably go for the 7900 XTX this time...
You could open the card again and see if the pads make good contact, but given that it happens on the desktop and not under load, that should not be the issue. I assume you tried changing the DisplayPort output too; sometimes one of the outputs can fry. What else... try putting it back to stock voltages, or even add a little if the card lets you. Your warranty should not be voided unless they can prove you caused the damage, but that depends on where you live. I feel your pain, good luck!
I'm currently running a 3080 Ti at 3440x1440 and it's enough to run every AAA game out there at over 60 fps with (nearly always) maxed-out settings... the 4090 is IMO more for 4K 144Hz players and, of course, competitive gaming... anyway, thank you for making this comparison, I subscribed to your channel 👍🏻
I use my Windows computer only for gaming. (I use a MacBook, iMac, and iPad Pro for general computing needs.) Thus, the PC is in the living room plugged into the 77" LG G2 OLED. It's spectaculoso.
I recently returned to UW with an AW3423DW. I was rocking a 55" CX, so the size downgrade took a bit of getting used to, but keeping my OLED blacks and getting even higher peak nits for HDR is just great. I'm on a Strix 3080 10GB myself currently, but planning on jumping to AMD's 7000 series.
I recently got a C2, and gaming on it with a 3080 is awesome! But amazing monitors have been coming out lately, and I'm glad OLED is more readily available to gamers, so you can get the best of both worlds.
@@Bdot888 Yeah, I was going to go to a 42" C2 but decided to give UW another go. I sometimes miss the extra size, honestly, but since I only play at a normal desk distance, I think the 34" is probably better. The nice thing is I've kept my OLED blacks and gained extra brightness over my CX.
@@Zillawill I moved to OLED for gaming in 2020. I play at 1440p and not 4K for the better UI scaling. I've been using the LG CX 48", LG CX 65", LG C2 42", and LG G2 77" OLEDs in both my house and my dad's house, all 4 connected to PCs. Except for the LG G2 77", which I play at couch distance, I play on the LG CX 48" and the LG C2 42" at less than 35 inches from the screen, and on the LG CX 65" at less than 60 inches from the screen. I have a 38" ultrawide LCD for work, which is great for productivity, and I have been mulling the ultrawide AW3423DW OLED for gaming, but sacrificing the size difference is a big ask. I know that one gets an ultrawide for immersion, but a big-screen TV is immersive too, so I keep debating and remain ambivalent about making the jump.
Thanks man, great video! I've recently splashed out on the new Alienware 175Hz 3440x1440 screen, and I have a 3080 with a 5800X... I could basically double my frame rate. I have to decide if that's worth £1500+, lol. Maybe?!
You've won me over with this video. Currently I'm running a GTX 1080 on a 3440x1440 monitor (playing at 2560x1080) and I'm looking for an RTX upgrade. This video clears everything up for me.
How did you set the 2560x1080 resolution? I've tried pretty much everything in Windows and NVIDIA settings; even games don't allow that resolution for me.
@@frickzjee Unfortunately it's been a while since I had an NVIDIA GPU, but open up the NVIDIA Control Panel. I can't remember which, but one of the options on the left opens quite a long list of settings. Near the top of that list there should be an option called "Image Scaling"; after you enable that, you should get more resolution options in games, like 1224p and 1108p ultrawide.
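If anyone's curious where odd numbers like 1224p and 1108p come from: they appear to just be the native 3440x1440 multiplied by fixed scale factors (85% and roughly 77% fit the options named above). A quick illustrative sketch; the exact factor list and rounding behavior are assumptions on my part:

```python
# Rough sketch of where the extra in-game resolutions come from:
# fixed scale factors applied to the native resolution. The factor
# list is inferred from the 1224p/1108p options mentioned above.
NATIVE_W, NATIVE_H = 3440, 1440

for factor in (0.85, 0.77):
    # Truncating (not rounding) matches the 1108p figure above.
    w, h = int(NATIVE_W * factor), int(NATIVE_H * factor)
    print(f"{factor:.0%} of native -> {w}x{h} ({h}p ultrawide)")
```

That prints 2924x1224 and 2648x1108, which lines up with the 1224p/1108p options in the game menus.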
I'm currently on an Alienware ultrawide at 3840x1600, and it's probably the sweet spot for UW. Like most other commenters, I'll never go back to a regular 16:9 screen, and I have little to no interest in going 4K. I want that lovely wide view; it's just more appealing.
Tip for everyone: USE DLDSR and render your games above native first before considering a monitor upgrade. 2880x1620 looks damn good on a 1080p monitor, and now you won't feel like your GPU is being underutilized. Plus, 1080p monitors go up to 360Hz for less than $400; there's a $150 Dell 240Hz monitor with a 3-year warranty (they make Alienware). Supersample it, y'all, just do it. 1440p at 240Hz is around $500 for a cheap Amazon option and $700 for Alienware or ROG.
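For reference on how 2880x1620 relates to a 1080p panel: the DSR/DLDSR factors are pixel-count multipliers, so each axis scales by the square root of the factor. A small sketch; the per-axis rounding is my guess at how the driver derives the final numbers:

```python
import math

# DSR/DLDSR factors multiply the *pixel count*, so each axis scales by
# the square root of the factor. Exact driver rounding is a guess here:
# scale the height, then keep the native aspect ratio for the width.
NATIVE_W, NATIVE_H = 1920, 1080

for factor in (1.78, 2.25):  # NVIDIA's two DLDSR options
    h = int(NATIVE_H * math.sqrt(factor))   # 1440 and 1620
    w = h * NATIVE_W // NATIVE_H            # 2560 and 2880
    print(f"DLDSR {factor}x on 1080p -> {w}x{h}")
```

So 2.25x on 1080p is the 2880x1620 mentioned above, and 1.78x lands on 2560x1440.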
Conclusion: Daniel's analysis heavily discounts the future and focuses on present value with a high discount rate. Either way, this is the content this niche community needs, so thank you!! Note: these conclusions will change if you have a better CPU. Also, as games become more demanding, the 4090 will become more valuable. Further, as new CPUs release, such as the new X3Ds and beyond, the 4090 will again improve. Thus, at this moment, the 3080 is a great call at this resolution. However, 1-2 years (even as soon as a few months) from now, when newer CPUs can complement the 4090 better, the 4090 will continue to stretch its legs, even more noticeably as more GPU-demanding games release. Hence, the gap between these GPUs is highly likely to keep increasing over time.
So basically, if you have a 3080 and are content but want higher performance: wait. When the next-gen CPUs come out, sell the 3080, put the money toward a new CPU, then buy a 4090 or even a 4080 (who knows if that'll be CPU-bottlenecked too). You'll get increased performance and maybe a better price on either by then, or even an AMD card.
Counterpoint: very few people who are seriously shopping for a 4090 are overly concerned with 'future-proofing' beyond rationalizing the expense. For people who are chasing high-end performance but don't necessarily want to spend an extra $800 they don't have to, this is an interesting video.
@@TheGreektrojan True, and I agree, a fantastic video! I just did a new PC build and had to choose between the 4090 and the 3080 Ti, as well as the 13900K vs the expensive Ryzen 7000 series platform. I opted for the 4090 and will grab the new X3D in a few months. Hopefully I'll get long use out of the rig. So yes, I was concerned with future-proofing. I could have waited for the new AMD cards, but I guess I was impatient. That's on me!
OK 🤡 so just wait until 2024 for cheaper motherboards and DDR5 RAM, then get the new, faster CPUs with your cheaper 4090 that can actually use them, instead of forking over a premium price today for less performance today. There, destroyed your 'argument' 😂
I pretty much did this exact upgrade, from a 3080 10GB FE to a Gigabyte Gaming OC 4090, and I game at 3440x1440. Ultra settings in any game, and playing Cyberpunk 2077 is such a smoother experience. However, I will say in certain games like COD MW or MW2, the experience feels largely the same beyond 120Hz for me. Basically, I'm saying you still don't NEED a 4090; it's purely a luxury item for performance overhead.
It's easy to feel like you're missing out when new GPUs come out, but everyone's gaming needs are different, and I'm glad gamers have so many options nowadays to fit any type of preference.
Great video! I am surprised at how few of us use ultrawide monitors. I'll stick with my 3080 for now. The next upgrade will be a 4K OLED monitor and then a new GPU. By the time I have the budget, I suspect the 5090 will be out… 👍
I find ultrawide to be much more immersive, especially in single-player games, and a better experience overall than just getting more pixels by going to 4K. 3440x1440 at 120-144Hz really is the sweet spot in gaming, at least for me anyway. I don't think I will go to 4K ultrawide for a long time yet.
I would go higher res if it meant a bigger screen. I recently returned to UW with an AW3423DW and it's great, but I do wish it were more like the height of a 32", so when you're stuck at 16:9 it's still pretty big.
Agreed, I don't wanna go any higher because having consistently high fps is also very important and frankly that's just not possible in many games if you go higher resolution, no matter how good your hardware is.
Is there a similar performance comparison for 5120x1440? I'm interested in how these two cards perform at a 32:9 aspect ratio. Is it closer to 4K in terms of workload, or would the RTX 4090 still be CPU limited, making it overkill for that resolution?
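For context, I did the raw pixel math, which gives a ballpark for the pure GPU workload (it ignores CPU-side and other fixed per-frame costs, which is exactly where the 4090 tends to hit limits):

```python
# Ballpark GPU workload by raw pixel count; ignores fixed per-frame
# costs (CPU, geometry), so treat it as a rough guide only.
resolutions = {
    "1440p (16:9)":    (2560, 1440),
    "UW 1440p (21:9)": (3440, 1440),
    "Super UW (32:9)": (5120, 1440),
    "4K (16:9)":       (3840, 2160),
}

four_k = 3840 * 2160
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:>16}: {px:>9,} px ({px / four_k:.0%} of 4K)")
```

By pixel count, 5120x1440 is about 89% of 4K, so GPU-wise it should behave much closer to 4K than to regular ultrawide, but I'm not sure how that translates once CPU limits kick in.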
Finally, no black bars on the side. Keep it this way :) I'm running a 5800X3D with a 3070 Ti FTW3 on a 120Hz 3440x1440 ultrawide. No issues in most games I play; when I drop below 100 fps, I turn some settings from ultra to high. Not a big difference in visuals anyway. Update: 8 months later now, I got an amazing deal on a 1-year-old TUF 3080 12GB, essentially sealed and never used, for 450 euros, with 2 more years of warranty on it... Got the original receipt with it too; the owner paid 1200 euros for it in July '22, just to never use it for one of his projects. Omegalol. So I sold my 3070 Ti for 350 euros; not a bad upgrade for 100. To my surprise, games are so much more stable with 12 gigs of VRAM. CP2077 especially gets a huge boost: 3440x1440 Ultra graphics + DLSS Quality without RT hovers around 100-110 fps. RDR at full Ultra with all the ultra advanced settings gets like 90. I really love this upgrade.
As a 21:9 nerd, this is my comparison. At current prices, the 3000-series cards between the 3080 and 4090 (so the 3080 Ti, 3090, and 3090 Ti, as prices are much more user-friendly now) would be interesting, especially in terms of power draw here in Europe. I hope the 5000 generation will be developed more toward power efficiency.
Honestly, I doubt the 3080 12GB is gonna last that long. The performance jump with the 40 series is huge; that's not a minor jump like it was between the 10 and 20 or the 20 and 30 series. Also, 40-series cards are actually available and will be discounted in the next few months, as usual. And still, all we've seen so far are the "basic" 40s; we have yet to see the real beasts, the Tis. That said, if the 3080 struggles right now to run some games at UW 1440p with a stable 60 fps, it will only get worse with all the upcoming "next-gen" games. Meanwhile, the 4090 pretty much guarantees fluid gameplay (at least 60 fps) at ultra settings for at least 5-6 years up front, just like an OC'd 1080 Ti can still run 95% of games at ultra 1440p at 55-60 fps (without RT, obviously). If you can get like a $200 discount somewhere in the future, then I think the 4090 will be a great and safe upgrade that will run anything perfectly for years.
Would you mind listing your personal settings for CP2077 at 3440x1440? You said something about DLSS being better at one level with RT, and about only using RT reflections for better image quality... I'm not in front of my computer, but I have a Strix 3080 with a decent memory OC, an AOC CU34G2X, and a 5900X, and if I recall it averages around 100 fps with mid-40s minimums, which seemed optimal for adaptive sync. I was thinking about updating my drivers, but I also play Fallout 4, and the newer drivers tank that game's performance.
I heard you bailed on your ultrawide for a 4K OLED. I picked up the Alienware AW3423DW/F ultrawide OLED; time to ditch your 4K OLED. This thing is unbelievable.
Daniel, I play on ultrawide also, and my Alienware caps at a 120Hz refresh rate. I'm currently using an RTX 2070 that still serves me well. Generally speaking, I would upgrade every 2 to 3 generations, but I feel the 2070 had underwhelming performance even for its time... do you think an upgrade to a 3080 Ti would be worth it? I'd go for the 40 series, but the price is just too steep. I could also wait it out, pass on the Black Friday deals, and see where prices go from there. Would love to have your thoughts. Cheers.
I have a 3440x1440 144Hz monitor and I'm playing on a Ryzen 3600 paired with an RX 480 (couldn't justify upgrading my GPU these past 2 years with scalper prices after I bought my monitor). Been watching your videos for a week, and I have my heart set on a 3080 Ti.
I just got a 1440p ultrawide and idk what to get; it's either an RTX 3080 or an RX 6900 XT, since they are roughly the same price. I was thinking about a 3080, but after seeing this I don't think I'm gonna be using ray tracing anyway, since it makes games kinda unplayable at this resolution. So would it be better to get an RX 6900 XT for its higher rasterized performance, even though I would give up DLSS for FSR? (I have a G-Sync chip in the screen, btw.) Need your advice, guys.
Also, everyone should keep in mind that in the vast majority of games, going from ultra settings down to high gives a good performance boost with very, very little fidelity loss, almost indiscernible. You don't have to think your already badass card is obsolete because a better one came along. It also shows that even at this resolution, if you don't have one of the top gaming CPUs, there can still be spots of CPU bottleneck, making the $2k 4090 look even less worth it. IF you have a 3080 or 3090 and you want to upgrade something, how about one of the new QD-OLED ultrawide panels? My TV is OLED, and I can say I'd much rather have an OLED monitor at this point than more frames, and it would be cheaper. Then maybe see what the 5000 series (or AMD) has to offer. Just my 2c.
The 4090 perfectly matches the needs of VR :) :) (And I'm running super ultrawide, 5120x1440; the 3080 Ti holds up quite well... but it's time for a 4090 ^^) Thanks for your video ;) Cheers
Yes, it's worth it. I'm coming from a 3080 Ti, and I couldn't believe how much better Alan Wake 2 and Cyberpunk looked with RT on ultra settings on a 4080 Super. I have now sold those and am waiting on the 4090 FE to be delivered. I'll be set real well for several years now.
Great video. I have been umming and ahhing about moving to ultrawide for the more cinematic experience, and I'm using a 3080 10GB here, so perfect timing. Currently running an OC'd 8700K, which is doing OK at the moment, but I'm highly likely to move to a 13600K DDR4 platform soon. I think that should give me a relatively decent experience, as per the video.
Can I get some advice from the awesome community, please? The XFX 6900 XT is priced at $680. With my 5600X and 34-inch ultrawide, is it a smart buy? I don't live in the USA, and in my country high-end GPUs cost 2.5 to 3 times the retail price. I was hoping to get an RDNA 3 GPU, but my friend's flight is on December 24th, and I don't think AMD will have something in the $1000 range on the market by that time.
Thank you for the great breakdown! Would your conclusion differ if the ultrawide ran at 3840x1600 (LG 38GN950-B)? Not a tremendous number of additional pixels, but more than at 3440x1440.
If you have to choose between buying one or the other, instead of considering an upgrade, I'd say an RTX 3090 (Ti) would be a good option and would still save you half the price of an RTX 4090.
My 3080 TUF is on the shelf, while a 4090 TUF is in the PC. It's all about RT these days, and my main monitor is only 1440p/165Hz. You should revisit this when the RT Overdrive patch arrives for CP2077.
I've had an ultrawide for about 18 months now, and I don't see myself going back to 16:9. I just wish 38-inch would become more commonplace, because 34-inch feels just a touch too small.
I use a Samsung G9 Neo monitor, which is 5120x1440 at 240Hz. I came from a 3080 10GB and upgraded to the 4090. In many titles I've tested, while I do get noticeably higher frame rates, if the game doesn't have the highest fidelity, my 5950X is the limiting factor in the end. But in heavy beneficiaries such as Cyberpunk 2077, I can finally enjoy 90+ fps gaming without the many compromises I had to make with the 3080. For someone who suffers from motion sickness in first-person shooters, such as myself, the 4090 is a godsend (albeit an expensive one), alleviating my motion sickness by keeping fps over 90. While the card, with its price tag, definitely isn't for everyone, it has definitely helped me enjoy games more.
What happens in the scenario where your graphics card puts out more frames than your monitor can handle? For example, I have an ultrawide Asus 32-inch monitor at 100Hz. What if the card pushes out 130 frames? What is the negative effect, and what should my fix be?
Nasty screen-tearing artifacts appear. The mitigation is to turn on the Vsync setting in your graphics card's control panel and in game, which attempts to match the frame rate to the monitor's refresh rate, so it would cap frames at 100 if they go over, preventing the tearing artifact. Just make sure the monitor is Vsync compatible, which most if not all high-refresh monitors are.
@@bobdole3251 So I own the Asus 32-inch ultrawide monitor, and it does have G-Sync, but what I've noticed is that when G-Sync is turned on, the top portion of the monitor flickers from time to time, and I don't know what causes that.
With G-Sync: enable Vsync in the GPU control panel, limit the frame rate in the control panel to 3 fps below your refresh rate, and turn off Vsync and the fps limiter in game.
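That recipe works because the cap keeps the frame rate inside the G-Sync range, so Vsync never actually engages and just acts as a safety net. A minimal sketch of what an external frame limiter is doing, for illustration only (real limiters like the driver's use more precise timing):

```python
import time

REFRESH_HZ = 100          # e.g. the 100Hz monitor from the question above
CAP_FPS = REFRESH_HZ - 3  # stay under refresh so G-Sync keeps control
FRAME_BUDGET = 1.0 / CAP_FPS

def render_frame():
    pass  # stand-in for the game's actual per-frame work

for _ in range(1000):  # game loop
    start = time.perf_counter()
    render_frame()
    # Sleep off the rest of the frame budget so frames never arrive
    # faster than the cap, even if rendering finishes early.
    leftover = FRAME_BUDGET - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)
```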
I found it strange that the RTX 3080 12GB in the Cyberpunk 2077 RT Ultra comparison (8:11) doesn't draw close to 350W, even though it is 99% utilized. I checked the other very demanding games, like A Plague Tale, and there the RTX 3080 12GB is fine; in almost every other scenario where the GPU is fully utilized, it draws around 340W, which is correct. A CPU limit because of the ray tracing, even there? Just a theory...
I got a Samsung Odyssey G9 Neo on sale a while back. I'm looking to build a new system, and I was having this exact debate with myself. Do you think that having a super ultrawide would tip the scales definitively toward the 4090?
If you care about frames and play competitively: YES. I have the G9 240Hz monitor with a 3060 Ti, and it does very well in most games, but what sucks is that if I want higher frames, I have to drop to 1920x1080 to get max FPS. So I am definitely going to upgrade my GPU soon. If you're not like me, then I would say no; just get something like a 3080, or a 3090 at most.
Nice, but it would be good if those 4K monitors for PCs became cheaper. The lack of mass adoption is keeping prices high. At the moment I am considering buying a Samsung or LG 120Hz OLED TV in the future, instead of a PC gaming monitor.
One thing to look out for with the OLED TVs is that usually about one year after release they get heavily discounted to clear space for the next generation of TVs. So while I bought my 48-inch LG C1 for around $1300, I saw them discounted to around $800 (if I remember correctly) right before the C2 came out. Not that $800 is really that cheap or anything, lol.
@@danielowentech No, it's nothing to sneeze at 😁. I'm lucky enough to have a Micro Center within 15-30 minutes max and would likely go there to get that, but this inflation thing is really spoiling my fun 😂. Haha, I do appreciate the tips though! 😎
I think a lot of it has to do with people wanting max graphics at 4K while playing the latest and greatest games. That means updating the GPU every generation to stay on top of that quality. Most people keep their cards 4-6 years, so a 1080p monitor will go much further in letting that card provide good graphics for longer. Same with 1440p too. At least, those are my thoughts.
With a 4090 you can use DSR/DLDSR at 100% smoothness up to 2880p, which gives a much more detailed and clear picture in games where the emphasis is on the CPU.
I've had a UW since 2019 and recently upgraded from a 1080 Ti SC Hydro to a 3080 12GB FTW3 Ultra Hydro; no issues with either at 3440x1440. I just wanted to run newer games at high-to-max settings. In single-player games, a 60 fps average is acceptable in my opinion. Also, the 4090 is not worth it right now, especially at this res, when you can get a 3080 for less than half the price.
@@bitbat9 I mean, it's okay in single player; there are still some games on the latest consoles that run at 30 fps, but I run way above 60 fps. For reference (not a great one due to optimization, as they're CPU hungry), I have to lock my frames at 130 fps in both Destiny 2 and Division 2, which do not have DLSS. No, the 4090 is not worth $2k, or even a little less than that, in any way.
My Asus TUF 4090 OC absolutely hammers along driving my Samsung 32:9 super ultrawide monitor: super high FPS on Ultra graphics, and no need for DLSS in all games bar Cyberpunk. You won't regret getting a 4090, especially on super ultrawide.
This matches my experience at 3440x1440 going from a 3080 to a 4090. The 4090 clearly does much better, but it is also held back by the CPU and/or the monitor refresh rate. I'm on a 5800X atm. I paid £650 for my FE 3080 and £1700 for my FE 4090; it's not 3x better at 1440 UW :) Now I need to get a 4K UW monitor :) :)
I recently upgraded from ultrawide to an LG C2 42" 4K OLED TV. The image quality is INSANE. If you are thinking about upgrading from a 30XX to a 40XX, be sure to upgrade your monitor to OLED first. It's a much higher priority than "a couple more frames"; the visual difference is MASSIVE. Once you have an OLED monitor you will never go back. That said, the 4090 is the first GPU in the world that can actually drive the most demanding games at native 4K at 60+ FPS. So to get the maximum out of a 4K 120Hz display, the 4090 is a good fit. But the 3080 handles 4K just fine as well; it just needs DLSS/FSR or a resolution scale-down. In my opinion, instead of buying a 4090 you should spend the money on a 4K OLED, because most games will run at 120 fps on a 3080 anyway, especially 2D games or old games, but the OLED display will benefit ALL games. It's such a massive difference in color, contrast, and HDR capability... it's much more worth it than just having more frames in a handful of AAA games.
@@theplayerofus319 It's not. Inform yourself better. Displaying PC games on an OLED is in no way different from displaying the console version of the same game. And even for desktop usage with 8h a day of work, it's not a problem. See YouTube videos about it. Many people have been using OLEDs as monitors for 4 years now without any burn-in.
@@m4ko288 That's just wrong. Linus Tech Tips showed the problem... game UI that is constantly displayed causes burn-in, and I never said PC games are the problem. All games are, whether from PC or console. The taskbar burns in real quick.
@@theplayerofus319 Linus himself is using OLED at work and at home 🤣😂. Proof that it's not a problem. Also see here: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-QqVwlMmL4mw.html
@@m4ko288 He literally made a video called "Being an early adopter sucks - trying to fix my burn-in", hahaha. And about the TV in his living room he said: I'm careful to only watch movies etc. on it to avoid burn-in. So no, sorry pal. And you forget one thing... he is rich; he just buys a new one after a year, haha.
Thanks! I'm just wondering if replacing my 3070 Ti with a 4080/4090 would be beneficial for me, cuz in some games (like BF2042) I've got to tweak the settings to get 100-120 fps, ideally for my 120Hz monitor.
I've been on ultrawide 1440p 120Hz (Alienware AW3421DW) since I got my 3080 in early 2021 (so an early 10GB one), and I will keep this format as I'm an RPG/strategy gamer; I really love the immersion. Indeed, the 3080 (paired with a 5900X) is just capable of a stable-ish 60 fps in CP2077 and TWW3 (my preferred games) at close to max settings. I won't get the ridiculously big and expensive 4090, but I would get some RTX 5080 or RX 8800 XT in 2024, and maybe then switch to an ultrawide 4K OLED monitor, if they exist! 😁