I love watching videos like these. It's like going into a toy store when you were a kid, just looking at all the toys, knowing you're never going to own them.
@@lukasg2786 Yes, we can do that: just sell one kidney, eat nothing but beef jerky once a day, don't feed your family, and voila, suddenly you have all the money for a sexy arse gaming PC.
My computer with a 2060 Super and i7-9700F runs Odyssey fine, but for some reason Origins causes extreme stuttering, to the point that I gave up even playing it.
@@saly1957 At launch, probably not by much; as updates come out it'll steadily pull away, though, and ray tracing performance will probably be superior by a decent margin as well.
@Chaos Amv No, because they're being played with RT, and this card was designed with 60 FPS at ultra settings in mind at a bare minimum. So if they reached 60 FPS, then they're technically optimized at 1080p, since it's new.
@@stefanfilipov7254 That's not true. I went from a 24-inch 1080p monitor to a 24-inch 1440p monitor, sit maybe 2 feet away, and it's a big difference. 4K would look way nicer; 1440p looks plenty clear, it's just that some games without AA have nasty jaggies.
@@MrZodiac011 I see... I'm using a 32-inch 1080p monitor, but I sit about 1.5-2 meters away when gaming, and from that distance I find no difference between my 1080p monitor and a 4K monitor of the same size.
I know... screen tearing is not fine, but I have no solution to this problem 😕 For maximum FPS I used a second PC with a capture card (Elgato 4K60 Pro), and V-Sync/G-Sync are disabled.
@@BENCHMARKSFORGAMERS I have the AVerMedia 4K card. There is no solution for the screen tearing when recording with a capture card, even when recording at 100/120 FPS. In person there is no tearing; it's only in our videos, so it's not a big deal.
That's exactly what I thought: either he hasn't optimized the hardware completely, or he doesn't have a 144-240 Hz monitor. One of the comments mentioned he has a lower-end monitor, but I dunno.
Man, with this rate of computer technology development, it no longer makes sense to buy a new PC, because in a couple of months that configuration is already old... I love gaming, but those prices (at least in Bulgaria) are absurd... just ridiculous!
The problem with PC games is that the hardware is always evolving. Developers either use old engines or just can't be bothered to make use of high-end hardware (niche market). If, like on a console, they had a five-year development window, you would see some great optimized games.
@@chaobanh5003 I don't know if it's double or not, but I think you'll be able to play at 4K around 110-140 frames in most games, so it's good value for your money.
No, I used the same settings. And I know a lot of people say you don't need AA at 4K, but I need it for a fair comparison 😄 This video is very interesting (4K + AA) - ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-y1W9QfJbEz0.html
@@dnnyed1229 I think HWMonitor doesn't work, but HWiNFO64 or AIDA64 does. The overlay editor is very nice -> videocardz.com/newz/msi-afterburner-4-6-3-beta-and-rtss-7-3-0-beta-6-bring-overlay-editor-plugin
I'm still happy with my 8700K @ 4.9 GHz, GTX 1080 Ti, and 16 GB @ 3200 MHz. I'm playing at 3440x1440. Maybe I'll upgrade to a 3080/Ti. I think my CPU can run games for another 2-3 years without any problems.
I'm here to see what the 2080 Ti's performance is, knowing that I'm going to buy a 3070, which is 500 bucks and still faster than the $1200 2080 Ti lol. And it looks like I'll need a 1440p monitor 😂
@@abhishekks3422 I'm not really into FPS games that require over 144 Hz... The only competitive game I play is PUBG, and I'm not really playing that game anymore.
RTX in CoD MW just makes shadows more realistic... but it's buggy as hell. It can also be enabled on older cards like the GTX 1060. Considering the game is mostly competitive (meaning the more frames, the better), it's always disabled.
Spoken like someone who has clearly never seen or played in 4K in person! 4K makes everything clearer, textures are more detailed, not to mention higher resolution naturally fights off aliasing!
It's like I lost hope in humanity. There's no PC good enough to run these games with a satisfying result. Imagine: you pay for a 2080 Ti and a 10th-gen i9, you'd have to sell your kidney to buy them, and then boom, 29 FPS 😂
@@gyathan8516 The high-end CPU really doesn't make much of a difference. I bet you could get away with a way cheaper CPU and get almost identical results.
@@dustinbrite2422 So all the load is on the GPU, but even the best one now isn't that good 😂 I have a 960M 😂😂 My heart hurts when I remember that there's a 2080 out there 😂
Games are optimized way better on consoles. If Sony made something with these specs, it would destroy this PC even with the same components... well, no, because the game would then be made and optimized for them, but you get what I mean, right?
No matter how much power you have in a PC, it still just looks like an Xbox or PS4. PC game developers just flat out refuse to take advantage of all that hardware and instead cater to mid- to low-tier PC specs. So all you wind up with is a console game that runs at super high frame rates. Mind-blowing, right? That's why I'm ready for next-gen consoles. The developers know every console is a beast, so they won't have to hold back. Right now they kind of still do, because most next-gen titles on release will also be on last gen, but once last gen is phased out, games are going to start looking badass, and then so will PC games.
Is the GPU load in Warzone extremely high, or am I wrong? My GPU load is around 20% with a 2070 Super... Is there a way to make the GPU do more work than the CPU in Warzone?
Dude, is Fortnite finally optimized, or is it just that beast of a PC? (I haven't played Fortnite in about 18 months or more; right now it's July 31, 2020.) Really curious, as I'm about to build a PC with similar specs (same GPU and CPU, though).
Every game in the future should have DLSS, and it will, because the new consoles use a method like DLSS, which will also be used in RDNA 2 GPUs, so win-win. What a time to be alive!! DLSS 2 is the best deal in tech: it improves graphics and increases FPS.