I started learning computers in school using DOS 😅 then we got Windows 95 and we could finally play Oregon Trail, but with a 30 min computer class I could only ever get as far as starting my family, and then most of them would die of dysentery on the way to the first river and the rest drown after my wagon crashed 😢. Then Windows 2000, and XP shortly after, was all we had for the rest of my time in school 😅
@@mrlightwriter do you two think the 3080 could do the same? That's what I have, and I'm coming from an AMD graphics card. Still getting used to Blender and UE 5.3
Probably FIFA or Fortnite. With magnificent 19% gpu utilization peaks. I swear optimization is not devs' priority anymore. The game only needs to run up to the part where you upload your credit card info
I'm so glad I'm not the only one thinking this. I remember back when I was getting into PC gaming it was all about "can it run Crysis 3?" That was like the ultimate benchmarking game.
To be fair, it did take days to render single frames of animated movies with path-traced lighting just 20 years ago. So the fact that this card can even do it 18 times per second just shows how far the power of computers has come.
@@soundwavesuperior6761 eh, they optimized it a lot. It's slow because it has full-on path tracing now, which is a feature that's arguably ahead of this generation of hardware's time.
@@soundwavesuperior6761 Crysis tried to be ahead of its time, but it chose the wrong timeline. It focused more on single-core CPU performance, which didn't turn out to be where the technology really went.
@@Arsoonist you can't deny the MX160 I have IS worse than yours. Man, I love spending money on GPUs that my integrated GPU beats in long gaming sessions because it doesn't take that much RAM to run (the mx160 takes 4GB of RAM from your PC), and I can play fairly smoothly in many games
@@lywoh I mean yeah, it would give me very good fps, unless it was cooled with a single fan the size of one of my balls(my balls ain't massive, the fan is small). It does better in the first 10 sec only to throttle and perform shit, so I can't even properly use it.
@@Arsoonist bro same, except I have the mx330, which is *barely* faster than the integrated Iris Plus graphics. I have another laptop with an mx550, which is sort of a decent chip
Actually kinda true. They know that if they were just competing on plain rasterization, they would lose every single time vs AMD. They had to manufacture their own problem to fix.
@@asd-dr9cf this is beyond false. They understand that we can no longer just keep using rasterization. Also, hardware RT was not started by Nvidia; they just made it possible to do. The gaming industry had already been on the ray tracing train years before the 20 series was a thing. If Nvidia still wanted to do rasterization they'd still blow the competition out, but what's the point of keeping on with rasterization if it isn't actually going to make different rendering techniques useful?? Everyone says they had to manufacture their own problem, yet Nvidia is the only GPU manufacturer doing this, and for years after the 20 series everyone has been following in Nvidia's wake and is still losing to Nvidia.
@@MrTasos64 no, you don't understand a damn thing about the rendering techniques being used. Path tracing a game and doing it efficiently in real time is taxing. These technologies are taxing because they are being done in real time, unlike in the cinematic field, where they render on clusters and not in real time.
So what's a good computer by comparison? Enlighten me. This is one of the most enraging shorts. Like: "your Ferrari is pathetic, it can't even come close to pulling 1200mph." Sooooooo what can?
I work with Unreal, and rendering a 1920x1080 frame with path tracing takes about 90 seconds on an RTX 3060. So I'd say 18fps with path tracing is insanely fast...
I'd rather have smooth framerates over some subtle lighting effects any day. This is why I like indie games and older games: 60+ FPS full-time with zero hitches on many of those games.
my Chromebook: I am not pathetic I am a being stuck in time and space that refuses to progress, I am a bastion, a fortress, a stronghold! against the new
I mean, GPUs are already so good. Of course people's standards change, and if you compare any top-of-the-line graphics card to something it can't achieve, of course it will look pathetic. It's like comparing a millionaire to a billionaire. It doesn't mean the millionaire isn't good, it just means we have high standards for judgment.
@@Xenor999 RTX is the buzzword Nvidia came up with so as not to call it "path tracing," but it's a scaled-down version of path tracing intended to make real-time "ray tracing" possible. Path tracing is the whole shebang that render engines like Cycles, V-Ray, etc. use to make animations and photorealistic renders.
Obviously it’s a test and the fact that the 4090 can even get 18 FPS is amazing. We are far from native path tracing. But they are trying to show how DLSS can help improve performance
@@klippo1235 If you have a 144hz monitor and your 4090 is only putting out 115fps, then it's a waste of money and is indeed pathetic. With a 144hz monitor, you expect to see an expensive flagship card do 144fps.
@@luminatrixfanfiction like people surely did with Crysis on new hardware at that time? lol. The RED Engine is extremely poorly optimized, and games are getting more advanced than our hardware. Thus, DLSS
As a 3D artist I agree, even 1-2fps for 100% PT is awesome, especially in such a complex game world. Normally the RTX 4090 can still take a minute or more (0.016fps if it's 1 min) to render a clean single 1080p frame using normal path tracing.
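The parenthetical math in comments like this is easy to check with a throwaway Python snippet (the 1 min and 18 fps figures are just the ones quoted in these comments, not anything measured here):

```python
# Converting render time per frame to fps, and comparing offline vs real time.
# The 60 s/frame and 18 fps numbers come from the comments above.

def fps_from_seconds_per_frame(seconds):
    """Frames per second given the render time of one frame."""
    return 1.0 / seconds

offline = fps_from_seconds_per_frame(60)   # ~0.0167 fps, i.e. the "0.016fps" above
realtime = 18.0                            # fps quoted for the 4090 with path tracing
speedup = realtime / offline               # how many times faster real time is

print(f"offline: {offline:.3f} fps, speedup: {speedup:.0f}x")
```

So the quoted 18 fps is roughly a thousandfold gap over a one-minute-per-frame offline render.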
To be fair Minecraft isn't the best optimized game. I have an RTX 3050 16gb and it can run most modern games at over 100fps and the crazy thing is I bought it for only $700 off Amazon.
@@GREX-ve2wx that's not related, but the iPhone can run a bunch of high-quality games, so if CSGO does get phone support, yes, it probably would be able to run it
(I don’t know computers well) about 2 years ago I got a 3060 and never realized the fans weren’t moving, and now I don’t know how to make them spin PLEASE HELP
It’s okay, but the amount of expectation I had really left a sour taste. I also think the game should be better instead of being a “graphics bench”. Beautiful game, but everything else is ehhh
Well, it sounds like he's saying that your computer sucks because a 4090 can't run Cyberpunk over 30 fps without having some RT Overdrive/DLSS feature turned on. Yet he also says the opposite, that not a lot of computers can use said new feature. Confusing
Ah shit, my integrated gen 7 intel Core i7 on a laptop with a broken fan and cracked screen can’t run Cyberpunk on the highest settings? This is a fucking tragedy.
that graphical feature is a DX13 standard feature, so seeing this in a DX12 game truly means the next physics platform is going to revolutionize the graphical quality of games. very good news for gamers.
@@k1lez ? You don't sacrifice one part of a game for another part unless it's an indie game. Triple-A studios have legions of employees dedicated to the story, "graphics", and programming of the game, who work in unison but independently of each other. Whether or not the storyboard is good has nothing to do with the guys working on the 3D models and texturing.
Bought a very high-end PC back in late 2013, and it has done me well this past decade. I'm in no rush to get another PC when I mostly just play on the PS5 nowadays, or WoW and World of Tanks on PC.
@@hugopereira5640 those people don't know anything about computer graphic It will take at least 20 or even more years for computers to become powerful enough to run pure path tracing in real-time at 60fps
@@grass_rock Well okay, so RT really has been a gimmick so far. Got it. Seemed kinda obvious too that sacrificing half of your FPS to make graphics look shinier has been a stupid idea ever since RTX Graphics first came out tbh.
It's really weird that everyone thinks the GPU is the only thing that can impact your FPS. Your CPU and RAM and motherboard matter A LOT to this process, especially in heavily multithreaded games like Cyberpunk.
Eh, I don't think the pathtracing and graphics stuff is running on the cpu. yeah mobo might matter for pcie speed. The higher res you go and the more graphics options you enable, the more it relies on the graphics card. But yeah the other stuff matters too. I don't really care about fps past 120 or 60 depending on the game, as long as it doesn't stutter.
Higher-res images and RT are all done on the GPU. All that happens on the motherboard and CPU is arranging those images in a way you can see, and even then, modern changes to that pipeline try to limit the back and forth between the CPU and GPU. So you would be wrong. Now, if you want more fps and less res, then the story is different and the bottleneck can become your CPU, though I believe there is a bigger push for latency reduction rather than higher fps. So it just depends on what you play.
Literally doesn't mean it's pathetic???????? It's just that rendering things with such detail and precision takes A LOT of power. It's not pathetic, it's genuinely super fucking impressive that we can even sort of run it in real time
@@BerryTheBnnuy I mean, it's about as optimized as it can be right now. Path tracing algorithms running in real time is a fairly new development, and previous path tracers could take minutes to render a single frame. It will take the development of new rendering techniques and mass adoption of more powerful hardware for this real time path tracing to be anything more than a tech demo though
Ray tracing means that from the source of the light, a bunch of rays come out in all directions, and instead of there just being light, it actually measures where the light goes to give you accurate lighting and reflections. Path tracing is just that on steroids: it tracks where the light goes after it bounces, so you get accurate refractions, which is even more realistic
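For anyone curious, the bounce-and-average idea behind path tracing fits in a few lines of toy Python. The scene here (one gray sphere under a uniform white sky, traced from the camera) is entirely made up for illustration — real path tracers importance-sample, handle materials and light sources properly, and run on the GPU:

```python
import math
import random

# Toy path tracer: one diffuse sphere at (0, 0, -3) lit by a uniform "sky".
# Every name and number below is invented for this sketch.

def sphere_hit(origin, direction, center=(0.0, 0.0, -3.0), radius=1.0):
    """Return distance t along the ray to the sphere, or None on a miss."""
    oc = [origin[i] - center[i] for i in range(3)]
    b = sum(oc[i] * direction[i] for i in range(3))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-4 else None

def random_direction():
    """Uniform random unit vector (drives the diffuse bounces)."""
    while True:
        v = [random.uniform(-1, 1) for _ in range(3)]
        n = math.sqrt(sum(x * x for x in v))
        if 0 < n <= 1:
            return [x / n for x in v]

def trace(origin, direction, depth=0):
    """Follow one light path, bouncing until the ray escapes to the sky."""
    if depth > 4:                       # stop runaway recursion
        return 0.0
    t = sphere_hit(origin, direction)
    if t is None:
        return 1.0                      # ray escaped: the sky emits 1.0
    hit = [origin[i] + t * direction[i] for i in range(3)]
    normal = [hit[0], hit[1], hit[2] + 3.0]   # hit - center, unit for radius 1
    d = random_direction()              # diffuse bounce into the outward hemisphere
    if sum(d[i] * normal[i] for i in range(3)) < 0:
        d = [-x for x in d]
    albedo = 0.5                        # sphere absorbs half the light per bounce
    return albedo * trace(hit, d, depth + 1)

def render(width=8, height=8, samples=32):
    """Average many random paths per pixel -- that's the 'path' in path tracing."""
    img = []
    for y in range(height):
        row = []
        for x in range(width):
            u = (x + 0.5) / width * 2 - 1
            v = 1 - (y + 0.5) / height * 2
            d = [u, v, -1.5]
            n = math.sqrt(sum(c * c for c in d))
            d = [c / n for c in d]
            row.append(sum(trace((0.0, 0.0, 0.0), d)
                           for _ in range(samples)) / samples)
        img.append(row)
    return img
```

With few samples per pixel the sphere comes out noisy, which is exactly why real-time path tracing leans so hard on denoisers and upscalers like DLSS.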