Тёмный
No video :(

Frame Generation 

Daniel Owen
Подписаться 195 тыс.
Просмотров 104 тыс.
50% 1

FSR 3, DLSS 3, AFMF, and even ExtraSS Frame Generation technologies are making headlines, and a ton of excitement is being generated. However, there has also been a lot of pushback against frame generation technologies with dismissive comments about "fake frames" being common since the Nvidia 40 series launch. What is the current state of these technologies? Are we ready to have a more nuanced discussion about the past, present and future of frame generation technologies?
Sponsor my channel monthly by clicking the "Join" button:
/ @danielowentech
Donate directly to the channel via PayPal:
www.paypal.com...

Опубликовано:

 

29 авг 2024

Поделиться:

Ссылка:

Скачать:

Готовим ссылку...

Добавить в:

Мой плейлист
Посмотреть позже
Комментарии : 1 тыс.   
@IhsanNurhidayat
@IhsanNurhidayat 8 месяцев назад
Problem with DLSS frame gen is that Nvidia use it as a performance metrics to compare it with previous gen in their marketing materials for the 40 series.
@TheShitpostExperience
@TheShitpostExperience 8 месяцев назад
This pretty much sums it up. When it was first announced by nvidia (before 4000 series) both DLSS and frame gen were cool tech that could help people with lower end or old cards. Then these features get gated behind the new card models. Then these features get used for benchmarks to compare performance with previous cards. It's not that people don't like DLSS or FG, they are super cool features, the issue is that nvidia is only using this features for marketing their new cards instead of generally improving the hardware itself, and gating the tech which allows users to play new games.
@nossy232323
@nossy232323 8 месяцев назад
@@TheShitpostExperience Frame generation is not for low end cards, as you already need a reasonable good framerate to use it.
@user-lh1wr9sr8m
@user-lh1wr9sr8m 8 месяцев назад
The problem with frame gen was that it was only available to Nvidia at first, and then even then, of that subset it was only available to those who had forked over the cash for a 40 series card. People are incredibly chauvinistic, especially on the internet concerning anything even remotely brand oriented. It's pretty sad. I guarantee the majority of the people poopooing FG in the beginning had never even used it before. Just like the majority of people shitting on upscaling had only ever used FSR 1 or 2 and then wrote it off.
@rolux4853
@rolux4853 8 месяцев назад
Yes that is insane and completely muddies the water! Another reason why I stepped away from Nvidia and bought a 7900 xtx.
@1vaultdweller
@1vaultdweller 8 месяцев назад
And Nvidia's annoying persistence to put inadequate amounts of VRam to their gpus, which results in hardware with short life spans
@Spright91
@Spright91 8 месяцев назад
There was a negative reaction before because people couldn't afford it. And its positive now because fsr 3 has allowed everyone to have it.
@TTx04xCOBRA
@TTx04xCOBRA 8 месяцев назад
this
@LeegallyBliindLOL
@LeegallyBliindLOL 8 месяцев назад
Yup, I remember calling it that AMD users would suddenly be for the technology as soon as FSR 3 releases, yet keep hating on "fake" frames when DLSS FG was released
@NostalgicMem0ries
@NostalgicMem0ries 8 месяцев назад
not just this but also first versions of frame gen was buggy, and like dlss it improves and will improve to point that it will be no brainer to use it if you dont see diference or diference is so low that almost unnoticable
@RuteFernandes-qs6pc
@RuteFernandes-qs6pc 8 месяцев назад
@@LeegallyBliindLOL Because only AMD users have been paywalled out of FG, 2000 and 3000 series users looking a bit red these days huh
@A1D4Nk
@A1D4Nk 8 месяцев назад
I hated frame gen before mods put it on my 3080, still not a huge fan of it but this sentiment is pretty accurate lol
@finnenhawke
@finnenhawke 8 месяцев назад
I have 3080, I originally tested FSR3's FrameGen in Forspoken demo and I really didn't like it. It felt a bit stuttery, uneven and unclear. BUT then I followed your last video about the FSRxDLSS FrameGen mod and modded in the FSR3's Framegen into Starfield and... holy hell, not only it worked, but it actually performs amazingly there, and looks great coupled with DLSS. It's super smooth and I don't notice any artifacts and issues. For a single player game like that it's awesome and it boosts the FPS nicely in the unoptimized places, making the native framerate drops in cities basically invisible. A bit higher input lag, sure, but then again, I don't need super reflexes in Starfield ;-)
@xerxeslv
@xerxeslv 8 месяцев назад
In my experience ~80fps is like perfect baseline fps for FG introduced latency not being much noticeable, basically at ~80fps+FG it feels like ~70(60+) but looks like ~160 And yeah, that was my exact thought after Forspoken, like it was usable in some places with high base fps but still felt so-so. But now, when you can enable it together with DLSS... I was surprised how well it works. Capped fps in cp2077 at 80, so GPU have some "free time" and now I have nearly stable 160. Another thing is, it seems when I am not making rapid movements in game, visual smoothness makes my brain fool itself in believing that game is more responsive than it actually is. Like say when driving and making just slight adjustments or simply running, it feels almost like I am getting real 160FPS(low latency), only on sudden direction change it is noticeable a bit that the game is not really doing so much FPS as it looks like, but still feels great.
@SebbS82
@SebbS82 8 месяцев назад
@@xerxeslv I think it too. At ~80 FPS and higher is the best way for Frame Gen.
@themodfather9382
@themodfather9382 8 месяцев назад
I tried the Forspoken demo based on your comment and I really regretted it. If this is what modern games are, then I might as well just sell my GPU and play solitaire, which I'm going to play right now.
@leeli4692
@leeli4692 8 месяцев назад
I felt the same way in Cyberpunk. Now I play the game in PathTracing
@philth7587
@philth7587 8 месяцев назад
I honestly felt that the frame gen in Starfield really struggled with fine detail like grates. So much so that I found it distracting and turned it off. This was using, from memory, DLAA at 4k. Maybe it's gotten better since though. I've been using it in Alan Wake and feel it's really good on the whole.
@lucasrezende7214
@lucasrezende7214 8 месяцев назад
A benefit from Frame Generation I don't think I've ever seem mentioned is it allows for games with framerate dependent things like Skyrim's physics system to rum "over" the intended framerate without breaking those systems.
@skorpers
@skorpers 8 месяцев назад
Frame rate derived physics velocity is considered a shitty practice and Bethesda is the main studio who's negligent enough to make them after the 90s
@lucasrezende7214
@lucasrezende7214 8 месяцев назад
Yeah, I'm not saying it's a good practice, I'm just saying framegen allows you to go over 60 fps in those games without messing up the physics.@@skorpers
@kenneththomson3123
@kenneththomson3123 8 месяцев назад
If you know how to mod your game use SSE Engine Fixes and SSE Display Tweaks to unlock the framrate in Skyrim AE
@juanpaulofricke1506
@juanpaulofricke1506 8 месяцев назад
yeah but the mod is buggy and can cause issues @@kenneththomson3123
@0M0rty
@0M0rty 8 месяцев назад
True, it could be a great tool to get around framecaps, physics issues or other engine limits (like GTA V stutter at what is it, 187fps?). Sadly I don't see many high quality backports/mods into older games that need it.
@xShikari
@xShikari 8 месяцев назад
A game needs to reach 60 FPS by default, then FG is cool for singleplayer. In "competitive" multiplayer however "real" frames are worth more, because they actually respond to input. Generated frames are only visual, they don't have anything to do with input. Edit: added the word competitive
@paul2609
@paul2609 8 месяцев назад
Most people play singleplayer games anyway
@emma6648
@emma6648 8 месяцев назад
@@paul2609I somewhat doubt that nowadays
@taroushi
@taroushi 8 месяцев назад
why does the game need to hit 60 fps to enable frame generation ?
@IceBreakBottle
@IceBreakBottle 8 месяцев назад
No they don't haha @@paul2609
@Stardomplay
@Stardomplay 8 месяцев назад
I'd also argue that frame generation in single player games enhances the experience by increasing depth to the overall presentation by providing more information on the screen. It was a much greater experience driving around the most demanding areas of cyberpunk where I traditionally got lower frame rates; the drastic increase in frames allowed me to view many details of the city all at once which translates to x percentage increase in frame rate to additional detailed frames per second.
@f.miller801
@f.miller801 8 месяцев назад
FG should be an afterthought not a requirement.
@themodfather9382
@themodfather9382 8 месяцев назад
Agreed, but unfortunately it's not really up to us how these companies decide to implement it...
@Coconutgl
@Coconutgl 8 месяцев назад
The latency is around 1 frame time more with frame gen, so if the native fps is high, the frame time is low so the latency is relatively low. I think if you use frame gen and lock the max fps, it can save power so it maybe a solution for handheld gaming. If the native fps is 40, the latency per frame is around 25ms, it is a bit lower than cloud gaming but the effect is similar, some people feel ok some not, but this is very good for us to choice this feature.
@fourtii8707
@fourtii8707 8 месяцев назад
Fair point friend
@ChristianKoehler77
@ChristianKoehler77 8 месяцев назад
Something similar to frame extrapolation is widely used in VR, it is called asynchronous space warp (ASW). The images in the headset have to follow head movement very quickly to avoid motion sickness. For many people native, locked 60 FPS is too slow in VR. The problem usually disappears at ~80-90 FPS. ASW helps with that. If you move your head, it will take the last frame, shift, rotate and zoom it according to your head movement and apply some perspective correction -magic- math and use that as the next frame. This is much quicker than a traditional render. It is very common to play a demanding game like MS flight sim in VR at a locked 45 FPS and use ASW to bring that to 90. (Or turn 60 FPS into 120 if your computer can handle it). The VR headset for the PS4 had a "controller box" that even contained special hardware for that task. We don't have that for PCs or the PS5 because modern GPUs can do it with ease. This will not smooth out motion of other players. It will also not speed up hand movement to image latency, there is no advantage in games like Beat Saber. But head movement to image latency is greatly reduced. It seems logical to use something similar to reduce mouse to screen lag. Looking around with a mouse on a flat screen game is very similar to looking around in a VR game with your head. If you push the button, the bullet won't come out faster. But it would greatly reduce the "rubber band effect" that you get when looking around with a mouse at 30 FPS.
@abeidiot
@abeidiot 8 месяцев назад
VR uses frame extrapolation while DLSSFG and FSR3 use interpolation. It has a lot more artifacts than interpolation, because it needs to 'guess' the next frame without reference while interpolation has 2 reference frames and tries to find something inbetween. It can work with head movements in VR, because they are slow, but doesn't produce good results with mouse
@jordongee2347
@jordongee2347 8 месяцев назад
I agree with the sentiment, manufacturers are going to provide less card fkr the dollar, relying on FG to boost to "appropriate" performance, and game devs will falter to working less to make the game run acceptable without it.
@biIl
@biIl 8 месяцев назад
I tried FSR3 in CP2077 and AW2 and I'm not really sure if I would use them. I like high framerates because of the 'snappy' feeling and FG doesnt give that. If I have a base framerate high enough that the input lag isnt noticeable, then I feel its also high enough that FG is kinda pointless. 120 real frames already looks and feels great. I dont need to get marginally better smoothness at the cost of input lag and more artifacts.
@acetone6117
@acetone6117 8 месяцев назад
You need to know that it is not for people who are already getting 120 fps in games.
@fernandochapa1433
@fernandochapa1433 6 месяцев назад
@@acetone6117but then who is it for? If you’re getting 30fps with FG you’ll get 50fps visually but 20fps on feel. And if it only makes sense in a 80fps base scenario then dont you think that’s an ultra small market?
@biIl
@biIl 5 месяцев назад
@@acetone6117 there is no framerate where I would use framegen. it worsens responsiveness at every level
@Wheatthin21
@Wheatthin21 4 месяца назад
@@biIlI disagree, the latency is not noticeable to me in CP2077, although I was a getting 80 fps prior to it
@invisibleevidence1970
@invisibleevidence1970 3 месяца назад
​@@fernandochapa1433if u get 30 FPS and 50 FPS through frame generation, the feeling is 50 FPS . The amount of down play u guys doing to FG.
@DonaldHendleyBklynvexRecords
@DonaldHendleyBklynvexRecords 8 месяцев назад
I think the "avg" user gets into the game and doesn't really notice as long as that base FPS is snappy, most times it's only a thing when I'm not really into the game
@OnBrandRP
@OnBrandRP 8 месяцев назад
Hell, I do frame gen with Remnant 2 now and I literally cannot notice the difference in Latency at all, and get 144fps locked. I never drop below it. I went from stuttering at 60's and 80's, to 144 and no input delay thats noticeable. As long as the frames you generate can cap above around 90hz, you won't really notice the delay. The biggest issue is artifacting and blurring caused by new frames generating but becoming anomalies to the OG image.
@DonaldHendleyBklynvexRecords
@DonaldHendleyBklynvexRecords 8 месяцев назад
@@OnBrandRP yea I can definitely see why ppl are pulling away from Digital Foundry on games they actually want to enjoy, looking behind the curtains is fine for understanding gaming and flaws etc but once you focus on the mechanics and flaws instead of experience or even letting the flaws instead most the experience then the game is already a fail vs firing it up and seeing if the games good as a whole, plenty games have gotten a bad wrap due to focus on mechanical flaws
@BourbonBiscuit.
@BourbonBiscuit. 8 месяцев назад
upgraded from a 3060ti to a 4070 for frame gen, 3 months later and I never use it. Played lots of game with it too recently but it has too many compatibility issues with things like HDR, G/V Sync Etc. and different games require different setting to make it usable and a high frame rate to begin with. NEEDS WORK.
@d.d5619
@d.d5619 8 месяцев назад
4070😂 get a 4080/90
@mineguto211
@mineguto211 8 месяцев назад
bro who bothers to spend so much on a gpu? nvidia scam pricing@@d.d5619
@aasdgnxcv4504
@aasdgnxcv4504 8 месяцев назад
@@d.d5619 Name three good games that warrant the purchase of those GPUs.
@piper67890able
@piper67890able 7 месяцев назад
@@aasdgnxcv4504 anything you want to play at 60+ fps 4k lol. yall broke boy mentality people quite literally never think for 2 seconds. Every triple a game this year requires a beefy gpu to play at 4k high. If you play fucking 1440 or don't use VR don't buy a expensive card. I have a 49in ultrawide, a 4k tv and my index to play on. yeah I'll fork over whatever I need to play at stable frames. Invest in your setup or shut up
@DaBrain2578
@DaBrain2578 8 месяцев назад
I think Intel has the best idea. Instead of interpolating, they extrapolate frames. Solves the latency issue. It has it own set of issues, but the direction makes sense.
@rakon8496
@rakon8496 8 месяцев назад
"Renderframe" vs "Interpolationframe" may be the most suitable/descriptive terms. 💙
@adriankoch964
@adriankoch964 8 месяцев назад
The negativity about FG is 90% NVIDIA's fault: - The lock of FG to 40 series is probably just a sales tactic, since they sat on a literal mountain of 30 series chips at the time of 40 series launch and needed a tangible gimme for gamers and it's not the first time they claimed something was hardware locked only for people to dig and find an easy way to make it work on their excluded cards that "lacked hardware support", most recent example: NVIDIA Broadcast launch version working on 1060 cards when they claimed it used RTX specific cores. - 40 cards being so overpriced at launch, so the 30 card mountain would eventually be sold off, essentially making FG a $500+ feature that would be really useful on a $250 card. - Scummy NVIDIA graphs and naming that bundled DLSS3 & FG into one thing, while it is two clearly two separate components as people digging have found out (you can combine DLSS with AMD FG for example) - Scummy NVIDIA marketing with their DLSS3=FG slides comparison vs older card comparison for a feature that just~20 games out of the box support a year later. The other 10% is AMD's fault with the first few games having a crappy launch implementation, essentially pre-murdering their hype with the first two games being wonky due to using outdated FSR versions.
@KyoukiDotExe
@KyoukiDotExe 8 месяцев назад
Biggest issues are that they are dependent on implementation to be good or bad. Next to having a varied of each of the tech being available per game... Yes, developers can put minimal time into it to put it in. However making it look and perform well takes time and effort. DLSS in a lot of games suck on my 3080 but when I tried FSR3 on Avatar it looked amazingly because I feel like that game is really optimized to show off what FSR3 can look and feel like, and you do get more frames and played awesomely. TAA being enforced because of it, kind of sucks for those sensitive to blur, like my own eyes. Which is often forgotten that some people may not be able to use it. Generally speaking do think because little amount of people got access to it and it being limited to only one of the latest and MOST expensive generation of graphics cards made a lot of people upset because of course they would like this as well. I still would like more raw pixels, no TAA being enforced, being generated like old school traditional rasterzation performance. Games already became blurrier because of deferred rendering engines vs forward rendering engines we had before. Personally I think RT is really cool tech, but the offset performance for that visual trade is still too much. I don't want to discredit these technologies but they in essence are bandaids to overcome the lack of performance we can get out of games. I don't mind if a game looks a bit worse to make it play well, it is a game. Not a movie. and I feel like we lost that touch by making it too much movie-like experience.
@snj7502
@snj7502 8 месяцев назад
You really don't need to yap so much to make a couple of points and plus fsr3 is open source and can be modded into most games
@KyoukiDotExe
@KyoukiDotExe 8 месяцев назад
Also didn't need to reply on my yap. I am aware of both those points, so what is your point?
@maldazzararr9603
@maldazzararr9603 8 месяцев назад
You are not seeing the whole picture. Is FG a good tech? Kinda. It's not bad in itself. However release of DLSS and FSR excused a lot of publishers to push unfinished and unoptimized games. This is exactly the situation that FG will cause in the future and we will be told: "Just turn frame generation ON". Already there are games that turn upscaling ON by default. The games look worse, we are getting lower frames overall and we are being told to upgrade.
@bubbleaddict
@bubbleaddict 8 месяцев назад
Very rational video Daniel. You hit everything I like and dislike about frame generation. It is a net positive, and this is just the beginning. Drivers and frame generation will develop and mature.
@SebbS82
@SebbS82 8 месяцев назад
Many people don't understand the technology behind it and only see the results.
@cronoesify
@cronoesify 8 месяцев назад
I use frame generation all the time bros. Its on my TV, and its what makes quality mode on a console bearable. The fact, though, that we are using this on our PCs is proof that 2012 will never return. Moore's law is not just dead, bros. It's extinct.
@themodfather9382
@themodfather9382 8 месяцев назад
Do you mean it's a technology built into your TV? Which exact model is it? Also, a lot of this frame gen shit is focused on making the games easier to run at 4k (caused solely by the TV market) and with RT enabled. So I'm not sure if you can blame moore's law exclusively, without those 2 things we'd still be cool.
@cronoesify
@cronoesify 8 месяцев назад
@@themodfather9382 it's just a joke talking about my TVs motion interpolation feature. It's a 2022 Samsung QN90B, so it's at least high quality enough that it interpolates the frames without adding much input lag. Exactly what frame generation is all about. The tech behind nvidias solution may not be 1:1, but the premise and effect is largely the same. And as you said, this is being done because Ray tracing at 4K is just not very tenable without it. But there was the expectation that we would have been there by now, yet haven't. And that's because increases in raw computing power hasn't been scaling the way it had in the past. That much is undeniable.
@MrDs7777
@MrDs7777 8 месяцев назад
An rtx 4090 has 76 billion transistors. Moores law is alive and healthy.
@tiagoalvarez7092
@tiagoalvarez7092 8 месяцев назад
I tried the DLSSG to FSR3 mod with a bunch of games and it works amazingly well. Some games got some HUD issues tho, most notably Ratchet and Clank, but the majority of them it works really good.
@SebbS82
@SebbS82 8 месяцев назад
Nice work from AMD but Frame Generation is the better technology with the Optical Flow Engine which does not need the CPU to generate the frames.
@slowlymore2
@slowlymore2 8 месяцев назад
@@SebbS82Doesn't FSR3 use Async compute on the gpu, hence why it's not recommended for older cards like pascal / RDNA 1 cards?
@xerxeslv
@xerxeslv 8 месяцев назад
​@@slowlymore2Yeah I dont think it uses any cpu at all, it works very well in cpu bottleneck scenarios as well, cause gpu is not 100% busy and have enough resorce to generate all the frames. Will be surprised to find that it actually uses cpu in some way, seems like gpu should be much better for this task.
@yellowflash511
@yellowflash511 8 месяцев назад
​@@SebbS82it doesn't use CPU, it uses async compute in GPUs. Pls stop simping for Nvidia.
@SebbS82
@SebbS82 8 месяцев назад
​@@slowlymore2The way is a different. You can read the discription of this. AMD build a software variant that works different.
@Adri9570
@Adri9570 8 месяцев назад
Extrapolating is not that new as a technique. Every wife can extrapolate every single thing you did wrong since Middle Age to generate frames against you during a discussion, even if you are right.
@MrDs7777
@MrDs7777 8 месяцев назад
Lol
@cosmefulanito5933
@cosmefulanito5933 10 дней назад
It makes the most sense to apply it to low-resource hardware. Not high resources. For example, to play new games with old GPUs. This allows me to play, for example, Hogwarts Legacy with a GTX750 with 1GB of VRAM.
@c99kfm
@c99kfm 8 месяцев назад
One thing I *CAN* say: I *NEVER* want, nor will want, upscaling or frame generation included in pure hardware tests (e.g. GPU reviews), because that will make it harder to have an apples-to-apples comparison when selecting my hardware. A separate test, either in a separate section or a secondary review, is not a bad idea - but I want to know how much pure render power I'm buying, not how well it performs at a lower resolution than tested or how many additional frames it can construct without actually rendering.
@4340leo
@4340leo 8 месяцев назад
"FG frames are worse than real high fps, but better than real low fps. " From a user of FG since day one!
@pedrobrazon6610
@pedrobrazon6610 8 месяцев назад
Dude, i cannot thank you enough for explaining FG with the FSR3 works. I love Cyberpunk and using that FSR3 mods was magic, i can run the game cranked up to the max and it plays well :D. I then tried it on The Witcher 3 cranked up to 11 and oh my god was it beautiful, i always played it on dx11 because i could run well the raytracing version but now its butter smooth :O
@Intrinsic16
@Intrinsic16 8 месяцев назад
Frame generation from AMD's fsr3 is a game changer for rtx 3060 gamer like me.
@sebastiantamayo1988
@sebastiantamayo1988 8 месяцев назад
Same I’ve been thinking of upgrading but now I’m not so sure
@bb5307
@bb5307 8 месяцев назад
Personally frame interpolation is still a negative for me. lower input latency is what i want from higher framerates and visual clarity is just a bonus. Intels extrapolation technique sounds more interesting to me. I hope its good and AMD and nvidia will build something similar if it is.
@skillgamer76
@skillgamer76 8 месяцев назад
With reflex mode, u don’t feel the latency in single player games
@ozzyp97
@ozzyp97 8 месяцев назад
​@@skillgamer76Reflex also works without FG (except in a few games where it's disabled for no reason), so you're effectively giving up a free latency improvement by using it. That might well be worth it when using a controller or playing at a high base fps, but one way or the other enabling FG is always costing you some latency.
@sapphyrus
@sapphyrus 8 месяцев назад
The thing is that once the generated frames aren't noticeably garbled, and it's harder to notice in higher frames because those artifacts are generally temporary, the main argument is that it degrades input latency. However games that have DLSS3 have natively forced Reflex onto them which improves that. In previous tests it was proven that input delay with FG & Reflex is either better or as good as no FG and no Reflex. And let's be honest, none of those games would have Reflex if it wasn't due to DLSS3 being in those. So it's a net positive either way. If someone really prefers better input latency, they can skip it. I'll personally take fluid motion instead until I realize I'm not doing as good as I want to do. But yeah, games shouldn't arrive at a point where we need framegen in recommended specs for 30FPS.
@DarthAsdar
@DarthAsdar 8 месяцев назад
Ppl did not hate technologies of frame generation, they hated Nvidia for bad generation performance upgrade + bad pricing and covering this problem with DLSS 3. "Yes, rtx 4070 has got the same performance as 3070ti and costs +$100, but if you turn on DLSS 3, you may see the difference..."
@unclexbox85
@unclexbox85 8 месяцев назад
>Yes, rtx 4070 has got the same performance as 3070ti and costs +$100 No, 4070 has the same perfomance as 3080 and cost like 3070ti, but only by enabling dlss3 it will be higher than 3080, please stop lying
@leovanlierop4580
@leovanlierop4580 8 месяцев назад
Probably AMD users did that.
@MrDs7777
@MrDs7777 8 месяцев назад
Found the uniformed AMD fan.
@KC80SiX
@KC80SiX 8 месяцев назад
DLSS/FSR and FG should be complimentary. A feature to help consumers get some extra longevity from their hw. But now it’s just mandatory to use it even with the latest and most expensive hw. That is the problem. You need them to play even crappy looking games like starfield.
@doc7000
@doc7000 8 месяцев назад
Personally I rather not use an upscaler or a frame generation, though I understand that there are situations in which frame generation and upscaling works better and I guess for those it is ok. Though for me if I can hit my good enough performance at a target fidelity without using either then I won't use them.
@OrjonZ
@OrjonZ 8 месяцев назад
For AAA games I rather use frame generation and keep native res than dlss.
@zaidlacksalastname4905
@zaidlacksalastname4905 8 месяцев назад
​@@OrjonZit really is that good
@darkfire3691
@darkfire3691 8 месяцев назад
IN a lot of newer games DLSS 3.5.1 looks better and more stable than TAA, why is the reason to be still living under a rock?
@doc7000
@doc7000 8 месяцев назад
@@darkfire3691 My preferred AA is SSAA, and I try not to run temporal forms of AA as they can have temporal artifacts. Also for as much as this has been claimed I mean maybe this time it is true however I will hold out until I see it myself. When DLSS 2 came out everyone was looking at still frames and comparing them to native rendering and yes in some cases DLSS looks better then native in still though what they did not tell you is that in some cases it looked worse as well as while in motion it can look really bad. If I am playing a game I don't notice that bush 20 miles away that looks slightly better with DLSS over native however I notice those temporal artifacts. Certainly for Nvidia it benefits them to sell you over priced weak cards that are held up with things like upscaling and frame generation and AMD will happily follow them there as well as Intel. And why not? they can sell you a $300 card for $1,000 and say it produces 10X the performance but you have to use upscaling and frame generation artifacts be damned.
@NamTran-xc2ip
@NamTran-xc2ip 6 месяцев назад
@@darkfire3691these people put everything to ultra and complains that gpus are so overpriced
@RazeAndJadith
@RazeAndJadith 7 месяцев назад
it sucks to always see people showing native 4k but its got TAA. like seriously, blurring anything in motion then comparing it to dlss, which is also a blur fest, is just bad imo. I can't stand all this vaseline on the screen. when did we switch from crisp clean visuals to just blurring the image in every scenario?
@SPG25
@SPG25 8 месяцев назад
If the cards had the hardware to render it natively then we wouldn't "need" frame generation at all. They cut back on the hardware and are trying to push the "feature" THATS my beef with FG
@MrDs7777
@MrDs7777 8 месяцев назад
An RTX 4090 has 76 billion transistors and petaflop processing power. There’s nothing cut back about it.
@redclaw72666
@redclaw72666 3 дня назад
Happy with my 4090 lol 😅
@AGGPEE
@AGGPEE 8 месяцев назад
No, I won't accept fake frames, I don't care what you people say
@lowlady2346
@lowlady2346 8 месяцев назад
I don't want to HAVE TO use FG when i buy new hardware, i want to BE ABLE to use FG with my existing hardware to make them longer lasting.
@BlackJesus8463
@BlackJesus8463 8 месяцев назад
Cheap tvs have been doing it for years but Nvidia makes you buy the latest.
@DC3Refom
@DC3Refom 3 месяца назад
​@@BlackJesus8463 You are right
@electricant55
@electricant55 2 месяца назад
Who's forcing you to? Contrary to popular belief path tracing is not a human right
@redclaw72666
@redclaw72666 2 дня назад
Isn't that the same for every tech related thing lol people don't complain when they buy new TV's or phones or anything in general lmao
@Stardomplay
@Stardomplay 8 месяцев назад
20:00. I love the idea of leveraging software to improve performance where additional optimizations aren't possible but it's hard to see 3x, 4x, and 5x performance amplifiers which have traditionally relied on hardware upgrades. I've already seen a bunch of comments talking about there is no longer any need to upgrade their GPUs soon because of this open source upscale friendly frame generation mod. Seems like an advanced version of this technology would eat into GPU sales.
@Maggpieify
@Maggpieify 8 месяцев назад
I like the tech, I despise the sales pitch and marketing angle. If they had named it "visual smoothening" or "motion blur+" then I wouldnt have a problem with it. It being marketed as "fps gain" or "free performance uplift" should be fking illegal.........
@biIl
@biIl 8 месяцев назад
I'm curious if the artifacts in AW2 were due to motion blur + FG. Looks like its on. perhaps if you disabled motion blur it'd be a better experience for you.
@blendegames7657
@blendegames7657 8 месяцев назад
TIMESTAMPS 0:00 Introduction: Frame Generation getting better reception then at launch 0:51 Why was the reception of Frame Generation so negative when it first launched 2:52 More people been able to use Frame Generation and been happy with the results 5:05 The downsides of Frame Generation 7:42 The benefits of Frame Generation 9:35 Personal experience of Frame Generation 13:07 DLSS Frame Generation better than FSR 3 for frame pacing, FSR 3 is still good 17:29 Frame Generation is not pointless 18:05 The future of Frame Generation 23:20 Conclusion/Final Thoughts P.S.: If you feel that these timestamps misrepresent what is presented in this video, let me know!
@ibangladeshi1161
@ibangladeshi1161 8 месяцев назад
thanks dude
@Xtoffer87
@Xtoffer87 8 месяцев назад
I'm happy with locking single player games to 60fps. No need to worry about fake frames.
@SevendashGaming
@SevendashGaming 8 месяцев назад
I've wanted RU-vid to support higher framerates for awhile and I'm really curious to hear the opinions of other creators like Daniel, who would be using it for practical purposes in these reviews. Thanks for the content as usual!
@danielowentech
@danielowentech 8 месяцев назад
The cost for storage and streaming of the higher refresh rate content would just not be worth it for YT. The kind of content that actually benefits from it is extremely niche.
@BlackJesus8463
@BlackJesus8463 8 месяцев назад
Do you even game?
@sudd3660
@sudd3660 8 месяцев назад
you can show 120fps on youtube for a decade now, you just upload 60fps half speed and the youtube player is set to 2x speed.
@imaginalex5850
@imaginalex5850 8 месяцев назад
RU-vid wants you to pay for 1080P 60fps "true bandwith" , you will never have more then 60 fps for "free" because google is greedy and will charge for 1080P 60 fps "normal" levels of bandwith. Don't even start asking for 120 fps 1440p on youtube, it's never going to happen, they are bleeding money left and right enough as it is.
@Avalan666
@Avalan666 8 месяцев назад
Playing Cyberpunk 2077 with the modded fsr3 frame-gen bumped my fps from ~60 to ~120. Dlss quality, high settings. It feels smooth. Very, very smooth. Even on kbm (though I prefer to play with Dualsense, the new adaptive thingy makes the game so much more immersive), it feels like magic to me.
@Bas-Man1
@Bas-Man1 8 месяцев назад
Fellow PC user with PS5 controller here. Please advise how you got the adaptive triggers to work for games. I've been trying without any success
@Avalan666
@Avalan666 8 месяцев назад
@@Bas-Man1 Dunno what to tell you. Connected it with the usb cable, launched the game (gog version) and it auto recognised it. Adaptive triggers were on by default, but if not you can find it in the controller options. Steam auto-detects as well.
@DC3Refom
@DC3Refom 3 месяца назад
it increased the smoother , the game internally and the engine is still running at 60fps
@Alovon
@Alovon 8 месяцев назад
honestly for me, for those with VRR Displays. I say it's a overall improvement as long as you're above 30fps, play on controller, and the game's input latency is good on controller. For me I have a RTX 3080-12GB, so in Cyberpunk for example I can hit ~40-50FPS at the lowest at 4K Ultra Performance DLSS Mode with Path Tracing and Ray Reconstruction Engaged. So FSR-FG perceptually gives me 80-100FPS 98% of the time, and a bigger boon that I don't think is considered as much for FG, it helps act as Software Low Framerate compensation (but better), as while my TV would experience tearing or stutter if I dropped bellow 40FPS (As it's not a Freesync Premium TV, heck, I'm already modding it to let it VRR down to 40), with FG I wouldn't experience tearing/stutter as long as it works with VRR. Which the Public release of FSR3 does support FSR3, and in the modded version with Cyberpunk, it produces a stellare experience, so good in fact I could swtich from 4K UP (720p IR) to 1662p Performance Mode (831p IR) with NIS to 4K and get overall similar performance and still retaining above 30FPS at the absolute lowest internally. And considering how Cyberpunk's Path Tracer and Ray Reconstruction scale to resolution internally, that ~110p radically improves image quality despite the reconstruction output being lower before the less sophisticated NIS takes over to take it to 4K
@davethefoxmage5797
@davethefoxmage5797 8 месяцев назад
I find the point at the end (looking toward the future) interesting. But the biggest question I have is input lag. Even in solo games, I notice that more than raw visual framerate most of the time (but that's going to vary person-to-person). To use your example of a 1,000Hz monitor, if we could get to the 557th frame of that (a generated one) and it could go "Wait, the player just hit a button!", and it could react to that, I'm 100% on-board. Even if the game didn't display the difference right at that moment. Think of the whole "predictive movement" thing in online games, where it assumes another player continues their same direction, then "goes back" and changes it when your client hears the other player changed direction a quarter-second ago. What I'm picturing is sorta like that, on much shorter timescales. Like even if the next 10-20ms of frames don't *show* my input, the game engine registers them and the effects of my keypress shows up after that. The idea is that at that point, there's no reason to really *lock* input-handling to spitting out game-engine frames. From a programming standpoint, you would have a thread that spits out frames, and a thread that process inputs. These happen asynchronously, and each just goes as fast as it can. 🙂
@FSAPOJake
@FSAPOJake 8 месяцев назад
I'm just sick of people calling frame generation a performance feature. It's not a performance feature; it's a visual smoothness feature.
@mechanicalmonk2020
@mechanicalmonk2020 8 месяцев назад
Language takes time to evolve. "Performance" at the moment if the term the people will understand immediately because 99% of the population thinks that it means the rendered framerate.
@thecatdaddy1981
@thecatdaddy1981 8 месяцев назад
I can already picture the dozens of clickbait articles/videos like "Your old GPU just got a FREE 2X performance boost!!" I despise this era of gaming...
@DC3Refom
@DC3Refom 3 месяца назад
​@@thecatdaddy1981So do I not believe how gullible so called intelligent Nvidia shill know it alls can be , Im not going along with this delusion.Nvidia have lied and twisted the truth in the market , common sense abd critical thinking seems to be lacking in all aspects of society
@takinafan4328
@takinafan4328 Месяц назад
Why are you guys even talking about this diff? For single player games, responsiveness at 30+ fps is already good enough, it just looks awful. That's exactly what frame gen fixes though. What's the issue? Are you trying to protect people with a theoretical 4050 trying to crank up from 10 frames to 20 fake frames?
@FSAPOJake
@FSAPOJake Месяц назад
@@takinafan4328 That is complete and utter nonsense. Responsiveness at higher framerates is absolutely felt and extremely noticeable.
@Disco_Tek
@Disco_Tek 8 месяцев назад
If you play single player games its awesome. Set your settings to where you're in the 40-50fps running something like native or DLSS Quality then turn it on. Great experience. For multiplayer head clickers... not so much.
@Hollowtriangles
@Hollowtriangles 8 месяцев назад
I’m excited for the future of frame interpolation and extrapolation. Once they nail the quality of these frames it’s going to be exciting to see them figure out ways to generate potentially multiple fake frames between real frames and get some real life levels of smoothness. I’ve only used it in Avatar so far and it’s incredible to me
@altima22689
@altima22689 8 месяцев назад
I tried the FSR 3 mod for Cyberpunk in tandem with DLSS on my 3080. The gains are ridiculous, and NVIDIA ought to be ashamed of themselves for being so anti-consumer. If I bought a 40 series card to use frame generation, I'd be pretty miffed right now.
@micha3624
@micha3624 8 месяцев назад
Even on 4000 series this fsr FG mod is 10% faster than dlss fg
@kirbyatethanos
@kirbyatethanos 8 месяцев назад
The only people who hated Frame Generation are people who couldn't experience it. Just like DLSS 1.0 back in 2018
@ZoDDeL
@ZoDDeL 8 месяцев назад
DLSS 1.0 was and still is dogsh1t. so is framegen now. people love to cope to justify their purchases even if the praised feature doesnt do anything helpfull at all. most people for example dont know how to properly setup gsync / freesync. even most youtubers dont. they guide others to make diametral wrong settings.
@kirbyatethanos
@kirbyatethanos 8 месяцев назад
The only "problem" with DLSS 1.0 was that it was only available for 1440p and higher displays. Keep using buzzwords though lol.@@ZoDDeL
@yellowflash511
@yellowflash511 8 месяцев назад
COPIUM
@MrDs7777
@MrDs7777 8 месяцев назад
You said it.
@carlosnumbertwo
@carlosnumbertwo 8 месяцев назад
I’ve tried it on cyberpunk, it runs on my 3060 12gb and I’m just amazed. I initially expected noticeable input lag. I was totally wrong, if anything it feels better.
@nooux1966
@nooux1966 8 месяцев назад
Tried it last night on CP with a 3080, can literally max everything out even stick path tracing on with 50-60 fps, not very smooth but still crazy considering how much you're pushing your specs.
@carlosnumbertwo
@carlosnumbertwo 8 месяцев назад
@@nooux1966 I’m running a 5700X. What’s your cpu? I feel like you should be seeing higher numbers.
@nooux1966
@nooux1966 8 месяцев назад
@@carlosnumbertwo Without path tracing looking at 100-120 with max RT on but path tracing is still a killer and cpu is 5900X
@Erik_Dz
@Erik_Dz 8 месяцев назад
The point you are making at 20:15 seems really REALLY bad for the future. Why would you rather dedicate PC hardware towards interpolating and extrapolating frames rather than just rendering the game. At that point you are spending more resources to increase the frame rate number than you are to render the game engine and *play* the game. This is reductive at best. Upscaling and Frame generation are duct tape for poorly optimized games or weaker hardware. Its OK to tape over a few holes, but it becomes a problem when all you are doing is covering everything you see with tape.
@JohnnyBg2905
@JohnnyBg2905 8 месяцев назад
Minimum base frame rate for FG to make sense is 45 - making 90-100 frames and upwards. 60+ base framerate is optimal and all of this for single player games.
@subcon959
@subcon959 8 месяцев назад
One of the problems is people usually see reviews/metrics on RU-vid before they experience the technologies themselves. It's a lot harder to notice some of these issues if you haven't already been told by someone that they are there and that you should be bothered by them.
@__-fi6xg
@__-fi6xg 8 месяцев назад
i love xess in cyberpunk but i have a 30-40 fps difference in 2.5k in vr, thats the difference of motionsickness to smooth gameplay in vr. Plus xess only looks good when you stand still, its super blurry at 45 fps.
@NyuNighteyes
@NyuNighteyes 8 месяцев назад
After trying FG myself i felt scammed. Basically you need 60 fps baseline or else it will feel really floaty and input lag will be worse than native framerate. (imagine it was 30 it will have way more input lag than if you just rendered 30) So it's only useful if you want to run a game that you already can run at 60 fps at 120+ fps... which is just so weird? because the only games that i play at higher than 60 framerates are twitchy multiplayer shooters and similar competitive stuff, which don't need FG because they are games that already run at 200 fps by default!! I play all my singleplayer/story/graphics heavy games at 60 fps and thats plenty for an amazing experience, interpolating frames to boost that to 120+ in my eyes is nvidia creating FOMO for a small gain ( motion becomes more fluid if you have high Hz monitor), so def not worth buying 4000 series gpu. I also think path tracing whole scenes and GI for games that don't have day nigh cycles is ridiculous and borderline dishonest from devs to seemingly make people think they need to buy these expensive gpus. I am a software engineer and really into game engines. All you need for most games (static scenes and static light sources) is raytraced reflections for curved surfaces, planar surfaces can be handled with planar reflections and its faster if you use correct LODs, everything else should be baked! Take Alan Wake 2 as an example, there is no reason for that game to have whole scene ray tracing. All it needed was RT reflections and maybe RT shadows as a future proof toggle. The enviroments are hand crafted static scenes!!
@glub1381
@glub1381 8 месяцев назад
I'm using it on path traced cyberpunk with base fps ~40 and it feels and looks great. I don't notice any extra input lag. I know it's there, but for single player games it really doesn't matter. Frame gen is perfect for single player graphics heavy games like cyberpunk or aw2, especially for 3080 users like me who are on the verge of getting nice framerates on max settings. And idk about other games, but path tracing in cyberpunk looks amazing, and playing with it on at ~80-90 fps with fsr 3 is amazing.
@nothingineternityterms
@nothingineternityterms 8 месяцев назад
For a slow paced game like Alan Wake 2 FG bringing 40fps up to 80 fps actually works fairly well for it. Also a game that is hovering around 45-50 fps and making my freesync toggle on and off getting boosted to 90-100 fps is usually worth it IMO, even with some extra latency/ghosting.
@Wobbothe3rd
@Wobbothe3rd 8 месяцев назад
Plenty of people enjoy FG at 30hz base framerate, stop lying. The input lag is no worse than 30hz native - are you saying 30hz is unacceptable? Because plenty of people played games at 30hz for DECADES, including most high end PC tech.
@BaKa60gaming
@BaKa60gaming 8 месяцев назад
generated frame serve no purpose if it doesnt give a better feel, if going from 60 to 120 with generated frame but it still feel 60 why do we need to know it say 120?
@TheIndulgers
@TheIndulgers 8 месяцев назад
6:35 I would consider them “fake” frames as they do not respond to input. Even better I think the tech should be called “image insertion” or something similar.
@xanzxx
@xanzxx 8 месяцев назад
theres already a term for it called frame interpolation. Frame Generation is just the name Nvidia gave it and Fluid Motion Frames is what Amd calls it
@PingTPunk-rq9us
@PingTPunk-rq9us 8 месяцев назад
I agree with this. They exist, of course but they don't effect anything in the game.
@DonaldHendleyBklynvexRecords
@DonaldHendleyBklynvexRecords 8 месяцев назад
there is scenarios where it does "feel" like actual double frames though, cant just blanket feel the whole tech is garbage
@BlackJesus8463
@BlackJesus8463 8 месяцев назад
@@DonaldHendleyBklynvexRecords It is garbage its slower than native and we dont need it unless GPUs are being artifgicially limited so Nvidia execs can get a bigger bonus.
@DonaldHendleyBklynvexRecords
@DonaldHendleyBklynvexRecords 8 месяцев назад
@@BlackJesus8463 I can agree and in alot of instances I do, especially with Nvidia, but it's something they set the president for and these other companies follow so it's here to stay, thankfully it's not at a point that we must use it but that's only if you have top of the line card and then Nvidia wins anyway
@bullit199
@bullit199 8 месяцев назад
Another reason FG can be a game changer is PCVR. If you can’t hit 90 fps in VR it’s a miserable experience and can cause motion sickness. Have you tested FG and PCVR yet?
@DarianPaul
@DarianPaul 8 месяцев назад
PCVR has had "frame generation" since almost the beginning of modern PCVR. Oculus/Meta calls their implementation Asynchronous Spacewarp. I believe they introduced this in 2016.
@Wobbothe3rd
@Wobbothe3rd 8 месяцев назад
​​@@DarianPaul2012. But to be fair frame extrapolation is different than interpolation, vr async time/spacewarp doesn't affect input latency or game physics at all, it just updates the image to exactly where your head is.
@hollywoodmeow
@hollywoodmeow 8 месяцев назад
i find interpolation visibly jarring, and the interpolated frames arent responsive to peripheral input. i can see and feel the difference...it degrades my experience. its an unpleasant technology meant to impress casual gamers with 'higher number better' style logic, and i would rather have more raster horsepower and optimized driver packages than have AMD or nvidia focusing on these silly surface-level marketing gimmicks obviously only designed to increase mindshare for cards that are already selling themselves anyway (at ludicrous MSRPs might i add) in a vacuum its a cool technology, but this isnt an engineering proof-of-concept sort of discussion, so im setting aside that aspect of my opinion as its irrelevant in the face of the lack of functional improvement in my gaming experience (the entire supposed point of the "feature"). i hate motion blur - the frames are blurry and artifacted. i hate input lag - the frames dont respond to peripheral input. what do i gain from using it? nothing.
@ThisSteveGuy
@ThisSteveGuy 8 месяцев назад
It's certainly more of a net positive for people watching the games than the people playing them (depending on the title, of course). Maybe NVIDIA or AMD could come up with a way for streamers to use FG just for OBS while they play a non-FG version on their monitor.
@MrDs7777
@MrDs7777 8 месяцев назад
Nvidia Broadcast.
@HunterTracks
@HunterTracks 8 месяцев назад
I maintain the same stance on it I've had since FSR 3 was announced: DLSS 3 FG is not bad as a tech, it's bad because it has allowed Nvidia to massively upcharge customers on a bad card gen. Even if I cared very deeply about lag and image quality compromises that the tech may bring, it's not as though anyone is forcing me to use it. Everyone having an option to turn on FG and see if it makes sense for them is a net benefit; the way Nvidia implemented it was just scummy. You can argue about it setting a dangerous precedent, but honestly it's not targeted at a performance range that would allow developers to compensate for poor performance with it.
@Wobbothe3rd
@Wobbothe3rd 8 месяцев назад
Rtx40 is an EXCELLENT generation of cards, check the sales instead of the whining on social media.
@HunterTracks
@HunterTracks 8 месяцев назад
​@@Wobbothe3rd I wonder if you get paid to shill for Nvidia, or if it's some kinda addiction for you. Either way, we've been over this: a product selling well or being popular doesn't make it good.
8 месяцев назад
You never feel the bad thing about FG if you are watching videos of it. Not because it is hard to capture, but because it is impossible. I tried FG to smooth out the drops below 60Hz in two games, Starfield and Cyberpunk. So even my frames were over 50 stable, the input lag became very noticeable and both are singleplayer games. I understand why AMD only recommends it with high frame rates. If you want to smooth out a 30FPS game to 60Hz, this is not the solution unfortunately (haha, cyberpunk with pathtracing is not playable despite all the claims of nvidia). And this you won't ever see on recorded footage.
@sudd3660
@sudd3660 8 месяцев назад
cyberpunk is not playable on any pc on any settings, i tried on my 12700k and 4090 latest version of the game. but it did finally look good tho, i hated how it looked up til now. it is the inpulat this games has that ruins it.
@tunaware
@tunaware 8 месяцев назад
I think a key term to use here instead of "picture smoothness" is "motion clarity" (borrowing from high refresh rate monitor discussion). All else being equal, it will be easier to notice or track fast moving elements (enemies or items when moving the camera quickly) with well implemented frame gen. I completely agree with the marketing point, I was initially very negative about it when they marketed it as just double performance
@russianbeginner643
@russianbeginner643 8 месяцев назад
I have a gtx 1650 laptop is this good news or not ?
@danielc04
@danielc04 8 месяцев назад
I've been playing cyberpunk with path tracing and fsr frame gen mod on a RTX 3060 achieving around 75fps. It doesn't feel too bad for me and I really prefer motion fluidity sacrificing some latency in a single player game like this.
@myclips1892
@myclips1892 8 месяцев назад
how? i have a 4060 and with dlss quality and fg i barely get 50-55
@mofstar8683
@mofstar8683 8 месяцев назад
@@myclips1892he’s definitely not using dlss quality😭
@CS-pl8fc
@CS-pl8fc 8 месяцев назад
Bro, turn path tracing off lol
@danielc04
@danielc04 8 месяцев назад
I use dlss performance, I play with a 14 inch laptop so it doesn't look bad, looks the same as Quality dlss to me on that screen size
@CS-pl8fc
@CS-pl8fc 8 месяцев назад
@@danielc04 it will look better and feel better if you turn path tracing off and use quality. Trying to use path tracing on a 3060 is just foolish.
@aventuraalert8254
@aventuraalert8254 8 месяцев назад
CONGRATS on 169K ---- We have multiple Frames generated already -- VR does 22.5 and 30 to native 90 so 3 and 2 INTER frames generated - Motion Reprojection in OpenXR --
@Wobbothe3rd
@Wobbothe3rd 8 месяцев назад
True, but frame extrapolation through timewsrp or spacewarp doesn't have any effect on input latency at all.
@ryogaming4771
@ryogaming4771 8 месяцев назад
Most big studio are going the 'RayTraced GI, Shadow, Reflection' era. Avatar is a nice example. It runs on none RT hardware but it's super slow, but it runs. Snowdrop V2 at ubisoft is going to be in a LOT of games going forward. Lots of UE5 dev will use LUMENS. It's a win win for both of us. It's faster for them to make the video game and it look better to us!
@longjohn526
@longjohn526 8 месяцев назад
Actual it does use RT cores *IF* they are available and uses software RT if no RT cores are present ..... There is about a 20% boost in performance using RT cores
@starvader6604
@starvader6604 8 месяцев назад
@@longjohn526 how? there's no difference in performance between nvidia and amd cards
@manishraj1590
@manishraj1590 8 месяцев назад
​@@starvader6604amd cards do have rt cores. Avatar is more optimised for amd cards as it a amd sponsored tittle. Hence the same performance on nvidia and amd. Avatar uses rt cores lightly compared to other games
@arenzricodexd4409
@arenzricodexd4409 8 месяцев назад
​@@starvader6604isn't that Avatar typically faster on nvidia gpu than AMD?
@Wobbothe3rd
@Wobbothe3rd 8 месяцев назад
​@starvader6604 of course there is.
@kamikaze00007
@kamikaze00007 8 месяцев назад
FG = Depends on per game dev implementation and Nvidia driver implementation on software. GPU with just plain better raw performance = Works on every game. Question: Which would you willingly spend more money on?
@ocha-time
@ocha-time 8 месяцев назад
As long as it's being used in marketing (and being parroted by other people) as a "frame boost" or "performance boost" and is being used to say "Look at how much better THIS card is compared to LAST gen!" I will always be reluctant to actually care. The more I hear about frame interpolation the less I actually care to do it. Someone actually said "Digital Foundry put up a video of the interpolated frames and only those and the output was absolutely perfect, as good or better than the original" and I cringe at their hard work being subverted like this. You WILL eat the interpolated frames as performance boosts and you WILL like it, even though it sucks at a lot of stuff. The swing in attitude is 100% marketing shills.
@johnnyringo35
@johnnyringo35 8 месяцев назад
You're wrong. It's a way to gain more with less. We are running out of die shrinks, and silicon space in general. Anybody know what node comes after 3nm at TSMC? We got 2 to .5nm at best. Eventually quantum tunneling (if I remember the right term) becomes an issue.
@thecatdaddy1981
@thecatdaddy1981 8 месяцев назад
When people say stuff like "our old GPUs just got faster!" or "I dont notice any input lag" - you know we are screwed, because developers will use FG as a crutch just like they did with upscaling.
@Godmode_ON24
@Godmode_ON24 8 месяцев назад
​​@@johnnyringo35what comes next is chipset design instead of monolithic design. Also the 4090 seems to be doable so we clearly are leaving a ton of performance on the table even with the current node process. Maybe if mid range GPUs didn't cost more than current gen consoles it wouldn't be that big a problem.
@ocha-time
@ocha-time 8 месяцев назад
@@thecatdaddy1981 THAT is what's upsetting to me. We already have devs skipping optimization and saying "Just turn on DLSS3.5". We're sprinting backwards and paying a premium for it. I'm not okay with it, and I'm sick of excuses. I don't mind it as an option, until it's being used as a crutch for lack of innovation or optimization. Nobody wins in that scenario. Other than the GPU manufacturers laughing on their way to the bank and publishers getting to get an ROI three months earlier than they would have normally.
@RegularNobody
@RegularNobody 8 months ago
Precisely
@eugkra33
@eugkra33 8 months ago
The reactions were negative until AMD launched theirs. Then suddenly it's the second coming of Christ! AMD has a huge online following that isn't reflected in real sales figures.
@MrDs7777
@MrDs7777 8 months ago
Poor people identify with AMD. That's why they're happy to have GPU features three generations out of date.
@Sicara9
@Sicara9 8 months ago
Frame generation is pretty cool, but as a person who gets motion sick in first-person games I am still on the fence about this technology, as my motion sickness often comes from input latency rather than from smooth movement. Because frame generation increases input latency, it is hard for me to weigh the tradeoff: play a first-person game at a lower framerate for longer, or at a higher framerate but hit motion sickness sooner.
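A rough way to put numbers on that latency tradeoff is sketched below. This is only a toy model: the half-to-one-frame rule of thumb and the 3 ms generation cost are assumptions for illustration, not measurements of DLSS 3, FSR 3 or AFMF, and things like Reflex or Anti-Lag will shift the real numbers.

# Toy model of the extra input delay interpolation-style frame generation can add:
# the newest real frame is held back so a generated in-between frame can be shown
# first. Rule of thumb assumed here: roughly half to one base frame time of extra
# delay, plus the cost of generating the frame. Illustrative numbers only.

def extra_latency_ms(base_fps: float, gen_cost_ms: float = 3.0) -> tuple[float, float]:
    frame_time = 1000.0 / base_fps          # ms per real (rendered) frame
    return 0.5 * frame_time + gen_cost_ms, 1.0 * frame_time + gen_cost_ms

for fps in (30, 40, 60, 90):
    low, high = extra_latency_ms(fps)
    print(f"{fps:3d} fps base: roughly {low:.0f}-{high:.0f} ms of added delay")

The takeaway matches the comment above: the lower the base framerate, the larger the added delay, which is why frame generation feels much better when the game already runs fast.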
@npcimknot958
@npcimknot958 8 months ago
A fellow cursed with motion sickness... sigh 😢😢😢😢
@FrostyBud777
@FrostyBud777 8 months ago
Motion sickness is histamine overload. I highly recommend the book Dirty Genes by Dr. Lynch. You can cure motion sickness in a few weeks with methylated vitamins. I actually have low histamine and am trying to increase it with niacinamide and non-methylated vitamins. I actually want a little motion sickness, because it means the game is being presented to my brain as a real-life 3D image, giving full immersion. I like that I can play games at native 4K with frame gen and stay very sharp, compared to the old approach of turning on upscaling for extra fps and losing that clarity.
@vmafarah9473
@vmafarah9473 8 months ago
How old are you? At what age did you start playing games? I used to play games on an iGPU, and I went from slideshow-level FPS to an RTX 4070. I don't get motion sickness when I control the mouse, but I do when someone else controls it, and rarely during explosion scenes.
@Wobbothe3rd
@Wobbothe3rd 8 months ago
Frame generation doesn't increase input latency above the base framerate's latency for DLSS 3. If you aren't getting motion sickness at 60 Hz, it's extremely unlikely you will get it at 120 Hz with frame generation (at least for DLSS 3).
@samarthkapil6579
@samarthkapil6579 8 months ago
For FSR 3 you just need a cap; it doesn't need to be at your monitor's max refresh rate. For AMD users, just cap your framerate from the AMD overlay (the cap should be low enough that you're locked at it). That gives perfect frame pacing.
@exception6651
@exception6651 8 months ago
It works similarly with Nvidia: when Reflex is enabled along with V-Sync plus G-Sync, it caps the fps for you. I don't know why more people don't realize that these FG technologies work best when capped under your refresh rate lol
@xerxeslv
@xerxeslv 8 months ago
@@exception6651 Yep, it just makes sense, but I guess a lot of people just want more frames and don't care. I've noticed the FSR mod feels worst when the GPU is struggling at low fps under heavy load (say, Cyberpunk with path tracing), but that's exactly the scenario where more frames are needed most. That's the problem. Under normal load it can add about 80% more frames, and when capped, with the GPU having some free headroom, it doubles the framerate and feels great.
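For what it's worth, here is a minimal sketch of the capping idea this thread describes: a frame limiter simply sleeps away whatever is left of a fixed frame-time budget, which keeps delivery even and leaves the GPU headroom that frame generation benefits from. The 116 fps target and the simulated 4 ms of GPU work are assumptions purely for illustration, not anything the AMD or Nvidia drivers actually expose this way.

import time

TARGET_FPS = 116                      # e.g. a cap just under a 120 Hz panel
FRAME_TIME = 1.0 / TARGET_FPS

def render_frame() -> None:
    time.sleep(0.004)                 # stand-in for ~4 ms of real GPU work

for _ in range(10):
    start = time.perf_counter()
    render_frame()
    leftover = FRAME_TIME - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)          # wait out the rest of the frame budget
    print(f"frame time: {(time.perf_counter() - start) * 1000:.1f} ms")

Capped like this, every frame lands on a near-identical schedule, which is the "perfect frame pacing" effect described above.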
@WarrenLeggatt
@WarrenLeggatt 8 months ago
I haven't had a chance to test frame generation yet; I only have a 30 series. But I get the feeling it will be like DLSS, as you say: it's far better not to have to use it, but in the right situation the gains are worth the downsides.
@SamLoki
@SamLoki 8 months ago
I'm not going to lie, I've grown very fond of DLSS. Sure, the fps boost is nice, but DLSS Quality actually does an excellent job of anti-aliasing compared to older, more demanding methods. I use DLSS Quality whenever it's available, even if I don't need the fps boost.
@pf333
@pf333 8 months ago
Got around to using it on my 3070 laptop in Cyberpunk thanks to the mod by Nukem. It works pretty well and lets me get around 55 fps on ultra with psycho ray tracing. Quite incredible.
@WarrenLeggatt
@WarrenLeggatt 8 months ago
@@SamLoki I guess it depends on the resolution you're running at. I'm on a 3060, and on my current Cyberpunk playthrough with a lot turned up, the 3060 does cut it, but with RT in play there are noticeable stutters. So I'm using DLSS and running at 1080p, and it's a compromise. On the whole it's worth it for the visual quality from RT, but I do notice some DLSS artifacts and "shimmer". It looks good enough overall that I live with it. I hadn't thought about DLSS as AA, but I can see that working.
@SamLoki
@SamLoki 8 months ago
@@WarrenLeggatt Ah, you're right. Upscaling to 4K vs 1080p is totally different.
@rolux4853
@rolux4853 8 months ago
We need to address another core problem: modern developers not being able to properly optimize games anymore! They just model things together in their premade sandbox engine and that's it. Gone are the days of smart programmers building features in a way even a potato PC can run, while still looking stunning on high-end gear. It's very sad that these days the kids are all "designers" and no longer real "programmers". That's because they grew up with those sandbox tools where they drag and drop assets and boom, there's a finished game. Back when I was at it in the 90s you had to think up almost everything in an editor, and that's what gave you the deep understanding and led to much better concepts.
@EdToml
@EdToml 8 months ago
I play on Linux with a Ryzen 7700 and an RX 6600 XT on a 1440p 165 Hz screen with FreeSync. I haven't seen much frame generation yet, but in CP2077, upscaling in auto mode gives me a solid 60+ fps. I'm looking forward to FSR 3 (frame gen) in CP2077 and other games. It's a tool: if it gives me a better experience I'll use it, if not I won't. So far FSR 2.x has usually helped, and I suspect FSR 3.x will be similar (I have used FSR 3 in Forspoken; it was OK, but I consider it a 'release candidate' rather than the product FSR 3 will end up being). Bottom line: FSR is improving my gameplay and letting me stretch my GPU while waiting for better $/frame...
@zaidlacksalastname4905
@zaidlacksalastname4905 8 months ago
FSR 3 allowed me to play Cyberpunk with path tracing on a 3080, and took me from an inconsistent 40 fps to 70ish. I play with a controller, so the 40 fps (with Reflex) latency is good enough for me; the only problem was the visual smoothness, which FG solves. Nukem9's mod also works for other games, so you should check it out. I think FG is great, but it should NEVER be marketed as "performance".
@MrBeetsGaming
@MrBeetsGaming 8 months ago
I think unless it gets to the point where you can use it at a base framerate of 30 and get 60, it honestly isn't that useful for most people.
@TheEnhas
@TheEnhas 8 months ago
I've seen it used at less than 60 base fps to push it to 60+, which looks much smoother than the original framerate but still doesn't look exactly like native 60+ fps. If anything, it looks like something in between the original and interpolated fps numbers, but it's still better to have it on than off.
@formulaic78
@formulaic78 8 months ago
I've been playing Cyberpunk with path tracing and ray reconstruction at around 45-60 fps on a 3080, which has been just acceptable for me with a controller. This will put it in the 60-80 range, which will make it more than acceptable as long as the artifacting isn't too noticeable.
@Wobbothe3rd
@Wobbothe3rd 8 months ago
DLSS3 FG can do this perfectly well in games like Cyberpunk and Flight Simulator. FSR3 has more trouble.
@touchthis371
@touchthis371 8 months ago
I bought a 4070 yesterday and tested FG in Cyberpunk and Jedi Survivor. FG is fake. I won't play games with FG on, because it feels like you're watching a video without any control.
@fvallo
@fvallo 8 months ago
It's useful for high-end cards only! You need a 90 fps base for FG to feel decent... so yeah, it's useless for 95% of people.
@rulewski33
@rulewski33 8 months ago
50-60 is fine but below that yeah no
@tiromandal6399
@tiromandal6399 8 months ago
I just set up FSR frame gen for Alan Wake 2 on my 2060 Super an hour ago, and it works great! Almost 60 fps with medium-high settings, medium RT with ray reconstruction and DLSS Quality. Without frame gen it's around 35. Oh, and no increased latency whatsoever.
@Liquidfard
@Liquidfard 8 months ago
Does it actually feel smooth while playing, or does it feel the way it would at the base framerate? Because I tried this mod today on NFS Unbound, and even though the fps counter said 80 fps, it didn't feel smooth at all. In fact, it felt like I was playing at the base framerate, which was 40 fps.
@SebbS82
@SebbS82 8 months ago
I tested FMF 2 months ago in Immortals of Aveum with the 3080 12GB. Before, I had 52 fps at 4K ultra; after, 75-85 fps. It works fine in this game.
@tiromandal6399
@tiromandal6399 8 months ago
@@SebbS82 What's "FMF"?
@SebbS82
@SebbS82 8 months ago
@@tiromandal6399 Fluid Motion Frames / what Nvidia calls Frame Generation
@theoathman8188
@theoathman8188 8 months ago
I'm playing Cyberpunk 2077 with path tracing and FSR FG. I'm just blown away by how smooth the game feels. I went from 45 fps to 90 fps. Latency and artifacts are noticeable but very tolerable. I've genuinely lost all interest in 4000 series GPUs now. AMD just added 10 years to my 3070.
@starvader6604
@starvader6604 8 months ago
Not so sure about that, bud... future games will use more than 8 GB of VRAM... good luck running those, I guess.
@IceBreakBottle
@IceBreakBottle 8 months ago
10 years to a 3070 hahahahaha
@gunturbayu6779
@gunturbayu6779 8 months ago
Cry in 8 gig of vram
@jjlw2378
@jjlw2378 8 months ago
I think this is the biggest misconception with FG. It is actually at its worst on lower/mid-range hardware, and it doesn't extend the life of older GPUs at all.
@ironiclee9751
@ironiclee9751 8 months ago
The internet can't have a rational discussion about this stuff; this is borderline console-warrior territory. Alex from Digital Foundry brought up the same frame pacing issues you did about FSR 3, but people think he's an Nvidia shill or that he set up his PC wrong or whatever, despite the fact that he also points out DLSS issues. So I don't know what people want. Maybe no one should point out the flaws in frame gen tech, and then neither company has to bother improving it, since it's already perfect.
@steaksoldier
@steaksoldier 8 months ago
I've only used AFMF so far on my 6900 XT and I absolutely love it. Can't wait for it to be built into the regular driver instead of having to use beta drivers.
@mackdaddypeypey1
@mackdaddypeypey1 8 months ago
You like AFMF? I tested it on my 7900 XTX and it was TRASH. It kept turning itself off the second you pan the camera.
@steaksoldier
@steaksoldier 8 months ago
@@mackdaddypeypey1 Mine only turned off if I moved my mouse A LOT, really fast. It handled single fast movements just fine. As long as I was calm and not panic-moving my mouse, it was surprisingly smooth.
@mackdaddypeypey1
@mackdaddypeypey1 8 months ago
@@steaksoldier I use a controller and I tested in Cyberpunk. Not sure if it's that game or what, but it was a dreadful experience.
@mattblyther5426
@mattblyther5426 8 months ago
You guys seem to be talking about the issue that was in the 1st preview driver. They're on like the 3rd preview now (maybe the 4th), and on my 6800 XT I haven't had the panning issue since the 1st driver.
@steaksoldier
@steaksoldier 8 months ago
@@mattblyther5426 Pretty sure they're on the 4th one now. I had the 3rd one installed and loved it, but Windows shit the bed and I had to reinstall. I haven't reinstalled the beta drivers, hoping they'll be merged into the normal drivers soon.
@alun1038
@alun1038 8 months ago
Now that more people can use it, it's actually a pretty nice feature to have, especially in extremely CPU-heavy games like Jedi Survivor: where the fps drops like crazy in some areas, with FG it stays well above the desired 60-80 fps.
@astreakaito5625
@astreakaito5625 8 months ago
I think it sucks. You can't turn 30 fps into a convincing 60 fps anyway, so it's pointless for the one use case that would have been nice, since many excellent games, mainly older titles, are stuck at 30 fps. Sure, you can turn 60 fps into 100 fps, and with some luck you won't see obvious artifacting everywhere and double images on HUD elements. Big whoop. This whole thing is just yet another ploy to fake higher performance that doesn't actually exist, but the price increases are absolutely going to be real. So screw both Nvidia and AMD for this BS.
@rulewski33
@rulewski33 8 months ago
That's not the use case it was intended for. I know there are games that are locked to some fps, but that's usually because the game engine can't do more than that, simple as that. There are mods for that, if that's in your interest. It's also always a matter of implementation; I've only tested FG in Avatar so far and the results are great imo, but then again Avatar is pretty well optimised and the implementation is surprisingly well done too.
@pliat
@pliat 8 months ago
Yeah, duh, generated 60 fps isn't nice, but generated 120 fps (DLSS, not FSR) is better than non-generated 70 fps. Either way, you'd best get used to it; these 'fake' rendering techniques are the future.
@Dcully
@Dcully 8 months ago
Bought Cyberpunk a few weeks back to complement my new rig, and FG with the 4070 adds about 40 extra fps for me at max settings with ray tracing etc. Fantastic tech for single-player games imo.
@HakenMods
@HakenMods 8 months ago
I have always loved all the AI tech. DLSS is so good! And now frame gen as well ❤
@jemborg
@jemborg 8 months ago
There's AI video upscaling AND... there's also AI denoising for ray tracing... brilliant. I think Intel is going to be pretty competitive in this in the future, as AMD seems loath to employ AI.
@keir92
@keir92 8 months ago
The main thing I can see wrong with it is games releasing that need frame generation just to reach 60 frames per second.
@martytube821
@martytube821 8 months ago
I don't like it; it seems like a way for these corps to charge more for less silicon and less true performance.
@RegularNobody
@RegularNobody 8 months ago
That's because it's exactly that
@DevouringKing
@DevouringKing 8 months ago
Think of it the other way around: I get 200 fps for 100 watts instead of 200 fps for 180 watts. It's nice for silent gaming rigs.
@TTx04xCOBRA
@TTx04xCOBRA 8 months ago
@@RegularNobody L take, very ignorant.
@Tigerhearty
@Tigerhearty 8 months ago
Nvidia shills like Owen are crying that everyone has access to it. He was okay shilling it for Nvidia when it was only on the Nvidia side; now he suddenly grows some kind of conscience about it, of course.
@LeegallyBliindLOL
@LeegallyBliindLOL 8 months ago
@@Tigerhearty Bro, touch some grass, this isn't a war 😂
@SNUSNUU
@SNUSNUU 8 months ago
Great video. Using this with ASA on a private server without BattlEye enabled doubles my average frames from 30-50 to 61-75 on my 3080 Ti. Appreciate you and the modder.
@UncannySense
@UncannySense 8 months ago
DLSS allowed Nvidia to rebadge a 4050 into a 4060 and charge $400 for the privilege... while making last-gen cards appear obsolete, which is great for forcing a single generation of hardware into obsolescence and coercing an upgrade, much like gimped VRAM capacity... FSR will be like FreeSync and eventually become the standard thanks to being open source... but if you want the "premium" experience, Nvidia will be happy to sell it to you for an open wallet... I prefer not to use fake frames... and I will continue to call them fake frames...
@BlackJesus8463
@BlackJesus8463 8 months ago
Nvidia wants to do it again! In any other world the 4080 12GB would've been a 4060 Ti, but profits must always go up for the shareholders. lolz
@svendtang5432
@svendtang5432 8 months ago
It was on my mind that this will become an excuse not to optimize games to run without it... say, going with 25 fps and launching the title anyway, because you can just use FG.
@sudd3660
@sudd3660 8 months ago
That is what is happening: CPU bottlenecks that last a decade or more, and GPU power won't fix the way they make game engines... In well-made games that are, let's say, 5 years old, you can get hundreds of fps now; non-scalable engines are bad for gamers.
@nexornator
@nexornator 8 months ago
I think FG techniques are awesome. What I won't ever agree with is the greed, and that is what Nvidia did by claiming, and still claiming, that DLSS 3 is only possible on 40 series cards, so that the only way to enjoy this tech is to buy their overly expensive cards, while making 30 series owners think they have obsolete cards, which is far from reality. Nvidia didn't think its approach through. Take the 4060 and 4060 Ti: an RTX 3070 is far superior to both of those cards. So Nvidia is telling us that a far superior card like the RTX 3070, shown to be easily above the 4060 and 4060 Ti in dozens of videos, is not capable of running DLSS 3? What? Because of "new generation tensor cores" or whatever? Yeah, if someone still wants to believe that, keep falling for it, lol. The FSR 3 mod literally proves that "DLSS 3 only on 40 series cards" is complete BS!

Also, maybe this is off topic, but I always wanted to say this about Nvidia's greedy ways: do you know why Nvidia raised prices so much with the 40 series cards? The reason is that crypto miners have no use for their GPUs now. What I'm trying to say is that Nvidia was always happy to sell practically all of its cards to those non-gamers, so the only way it could keep income high was to skyrocket its prices. There isn't even a huge jump between the 30 series and the 40 series, yet Nvidia catapulted its prices without any justification. Very sad that Nvidia became a company of pure greed instead of the consumer-friendly company it seemed to be in the past.
@Ebilcake
@Ebilcake 8 months ago
Will have to judge this on a game-by-game basis, maybe even setting by setting, tbh. I found a few individual settings that cause visual issues with AMD FG, but it's mostly fine with them turned off.
@javierarroyo6600
@javierarroyo6600 8 months ago
What happened is that AMD fanboys aren't crying anymore because now they have it. Before, frame gen was "cheating", hahaha, remember those arguments? Those guys were the loud ones against the technology 😃
@mechanicalmonk2020
@mechanicalmonk2020 8 months ago
I think you need 3.5 more emojis for the point to really hit home
@andersjjensen
@andersjjensen 8 months ago
Except there are about 15-20x more GTX 1000 and RTX 2000/3000 cards benefiting from this than AMD cards. And the biggest outcry came from people with 3080s and above. They had just spent BIG BUCKS (most of them during the crypto plague) on a GPU, only to be given the middle finger... so no. Just no. You're trying to retrofit history into your existing narrative.
@javierarroyo6600
@javierarroyo6600 8 months ago
@@mechanicalmonk2020 😃😃😃 You know I'm right, my guy.
@GuyManley
@GuyManley 8 months ago
So DLSS was supposed to allow for more compatibility in the daunting era of ray tracing and 4K gaming. It ended up being the catch-all used in place of optimizing games so they don't waste their performance potential. That said, it is still a good technology that is very useful to devs and gamers alike. FG felt like a flex from Nvidia, and it didn't sell GPUs like they thought it would. Bigger number better; it doesn't really matter if it objectively worsens the experience of the game. It would be nice if FG worked at lower framerates to make 20 fps look like 30/40 fps for low-end PCs and handhelds. It would be amazing if FG could help hide shader and traversal stutter even a little bit. Instead, it drops 60 fps to a 40 fps base and doubles that to 80 via inserted frames, which still feels like 40 fps. It just sounds like a step backwards. I have yet to try it in earnest; there hasn't been a single Nvidia DLSS 3 or FSR 3 game that I own or would buy day one yet. Maybe I'll see what the mod compatibility is like, but that would probably not be a good experience compared to native, so I'm hesitant to base any opinion off of it. Regardless, it seems this is the way the industry is going.
@syeo501
@syeo501 1 month ago
I just tried it and it's crazy how well it works in Spider-Man Remastered on the Steam Deck. Previously I played on almost minimum settings and got like 40 fps. Now I can play on almost the highest settings at like 60 fps, barely dropping to around 55 fps.
@clem9808
@clem9808 8 months ago
Do you think the input latency issue will somehow be fixed, or at least reduced, in the future?
@Kahunaseb
@Kahunaseb 8 months ago
god only knows
@MrDs7777
@MrDs7777 8 months ago
AMD, never. Nvidia Reflex is already very, very good.
@siyzerix
@siyzerix 8 months ago
When people saw that below the RTX 4070 there's not much of a performance upgrade, and in fact specs were being downgraded, they asked why bother for a single feature with relatively limited use. GPUs at $400 and below are the most popular ones bought; not everyone can spend $600+ on a GPU, and it's much worse outside the US. The FSR 3 FG mod now lets people properly ignore RTX 4000 and RX 7000 and wait for RTX 5000 or RTX 6000 to really upgrade. Now they don't have to pay just for this one feature. That's why people are more than content with FSR 3 FG's artifacts, which can actually be cleaned up a bit with DLSS upscaling. The reason I think this is a scummier move from Nvidia than restricting DLSS to RTX GPUs is that the sole reason for the lower RTX 4000 stack to exist was FG. That wasn't the case with Turing, which still brought respectable performance improvements across the board; its price was the issue. You also got the 16 series and a Super refresh a year later, which is still not the case with RTX 4000.
@Absolyim
@Absolyim 7 months ago
Dumb question... I have an RX 6800... how do I enable FSR 3 Frame Generation?