
I’m Embarrassed I Didn’t Think of This.. - Asynchronous Reprojection 

Linus Tech Tips
16M subscribers
1.6M views

Get up to 25% Off Pulseway's IT Management Software at lmg.gg/PulsewayLTT
Save up to *40% off and get Free Worldwide Shipping until Dec. 22nd at www.ridge.com/LINUS
What if you didn’t need the best frame rate to reduce input latency? What if your display’s refresh rate was enough all on its own? With Async reprojection, anything is possible… Even turning 30 FPS into 240.
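The core trick described here is simple enough to sketch: keep the most recently rendered frame, and on every display refresh resample it under the camera's newest orientation. Below is a toy, yaw-only pinhole model of that resampling step (illustrative only, not the demo's actual code):

```python
import math

def reproject_x(x_ndc: float, old_yaw: float, new_yaw: float, hfov: float) -> float:
    """Given a pixel's horizontal NDC position (-1..1) on the *new* camera,
    return the NDC position to sample in the *old* rendered frame.
    Toy yaw-only pinhole model -- not any engine's actual math."""
    half = math.tan(hfov / 2)
    # world-space angle of the ray through this pixel under the new yaw
    ray = new_yaw + math.atan(x_ndc * half)
    # project that ray back onto the old camera's image plane
    return math.tan(ray - old_yaw) / half

# camera turned 5 degrees right since the last real frame: the screen centre
# now samples content that sat slightly right of centre in the old frame
src = reproject_x(0.0, old_yaw=0.0, new_yaw=math.radians(5), hfov=math.radians(90))
```

Running this per pixel at the display's refresh rate is what lets a 30 FPS render feel like a 240 Hz camera.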
Discuss on the forum: linustechtips.com/topic/14713...
Comrade Stinger's original video + download links: • Async Reprojection out...
2kliksphilip's video about async reprojection: • The future of upscaling?
Purchases made through some store links may provide some compensation to Linus Media Group.
► GET MERCH: lttstore.com
► SUPPORT US ON FLOATPLANE: www.floatplane.com/ltt
► AFFILIATES, SPONSORS & REFERRALS: lmg.gg/sponsors
► PODCAST GEAR: lmg.gg/podcastgear
FOLLOW US
---------------------------------------------------
Twitter: / linustech
Facebook: / linustech
Instagram: / linustech
TikTok: / linustech
Twitch: / linustech
MUSIC CREDIT
---------------------------------------------------
Intro: Laszlo - Supernova
Video Link: • [Electro] - Laszlo - S...
iTunes Download Link: itunes.apple.com/us/album/sup...
Artist Link: / laszlomusic
Outro: Approaching Nirvana - Sugar High
Video Link: • Sugar High - Approachi...
Listen on Spotify: spoti.fi/UxWkUw
Artist Link: / approachingnirvana
Intro animation by MBarek Abdelwassaa / mbarek_abdel
Monitor And Keyboard by vadimmihalkevich / CC BY 4.0 geni.us/PgGWp
Mechanical RGB Keyboard by BigBrotherECE / CC BY 4.0 geni.us/mj6pHk4
Mouse Gamer free Model By Oscar Creativo / CC BY 4.0 geni.us/Ps3XfE
CHAPTERS
---------------------------------------------------
0:00 Intro
1:36 Play along at home!
1:57 I'm sorry, what is this? Demo time!
3:40 That looks bad, can it be improved?
5:12 Why haven't we been using this??
6:38 Blind frame rate tests
9:44 If frame rate is so important, why can't they tell?
11:52 Showing how the sausage is made
12:31 This could be a game-changer!

Science

Published: 28 May 2024

Comments: 3K
@LinusTechTips · 1 year ago
Thanks to Ridge for sponsoring today's video! Save up to *40% off and get Free Worldwide Shipping until Dec. 22nd at www.ridge.com/LINUS
@theboogerbomb · 1 year ago
thank you ridge
@nukeshine · 1 year ago
ridge sucks!!!!!!!!
@gibbyhale4217 · 1 year ago
Why does it look like inside of a vagania
@YTxGalaxy · 1 year ago
6:43 "Top 5-10%"... me, who's top 0.1% and still hasn't become a pro player...
@0opsyt · 1 year ago
if you render a little bit off screen you might not need stretching
@tillson8686 · 1 year ago
I asked in a VR subreddit about a year ago why nobody is making Async for computer games and people gave me shit about it like "wouldn't work that way, the idea is stupid, just not possible, etc." so I gave up. Glad I asked the right people
@InfernosReaper · 1 year ago
There are a lot of people who like to make the seemingly safe bet of saying "it won't work" without actually knowing, because they aren't the experts they want to pretend they are. If a person speaks in absolutes without even trying to explain why, chances are they are not truly an expert. They might know some things and even genuinely think themselves experts, but in reality they have much more to learn.
@kristmadsen · 1 year ago
People that reply to things on the internet tend to respond that way to new ideas.
@kazioo2 · 1 year ago
Maybe you got this answer because it was discussed and tested like a hundred times since John Carmack invented it in 2012. There are serious issues with this that are much less problematic in VR.
@ffwast · 1 year ago
The right idea is not asking redditors anything.
@DJayFreeDoo · 1 year ago
@@kristmadsen It was the same when the first computer mouse was invented. The higher-ups said it was useless: why would anyone need this? And bam! Now everyone has a mouse or a trackpad.
@rodrigoteles1409 · 1 year ago
Plouffe's "He owns a display" gag is always going to crack me up.
@Lu-db1uf · 1 year ago
Don't all of them own displays? It's a tech media company, I'd hope they do.
@Fay7666 · 1 year ago
There was a video a couple of weeks ago about his _display._
@bigdoggo5827 · 1 year ago
@@Lu-db1uf You know.. that's the joke
@reeepingk · 1 year ago
@@Lu-db1uf But his display is.... *special*
@thebyzocker · 1 year ago
@@Lu-db1uf He bought the Alienware mini-LED one, he's proud that he was one of the first to get it, and now it's a meme.
@Blap7 · 1 year ago
2kliksphilip and LTT is a crossover I never knew I needed. Make it happen.
@stupot46 · 1 year ago
Bump lol
@Mraz565 · 1 year ago
Wonder if it could be used with CS:GO, whether Valve allows it or you brute-force it.
@morfgo · 1 year ago
They won't. They just use him and his ideas without even one full second of credit.
@mipacem · 1 year ago
@@morfgo it's not malicious
@isaacolukanni · 1 year ago
@@morfgo Dude, they literally credited him and his video in the description!
@thepillowmancer · 1 year ago
Not mentioned in the video: you can render frames at a slightly higher FOV and resolution than the screen, so that there's some information "behind" the monitor edges. It won't save you from turning 180 degrees, but it will fix most of the pop-in for a very slight hit to performance.
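A back-of-envelope way to size that extra FOV is to cover the worst-case rotation between two real frames. The function and numbers below are an illustrative sizing rule, not taken from any shipped engine:

```python
def overscan_fov(base_fov_deg: float, turn_speed_dps: float, render_fps: float) -> float:
    """Widen the render FOV by the worst-case camera rotation that can happen
    between two real frames, so reprojection has real pixels at the edges
    instead of stretching. Toy sizing rule, not from any shipped engine."""
    worst_turn = turn_speed_dps / render_fps   # degrees turned per rendered frame
    return base_fov_deg + 2 * worst_turn       # pad both edges

# 90° view, a 180°/s flick, real frames at 30 fps -> render ~102° wide
fov = overscan_fov(90, 180, 30)
```

Anything faster than the assumed turn speed still falls off the padded edge, which is why this helps with pop-in but not with a 180-degree snap.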
@lyrilljackson · 1 year ago
this is not what pc gaming is supposed to be about. using vr handmedown techs. and vr and pc scene shouldn't be segregated and minding their own scene either if yall sudden mindlessly hurrah at this crossover weirdofiesta
@martinkrauser4029 · 1 year ago
@@lyrilljackson What do you mean?
@lyrilljackson · 1 year ago
@@martinkrauser4029 asmh
@JustSomeDinosaurPerson · 1 year ago
@@lyrilljackson Brainlet take
@latinodollar · 1 year ago
@@JustSomeDinosaurPerson Careful... thats a [Lvl. 163] PC Master-Supremacist, the bane of mobile, console, and vr gamers.....
@tubehellcat · 1 year ago
"He owns a display" - that's gotta haunt him forever, like the "you're fired" for Colton 😂 Love it 😁
@fsendventd · 1 year ago
which video is "he owns a display" from again? having a hard time finding it
@Jeffrey_Wong · 1 year ago
@@fsendventd He's been doing a lot of monitor unboxing videos on ShortCircuit, I think it's from one of the Alienware monitor videos
@GigaSeadramon · 1 year ago
@@fsendventd it's from the 8k gaming video
@AnuraagDaniel · 1 year ago
@@Jeffrey_Wong yeah, it's also a direct quote, he says "I own a display" in the dlss 3.0 video
@TwinShards · 1 year ago
Yeah, I got a really good laugh out of those 4 words under his name.
@mauromerconchini · 1 year ago
I'm so happy Phil put a spotlight on this concept, and I'm even happier that a channel like LTT is carrying that torch forwards.
@SirDragonClaw · 1 year ago
I tried to build something like that demo a few years ago, but I was trying to use motion vectors + depth to reproject my rendered frame, which I never got to work correctly. In my engine I rendered a viewport larger than the screen to handle the blackness at the edges, and then was going to use tier 2 variable rate shading to lower the render cost of the parts beyond the screen bounds. But VRS was not supported in any way in my build of MonoGame, which my engine was built upon, so that was another killer for the project. I am so glad that Phil popularised the idea, and it's awesome that someone else managed to get something like this working. How he did it in one day I will never know; I spent like 3 weeks on it and still failed to get it working correctly. I should find my old demo and see if I can get it compiling again.
@heeerrresjonny · 1 year ago
You might be able to hide a lot of the edge warping by basically implementing overscan where the game renders at a resolution that's like 5-10% higher than the display resolution, but crops the view to the display resolution. It should in theory be only a very minor frame rate hit since you're just adding a relatively thin border of extra resolution.
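The pixel cost of such a border is easy to estimate, and it grows faster than the border percentage suggests because the padding applies on both axes. Illustrative arithmetic only:

```python
def overscan_cost(width: int, height: int, border_frac: float) -> float:
    """Fractional increase in rendered pixels when adding border_frac extra
    on every edge (e.g. 0.05 = a 5% border). Illustrative arithmetic only."""
    full = width * (1 + 2 * border_frac) * height * (1 + 2 * border_frac)
    return full / (width * height) - 1

extra = overscan_cost(2560, 1440, 0.05)   # a 5% border costs ~21% more pixels
```

This is the tradeoff the reply below points at: a "thin" border is quadratically more expensive than it looks, which is why low-resolution or coarse-shaded borders get proposed.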
@carlo6953 · 1 year ago
The size of border you would need to eliminate the edge warping would probably impact performance more than just using a higher refresh rate to lower the amount of warping in the first place.
@ABonYT · 1 year ago
The magic combo there would be foveated rendering alongside the async reprojection with overscan. Which games it makes sense for will inevitably be a case-by-case thing, but the performance gains would be massive.
@batuhancokmar7330 · 1 year ago
@@carlo6953 That assumes you'd need the same resolution for the overscan. If the game is rendered at a 45° FOV at 1440p, render an overscanned area between 45° and 90° at 360p. You don't need a lot of detail, just something to make valid guesstimates within that motion blur until the proper frame fills the screen.
@WiggyWamWam · 1 year ago
Yes, definitely. Surprised they don’t do this.
@ccd03c · 1 year ago
I’m glad I wasn’t the only one thinking this
@boysenbeary · 1 year ago
As someone who plays VR constantly, it’s nice to see this brought up for non-VR stuff
@moonsicklegaming7990 · 1 year ago
2kliksphilip is an unsung hero; his DLSS coverage is also some of his best content.
@MrChanw11 · 1 year ago
His upscaling content is the best ;)
@Aeroxima · 1 year ago
Never seen either of those but I agree
@Diie89 · 1 year ago
Personally super excited to see 2kliksphilip's video referred to in an LTT video. A lot of Philip's content is really high quality, especially the videos where he covers DLSS and upscaling, as mentioned earlier. Can't recommend checking it out enough!
@elise3455 · 1 year ago
2kliksphilip had a good idea, but 3kliksphilip is more advanced in every way!
@HonoredMule · 1 year ago
@@elise3455 3klicksphilip is just more work. Both will be _automatically_ obsolete when 0clicksphilip releases.
@Nabalazs · 1 year ago
I am so happy that Philip managed to get the message THIS far out. I do fear that this tech might have issues with particles and moving objects and the like, but when you mentioned that we could use DLSS to ONLY FILL IN THE GAPS, my jaw dropped. That's so genius! I really hope that this is one of those missed-opportunity oversights in gaming, and there isn't some major issue behind it not being adopted yet.
@antikz3731 · 1 year ago
Exactly. On Linux this exact setup has been available for the last year. It makes a massive difference
@hubertnnn · 1 year ago
You don't need to worry about particles, just render them later. The whole idea behind this solution is to split rendering into two phases:
1. Render the scene (the expensive 3D phase).
2. Render the final frame from pictures of the scene (cheap 2D rendering).
Just move all particle and HUD rendering to phase 2. To be honest, I would suggest going even further and adding a phase 1.1 where you use DLSS to draw the less important background stuff; this way you can render the important objects in 4K and background objects (buildings, grass, trees, etc.) in 720p or lower and just upscale with DLSS. Or go even further and render each layer at a different framerate: background at 30fps, objects at 60fps, and the final image at 120fps.
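The cadence this split implies can be sketched as a schedule: the expensive scene render lands on a subset of display refreshes, and every other refresh only reprojects the last result. A toy scheduler, assuming the display rate is an integer multiple of the scene rate:

```python
def frame_schedule(display_hz: int, scene_hz: int, n_frames: int) -> list:
    """For each display refresh, report whether a fresh scene render lands or
    the previous one is only reprojected. Sketch of the two-phase split in the
    comment above; real engines run the phases asynchronously on the GPU."""
    step = display_hz // scene_hz  # refreshes per real scene render
    return ["render+reproject" if i % step == 0 else "reproject"
            for i in range(n_frames)]

sched = frame_schedule(120, 30, 8)
# one real render, then three reproject-only refreshes, repeating
```

The key property is that the "reproject" entries cost almost nothing, which is where the 30-into-120 framerate multiplication comes from.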
@GeneFJacket · 1 year ago
This was my first thought, too. Combining Async with DLSS/FSR could potentially be the actual magic bullet we're looking for.
@MrDavibu · 1 year ago
@@hubertnnn I mean, "cheap" is relative: if all animated and moving objects and particles have to be rendered later, it won't be that cheap, especially if it means transparent objects have to be rendered after that. Also, screen-space reflections of animated objects will disappear if they are not part of an animated object themselves. Not saying it's not interesting, but it's definitely not a solution without compromises.
@tralphstreet · 1 year ago
@@antikz3731 ??? Explain. I use Linux and there's no such thing.
@palaashatri · 1 year ago
Take this as a compliment: I love how LTT has now transformed into more of a computer-science/electronics-for-beginners channel than just another "Hey, we got a NEW GPU [REVIEW]" channel.
@PrograError · 1 year ago
Well... They had covered everything on that aisle...
@sirspamalot4014 · 1 year ago
It's why I keep watching them, I got tired of watching reviews of hardware I can't afford/don't really need yet. Though my VR rig is getting very tired.
@mrogalski · 1 year ago
Just think about it: you can see the difference in a YouTube video! Granted, it's 60FPS, but it's still compressed video streamed from YouTube. I can only imagine how much more of a difference you'd see running it live yourself. This makes it even more amazing!
1 year ago
3kliks and LTT collabing is what the INTERNET NEEDS!!!!
@Qimchiy · 1 year ago
2kliks but either way Yes.
@RennyChuggs · 1 year ago
no it isnt
@cora2887 · 1 year ago
@@Qimchiy its the same guy
@pair_of_fins · 1 year ago
@@cora2887 Erm, if you paid attention you would know they were brothers 🙄
1 year ago
@@Qimchiy might as well get kliksphillip in here too ;)
@hkoizumi3134 · 1 year ago
This explains the weirdest thing I've felt in VR. The game itself lagged for some random reason, but my head tracking and the responsiveness of the controls weren't affected. I remember thinking that if the head tracking had lagged along with everything else, I would have had severe motion sickness.
@HappySlappyFace · 1 year ago
Yeah, it was honestly amazing the first time my Quest 2 froze. I was like "oh no, please no motion sickness", but it felt completely normal.
@rpavlik1 · 1 year ago
Yep, it's pretty standard and required for HMD-based VR (at least some sort of reprojection or timewarp). There are a lot of different variations.
@TheTrainMaster15 · 1 year ago
Philip is revolutionising the way we think about gaming and game dev just with common sense
@Neurotik51 · 1 year ago
what? nothing here is new
@TheTrainMaster15 · 1 year ago
@@Neurotik51 using technology for VR with conventional monitors? I haven’t heard of that before
@KeyT3ch · 1 year ago
This technology on handhelds will ABSOLUTELY be a game changer. Not only does it "look" better, it will also be even harder to spot artifacts on a much smaller screen.
@aTron0018 · 1 year ago
We need this integrated into Steam Deck OS!
@KingBowserLP · 1 year ago
One thing that Philip's video covers that this one does not, and which I'm personally really excited about: combining this with a low-shading-rate border around the viewport (the fully rendered frame). Since peripheral vision is tuned more for movement than detail, this is fine quality-wise, and it means the screen doesn't have to guess what's at the edges: the information is already there, just in lower quality than the main viewport would have. That would, if not eliminate, significantly reduce the stretching artifacts.
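Rough shading-cost math for that idea, assuming the border is shaded at one sample per 2x2 pixel block (a toy model of coarse shading, not a real VRS API):

```python
def coarse_border_cost(width: int, height: int, border_frac: float,
                       coarse: int = 2) -> float:
    """Shading cost, relative to the base viewport, of adding an overscan
    border shaded at a coarse rate (coarse x coarse pixels per shaded sample).
    Toy model of the low-shading-rate border idea above."""
    base = width * height
    full = width * (1 + 2 * border_frac) * height * (1 + 2 * border_frac)
    border_pixels = full - base
    return (base + border_pixels / (coarse * coarse)) / base

cost = coarse_border_cost(2560, 1440, 0.05)
# ~1.05x shading cost for a 5% border, versus ~1.21x at full rate
```

That quarter-rate border turns the quadratic overscan penalty into a much smaller one, which is the whole appeal of combining the two techniques.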
@666Tomato666 · 1 year ago
Like doing actual foveated rendering, where the "sharp part" is the whole normal viewport and the low-resolution part is just around it, like an extra 5-10% or so.
@ffsireallydontcare · 1 year ago
I haven't seen Philip's video but I'm guessing you'd need eye tracking as well. It'd be pointless to render the fringes of the "screen" at a lower quality if you can point your eyeball directly at it...
@gabrielenitti3243 · 1 year ago
@@ffsireallydontcare what if the lower quality rendered parts are actually outside your screen? You would trade a bit of framerate for more accurate projection predictions which would recoup the lost performance and give you a better experience
@rileyn2983 · 1 year ago
@justathought No, because it will be fixed in 1/30th of a second. It's obviously not perfect, but that's what this technique is about: compromises. Lower-resolution fringes would be way better than stretching.
@ffsireallydontcare · 1 year ago
@@gabrielenitti3243 Ahh ok, yeh that makes more sense.
@KalkuehlGaming · 1 year ago
I know Philip will see this, and I know he will feel awesome. You have come a long way, Philip. I am proud to have been part of your community since your first tutorial videos.
@charliegroves · 1 year ago
Here's to Philip, love his videos on all 3 of his channels
@TuRmIx96 · 1 year ago
Love him. His tutorials laid the base for my environment-artist gamedev job.
@fargoththemoonsugarmaniac · 1 year ago
kliki boy i love you
@jo_kil9753 · 1 year ago
@@charliegroves more like 14 lol
@stephenmurray5276 · 1 year ago
Y’all have done a stupid good job recently researching and explaining difficult concepts. Between this video and the recent windows sleep/battery video, my (already high) respect for LMG’s tech knowledge has gone through the roof! And y’all didn’t even discover this hack! Thanks for sharing (and explaining)
@MrPaxio · 1 year ago
older videos were more technical; now they suck up to the chump who doesn't know how to navigate a settings menu
@dezzydayy4608 · 1 year ago
I was actually thinking about writing an injector to apply this to existing games a few years ago, when I saw the effect on the HoloLens. A few limitations, though: camera movement with a static scene can look near perfect, but if an animated object moves, depth reprojection cannot fix it properly; you would need motion vectors to guess where objects will go, and that will cause artifacts near object edges.
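The reason depth reprojection handles camera motion but not object motion is visible in the math: a pixel's shift depends only on its depth and the camera's movement, so anything that moved on its own is left where it was. A toy pinhole sketch of the camera-translation (parallax) case:

```python
import math

def reproject_with_depth(x_ndc: float, depth: float, cam_dx: float,
                         hfov: float) -> float:
    """Where a pixel lands after the camera strafes cam_dx units sideways,
    given the pixel's depth. Near geometry slides further across the screen
    than far geometry, which is why this needs a per-pixel depth buffer.
    Toy pinhole model, illustrative only."""
    half = math.tan(hfov / 2)
    world_x = x_ndc * half * depth        # view-space X of the surface point
    return (world_x - cam_dx) / (half * depth)

near = reproject_with_depth(0.0, depth=1.0,  cam_dx=0.1, hfov=math.radians(90))
far  = reproject_with_depth(0.0, depth=10.0, cam_dx=0.1, hfov=math.radians(90))
# the near point shifts ten times further than the far point
```

Note there is no term here for the object's own velocity; that is exactly the gap motion vectors are meant to fill, with the edge artifacts the comment describes.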
@ambassador.to.Christ · 1 year ago
Couldn't you just zoom in a little bit so you would never see the artifacts at the edges of the screen, and then use a higher resolution or AI to compensate for the crop?
@lawnmower16 · 1 year ago
@@ambassador.to.Christ I think the problem they're talking about is that this works great when objects are holding still, because the algorithm knows where the object should be in the next frame. For a moving object, it has to adjust not only for the object's altered perspective but also its altered position, and since many moving objects in games are random or player-controlled, there's no way to know for sure where the object will be on the next frame. So the information the player is getting is not necessarily the most up-to-date, accurate information, which could mean the result is actually worse than a low frame rate, because slow information that's always correct is better than fast information that's sometimes wrong.
@yensteel · 1 year ago
Yeah, the parallax effect is a big issue to address.
@blitzkriegiv1169 · 1 year ago
This technique could be used for just the background or environment, while the additional frames for the subjects of a scene are rendered through the GPU. This way you can get the best of both techniques.
@xogmaster · 1 year ago
Not if you can somehow use light ray data from sectors in an environment to determine depth (or lack thereof)
@monkeywithocd · 1 year ago
This is great, it really explains some odd behavior I've noticed while playing VR games, and using it for flat games sounds like an awesome idea, especially for consoles.
@TechnologistAtWork · 1 year ago
Handhelds too. This would make games on the Switch or Steam Deck run near-perfectly without having to tap into too much hardware power. Why are we not funding this? GPUs are the size of a gaming console nowadays, yet nobody bothered to solve these issues with much simpler and cheaper solutions.
@nktslp3650 · 1 year ago
Yes! Sometimes when the game is stuttering you can still move freely, but you can see the black screen. Such cool tech. It works really well; input latency is really important.
@nebelwaffel8174 · 1 year ago
@@nktslp3650 Yeah, when he showed the black borders I had a strong feeling of "I have seen this before", but I couldn't put a finger on it until he mentioned VR.
@mirage809 · 1 year ago
As per usual: John Carmack is the king of optimizing rendering in games. He first implemented this tech for the Oculus Rift and has a long history of coming up with awesome solutions for problems like this. This is the man that made Doom, he knows his stuff. He's probably laughing right now and having a big "I told you so" moment.
@Felipemelazzi · 1 year ago
John Carmack is responsible for asynchronous reprojection!?!?! This living god never stops amazing the world of technology!
@imdurc · 1 year ago
@@Felipemelazzi I thought JC was the one who had seen it somewhere and wanted to bring it to Oculus, but, I don't think he was responsible for its actual creation. Anyone know?
@HamguyBacon · 1 year ago
He basically mimicked how your eye works in real life. I thought of this too, but I assumed it was already implemented.
@CJMAXiK · 1 year ago
Meta improved on this tech, now it is called Asynchronous Spacewarp and bundled with Oculus Quest 2. And let me tell you, it is really cool.
@Bonez32186 · 1 year ago
Realistically, if you render some percentage outside the FOV, you would have enough scene overshoot for it not to be a problem unless you have extremely low frame rates and incredibly fast movements.
@lucasthompson6405 · 1 year ago
This feels like the slightly hacky optimisations you would see in older games, and I personally find that really cool. I always admired hearing about the clever ways game devs overcame the limitations of hardware, whereas these days it feels like we rely on an abundance of processing power. That abundance is generally a good thing, but it feels like these sorts of optimisations are becoming a lost art.
@jovieasyrof2017 · 1 year ago
cough cough Gotham Knights
@randxalthor · 1 year ago
This is probably my favorite type of video from LTT. Highlighting and explaining interesting technology is fascinating.
@deolamitico · 1 year ago
oh wait, time for another balls to the wall computer build! only the third this week. /s But for real, they've been doing a great job with not doing what I just said
@thebaum64 · 1 year ago
it's up there for sure
@alexmathewelt7923 · 1 year ago
As a hobby game-engine/GFX developer, I implemented this technique with some tweaks: static geometry is rendered only every few frames, but characters, grass and particles get rendered every frame. With the depth sorting and extended viewport it feels like native rendering, and you can really aim precisely at a target, since that layer is always up to date. As mentioned in the video, DLSS uses motion vectors but has to guess the motion of static geometry. With a proper implementation this guess is not required; it can be calculated on the same hardware as the AI.
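The split described above amounts to giving each layer its own refresh cadence. A sketch using hypothetical rates (static background at 30 Hz, characters/particles at 60 Hz, and the cheap reproject/composite pass on every refresh of a 120 Hz display):

```python
def layers_to_render(frame_index: int) -> list:
    """Which layers get fresh work on a given 120 Hz display refresh.
    The cadence (30/60/120 Hz) is a hypothetical example, not the
    commenter's actual engine configuration."""
    layers = ["composite"]            # reprojection + composite: every refresh
    if frame_index % 2 == 0:
        layers.append("characters")   # dynamic layer: 60 Hz
    if frame_index % 4 == 0:
        layers.append("background")   # static geometry: 30 Hz
    return layers
```

For example, `layers_to_render(0)` includes all three layers, while odd refreshes only run the composite pass, which is why the aim-critical layer stays up to date cheaply.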
@j1000a · 1 year ago
Does this end up looking like motion blur?
@Rhedox1 · 1 year ago
What happens if rendering your static geometry takes 20ms on the GPU? How do you schedule the reprojection to ensure it's executed in time? Also, which graphics API did you use to implement this?
@Winston-1984 · 1 year ago
What I'm wondering is: does the GPU in any way know what it wouldn't need to render, i.e. sections of the screen that can persist using this tech, with additional frames rendered only for the sections that need updates? Does this make sense? It's hard to put into words.
@CathelijneLadyRider · 1 year ago
I want it, and i want it now
@alexmathewelt7923 · 1 year ago
@@Winston-1984 That's what I'm currently working on, since this is now a common technique for ray tracers. Currently I'm trying to derive the formulas I need and prove them for small movements. With this fixed splitting it works for a first-person shooter or something like that with a lot of static geometry. Static geometry is really fast to render nowadays.
@sound123mine · 1 year ago
An LTT video at 60fps?! My god, the little animations they put in, like the outro card, look so good 👍
@MrPaxio · 1 year ago
yeah, doesn't the dummy know 60fps is better than 8K uploads
@SaucedTech · 1 year ago
I love how much Labs has instantly matured this channel. I have watched LTT for a long time, but recently it's really boosted its level.
@TikoyTV · 1 year ago
But… they are not even done?!
@HonoredMule · 1 year ago
It's been far from instantaneous, but we are starting to see the returns and it is definitely nice.
@malgaras6204 · 1 year ago
Seems like interesting tech. Two immediate thoughts:
1. What about moving objects? Seems like the illusion falls apart there, as this only really simulates fake frames of camera tilt, not any changes to things already in your FOV.
2. What if you just slightly over-rendered the FOV? Then you'd actually have some buffer when tilting the camera, where you have an actual rendered image to display before you need to start stretching things at the edges of the screen. Obviously, since you're rendering more geometry, you're going to take a further FPS hit, but is there a point where the tradeoff is a net gain?
@user9267 · 1 year ago
2. You can do that I think. 1. They will move at the actual framerate. All asynchronous reprojection does is make the game feel more responsive.
@Sunisway · 1 year ago
I mean, VR uses it and it runs pretty well.
@Pro720HyperMaster720 · 1 year ago
In 2kliksphilip's video he mentioned how interesting it would be, instead of stretching the borders or showing the void left by not-yet-rendered areas, to render a bit beyond the display area (almost like overscan) but in low resolution, so it impacts performance as little as possible; since our peripheral vision is not great, we'd barely notice during fast movement that a small area at the edges is momentarily lower resolution. So yes, we'd have plenty of ways to improve the illusion. For example, you could boost the on-display area with DLSS or FSR, and maybe even the extra area (though depending on your main resolution, upscaling a very low-resolution border may be a bad idea). And if the extra area's resolution is not suitable for DLSS or FSR, you could still upscale the on-display area while using only frame generation (DLSS 3.0, and the future FSR 3.0) on the extra area, filling the gaps with mostly generated frames that anticipate what your movement will reveal.
@chanceslaughter3237 · 1 year ago
Moving objects still have poor framerates; that's how it is in VR as well. Your hands feel much more jittery than the rest of the game when your fps drops... in my experience, anyway.
@reptarien · 1 year ago
1. Yes, moving objects are still noticeably 30fps, but coming from someone who has spent time with VR reprojection in a game like Skyrim, with lots of moving actors: you don't notice it nearly as much when your actions are still instant, as shown in the demo. It's crazy how much you find yourself forgiving if your head and hand movement is still smooth as butter. 2. That is another technique VR absolutely uses, and it works very well to solve that issue. Easily implementable and workable.
@twinklesprinkle1318 · 1 year ago
I was amazed watching Philip's video when it came out. I'm happy that it has reached you now! Hopefully game developers will get the message; I'd be really happy to see this implemented in actual games, because at the moment, unless you have the most recent hardware, you have to choose between high resolution and a very high framerate...
@SendFoodz · 1 year ago
I wanna see the video in question, 3klinkphilips is the channel right? what's the video? Im guessing around to find this guy/video, what's the title so I can show him some love?
@Wanklacus · 1 year ago
@@SendFoodz it's linked in this video's description my man (if you haven't found it yet)
@zach99999 · 1 year ago
This is really cool! I play shooter games a lot and the most annoying thing about low fps in games is the input lag. Slow visual information is more of an annoyance as long as it's above 30, but the slow input response times at anything below 60 fps drives me insane.
@kingsizemedal · 1 year ago
2KP is such an amazing channel; he always has very interesting, out-of-the-box ideas, and I love to see more of his wacky stuff being picked up!
@cometor1 · 1 year ago
Yay, 2kliksphilip and his brother 3kliksphilip finally get some well deserved attention!
@kazioo2 · 1 year ago
The inventor already suggested using it for normal games in 2012. Then many people made experiments and demos over the last decade. This one finally got some traction, so kudos for that, but it's nothing new.
@cometor1 · 1 year ago
@@kazioo2 Truth be told, I'm a long-time klik empire supporter, and I'm always happy when anything good happens to him, like being mentioned by another creator I like. The technology is interesting and needs traction to take off, but I actually care more about Philip than the tech.
@semick4729 · 1 year ago
Brother?
@MrALjo0oker · 1 year ago
@@semick4729 yeah he has two brothers kliksphilip and 3kliksphilip
@saladgreens912 · 1 year ago
@@MrALjo0oker Not sure if that is necessarily true, someone should get to the bottom of that. Valve, please fix.
@BlameDavid · 1 year ago
I'm so happy to see Philip reach this far out of the CS:GO bubble with this.
@OriginalityDaniel · 1 year ago
'valve please fix'
@ToasterTom · 1 year ago
I stumbled upon 2kliksphilip’s channels when I was researching how to make maps in Hammer. So glad you guys have mentioned him in multiple videos now!
@sirbughunter · 1 year ago
This is so so incredible! I hope this will be the next-gen image helper in all upcoming and older games!
@Neoxon619 · 1 year ago
This actually reminds me of the input delay reduction setting that Capcom added for Street Fighter 6. The game itself still runs at 60fps, but the refresh rate is 120Hz for the sake of decreasing input latency.
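A back-of-envelope model of why scanning out at 120 Hz helps even when the simulation stays at 60 Hz: average latency is roughly half a simulation tick (waiting for the next tick to sample your input) plus one display refresh to reach the panel. The model and numbers below are illustrative, not Capcom's published figures:

```python
def avg_input_latency_ms(game_hz: float, display_hz: float) -> float:
    """Rough average input-to-photon latency: half the simulation interval
    (average wait for the next tick) plus one refresh interval for scanout.
    Back-of-envelope model; real pipelines have more stages than this."""
    return 0.5 * 1000 / game_hz + 1000 / display_hz

at_60hz  = avg_input_latency_ms(60, 60)    # ~25 ms
at_120hz = avg_input_latency_ms(60, 120)   # ~16.7 ms
```

Halving the scanout interval shaves roughly a third off the total in this model, with the game logic untouched, which matches the spirit of the setting described above.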
@sm7085 · 1 year ago
Good point. That's one of the added benefits of a high refresh rate monitor. Even though you might not reach a high fps, having a high refresh rate monitor can still benefit you from a reduced input latency.
@MudHut67 · 1 year ago
that's not how any of this works...
@Kaisogen · 1 year ago
This tech won't really be useful for fighting games specifically; I think it would be counterproductive, tbh.
@MrNicePotato · 1 year ago
What does that even mean... The async shown, as I understand it, essentially shifts your point of view before the GPU produces a new frame. But for a fighting game, it would have to make the new frame no matter what in order to show your input turning into a move.
@j1000a · 1 year ago
The *effect* (not reality, which is a bit different) also reminds me a little bit of QuakeWorld (and to a lesser extent, Quake and Doom). Even when the framerate is high the models use low-FPS animations, and with QuakeWorld I seem to recall objects in motion skipping frames based on your network settings. Meanwhile the movement was still buttery.
@afroninjaen
@afroninjaen Год назад
I always had a feeling that tech like this is actually the real future of gaming / VR performance. And not just raw rtx 4090 performance.
@Thezuule1
@Thezuule1 Год назад
This tech has been a part of VR for years and it's awful. They need to take a new approach and have developers actually implement it at the game level rather than it being an after effect because as it stands now it doesn't work worth a shit. Awful.
@-NoodleBoy
@-NoodleBoy Год назад
@@Thezuule1 I use it in RE8 vr so I can run rtx while in vr and it doesn't feel great but it feels better than native.
@possamei
@possamei Год назад
@@Thezuule1 On quest, they've built support for it in-engine, it's called SSW. It's actually better than ASW on PC because it has motion data for the image, so the interpolation is quite good. Sure, real frames are still better, but the tech is getting better
@Thezuule1
@Thezuule1 Год назад
@@possamei you've got that a little twisted up but yeah. SSW is the Virtual Desktop version, AppSW is the native Quest version. It works better but still not well enough to have picked up support from any real number of devs. Step in the right direction though.
@DJayFreeDoo
@DJayFreeDoo Год назад
@@Thezuule1 But what if DLSS and FSR only had to correct the flaws of this instead of making whole frames? DLSS and FSR might get you even more performance.
@jamiekerr5514
@jamiekerr5514 Год назад
Wow, not sure who wrote this one but such a good explanation. So clear and well presented, good job!
@Czllvm
@Czllvm Год назад
THIS IS INSANE! I already use this on Assetto Corsa in VR, so I play at 120Hz while it renders at 60fps. Such a light bulb moment at the start. I really wish this would catch on, because I've already seen first-hand how great it is.
@txsurvivalandcreations
@txsurvivalandcreations Год назад
Really cool stuff. When I’m in VR and the frames drop during loading or something, it does exactly what you showed in the 10fps demo. You can see the abyss behind the projected image on the edges, with the location of the image updating to return right in front of you with each new frame. I had no idea that that’s what it was for.
@DiamondDepthYT
@DiamondDepthYT Год назад
ive been using vr for 3 years now- and I had no idea what it was until today either! Super cool to learn more about that stuff
@MrScorpianwarrior
@MrScorpianwarrior Год назад
Oooohhh. You're absolutely right and not once did that occur to me! Imagine if that didn't happen and everywhere you looked was the same loading screen...
@merryjerry69
@merryjerry69 Год назад
@@MrScorpianwarrior how to get motion sickness lol.
@methejuggler
@methejuggler Год назад
I'd imagine that a lot of the edge stretching could be mitigated by rendering slightly more than is displayed on the screen, so there's a bit extra to use when turning before having to guess
@Blancdaddy
@Blancdaddy Год назад
that's an interesting thought. this would bring us back to the age of overscan i feel lol
@RyoLeo
@RyoLeo Год назад
@@Blancdaddy reject dlss, return to overscan
@LasticDJ
@LasticDJ Год назад
I think I remember 2kliksphilip talking about/showing this in his video: just have the part just outside your FOV rendered at a lower resolution and use that instead of most of the stretching, since you can't see the detail there anyway.
@odinsplaygrounds
@odinsplaygrounds Год назад
I just commented something similar. Just have it render out 10% extra which is cropped off by your display anyway, so whatever "stitching" it's doing is outside your view. Would love to see this. Combine that with foveated rendering, so the additional areas rendered outside view is lower resolution.
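The overscan idea in this thread — render a margin beyond the visible FOV so reprojection has real pixels to pull in instead of stretching the edge — can be sized with a little trig. A sketch (function names are made up for illustration): for pure yaw, tolerating up to `max_yaw_deg` of camera rotation before gaps appear means widening the rendered FOV by twice that angle.

```python
import math

def overscan_fov(display_fov_deg, max_yaw_deg):
    """Horizontal FOV to render so a reprojection of up to
    max_yaw_deg of yaw still has real pixels at both edges."""
    return display_fov_deg + 2 * max_yaw_deg

def extra_pixel_cost(display_fov_deg, max_yaw_deg, width_px):
    """Approximate extra pixel columns for a planar projection:
    image width scales with tan(fov / 2), so the cost grows
    faster than the angle does."""
    t0 = math.tan(math.radians(display_fov_deg) / 2)
    t1 = math.tan(math.radians(overscan_fov(display_fov_deg, max_yaw_deg)) / 2)
    return int(width_px * (t1 / t0 - 1))
```

For a 90° view at 1920 px wide, covering just 5° of yaw costs a few hundred extra columns, which is why the thread's suggestion to render that margin at reduced resolution (foveated-style) makes sense.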
@Blancdaddy
@Blancdaddy Год назад
imagine the longevity of GPUs if this technology ever becomes standardized. man.
@saquial
@saquial 11 месяцев назад
That really explains some quirks of the Quest and streaming, especially how, when something's loading, you can move your head freely and the VR picture stays still in space as a single frame, just like you showed here.
@SirDragonClaw
@SirDragonClaw Год назад
I tried to build something like that demo a few years ago, but I was trying to use motion vectors + depth to reproject my rendered frame, which I never got to work correctly. In my engine I rendered a viewport larger than the screen to handle the issue with the blackness on the edges, and then was going to use tier 2 variable rate shading to lower the render cost of the parts beyond the screen bounds. But VRS was not supported in any way on my build of Monogame, which is what my engine was built upon, so that was another killer for the project. I'm so glad that Philip popularised the idea, and it's awesome that someone else managed to get something like this working. How he did it in one day I will never know; I spent like 3 weeks on it and still failed to get it working correctly.
@Lunch-b0x
@Lunch-b0x Год назад
been watching 3kliks for years. I'm glad he's getting some recognition.
@TehF0cus
@TehF0cus Год назад
2kliks
@GreenzQe
@GreenzQe Год назад
@@TehF0cus kliks
@veganssuck2155
@veganssuck2155 Год назад
@@TehF0cus same person
@nbt1254
@nbt1254 Год назад
@@TehF0cus but don't confuse him with his evil brother kliksphilip
@2opmataron991
@2opmataron991 Год назад
@@veganssuck2155 Its a joke....
@matiitam111
@matiitam111 Год назад
On the other side: this is a static scene, with no animated textures, no characters moving around, no post-process effects, no particles, etc. Porting this to a modern game would be similar to what Assassin's Creed Syndicate (or a later one, I don't remember) did with clothing physics: capped it at 30fps while the game ran at 60. The effect would look similar to what modern games do to animations when characters are too far away for the game engine to update them as frequently as the game's current fps. So I'm skeptical. Also, nice GPU you got there, can't wait for the review ;)
@jacobc5747
@jacobc5747 Год назад
it is effective in VR games designed with it, so it will probably be effective on normal PC games if it's kept in mind.
@kazahesto
@kazahesto Год назад
Yeah, this isn't new tech. Pretty much every web browser does something similar when scrolling or zooming, where most content is static, and it looks terrible when a heavy webpage tries to do parallax scrolling on an underpowered system. The whole "nobody thought about it" angle in this video is strange and patronising.
@blackstar-genX
@blackstar-genX Год назад
Whoever's PC y'all used to show the games list at the start: nice to see a man of culture. 0:55
@JakalairVG
@JakalairVG Год назад
This was just fascinating to see. I love it when a tech solution is used in a new way.
@ruix
@ruix Год назад
Glad seeing Philip getting bigger every day. He's amazing
@RocketSlug
@RocketSlug Год назад
Having just started getting into VR, I only recently learned what asynchronous reprojection is. Really cool to see it getting mentioned, because when I heard about it, it seemed like what DLSS 3 wanted to do, only it's been here for quite some time already. Your description of how it decouples player input from the rendering makes me think of rollback netcode for fighting games and how that also decouples player input and game logic. I'm really excited for what that means for the player experience.
@davidbakersound
@davidbakersound Год назад
I’ve never understood why this hasn’t been done before. I’ve thought it should be done since 2016 when I got my VR headset. Like you said, extremely obvious!
@chrisc1140
@chrisc1140 Год назад
One thing I wondered about when I first saw that video is whether the PERCEIVED improvement is good enough that you could afford to lose a couple more frames in exchange for rendering a bit outside the actual fov, but at a really low resolution. Basically like a really wide foveated rendering. It would give the warp a little more wiggle room before things start having to stretch.
@gamebuster800
@gamebuster800 Год назад
Cloud gaming would be great with this! You handle the reprojection locally and use the delayed frames as a source. It will basically eliminate the input lag.
@thewhywhywhy4302
@thewhywhywhy4302 Год назад
This is a sick idea
@MrMoon-hy6pn
@MrMoon-hy6pn Год назад
You would still need to send a depth buffer and probably other information to the computer playing the game. That means more load on the internet connection, but it still sounds interesting.
@khhnator
@khhnator Год назад
Not that it will do nothing, but it will do less than you think. Even if it works (and I'm not sure it does, at least not in this form), the time until you receive a frame that fills the stretched gaps you just created by moving the camera is much higher than on a local computer, which fills that gap with the next frame ~33 ms later if you're running at 30fps. So you might get super smooth camera turning, but the time to shoot, jump, etc. will still be the same. Heck, because of the bigger disparity between the camera and everything else, it might even worsen the experience instead of improving it.
@gamebuster800
@gamebuster800 Год назад
@@khhnator You're right, but the latency for cloud gaming is already not that high. The most noticeable effect at latency is moving the mouse to look around.
@fayenotfaye
@fayenotfaye Год назад
This is already done with VR cloud gaming services, when you use oculus air link, you’re basically doing the same thing but over LAN. If you drop a frame, you can still move your head around and it’s perfectly playable all the way down to 30 fps for most games.
@odytrice
@odytrice Год назад
I literally did a spit take at 6:01 Now I have coffee all over my keyboard 😂😂
@saart2212
@saart2212 Год назад
Now that's a public interest video! Raising awareness of this technique will certainly go a long way, especially in open source. I hope the manufacturers don't shy away from it out of fear that it would diminish interest in their high-end GPUs.
@dnitz9608
@dnitz9608 Год назад
High-end? What are you talking about? They could make AAA games run at 8K 240fps without a 6-slot GPU.
@prasunbhuin3259
@prasunbhuin3259 Год назад
This is one of the reasons why I love LTT: spotlighting new and innovative technologies that could revolutionize the industry. Not only handheld consoles but mobile games and even cloud gaming could see significant improvements from this.
@Adam-em1mf
@Adam-em1mf Год назад
I knew about this because of my Oculus Rift, and as you mentioned, in racing games asynchronous spacewarp (as Oculus calls it) is quite noticeable; moving your head around while driving at 100 mph can be quite jarring. But Oculus updated the feature and the visual bugs aren't as noticeable now. It's quite interesting to see how this works. Excellent video, guys.
@grumbel45
@grumbel45 Год назад
With the latest "Application Spacewarp" on Quest2 the games can now send motion vectors, so the extra-polated frames no longer have to rely on so much guesswork.
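The motion-vector idea above can be made concrete with a toy 1-D sketch (illustrative only, not Meta's actual algorithm, which works on 2-D images with depth ordering): each pixel is scattered forward along its game-supplied motion vector to synthesize the in-between frame, and any cell nothing lands on is a disocclusion hole the runtime still has to fill by guesswork.

```python
def spacewarp_1d(colors, motion, alpha=0.5, hole=None):
    """Synthesize a frame a fraction alpha of the way to the next
    real frame. colors: pixel values of the last real frame;
    motion: per-pixel velocity in pixels/frame (from the engine)."""
    out = [hole] * len(colors)
    for x, c in enumerate(colors):
        nx = round(x + motion[x] * alpha)  # where this pixel is headed
        if 0 <= nx < len(out):
            out[nx] = c   # note: no depth ordering in this sketch
    return out
```

With real motion data the moving content lands in the right place instead of being stretched, which is why engine-integrated variants look better than pure image-space warping.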
@Dimondminer11
@Dimondminer11 Год назад
Yeah, the visual artifacts can actually cause MORE issues in VR than not, at least in some very specific games. It's not super noticeable in VRChat, but in the Vivecraft mod for Minecraft the screen turns into a wavy, smeary mess. I actually hated that WORSE than running at 40fps natively, which is what my system could do with my settings at the time.
@joshuacook2
@joshuacook2 Год назад
Your demo should really have included an animated object. That would have shown some serious limitations that would also be present in nearly all games.
@sandmaster4444
@sandmaster4444 Год назад
Seriously though.
@MrPhillian
@MrPhillian Год назад
I keep seeing this concern, and while it might be true at low FPS, I don’t think that’s really where it would be aimed. I’d imagine most would still aim for 60+ rendering. Keep in mind that VR headsets are already using this method and are they experiencing these problems? I legitimately don’t know.
@matsv201
@matsv201 Год назад
It probably wouldn't have been as obvious as you might think, at least not at 30fps. The reason 60 or even 120fps looks so obviously faster has to do with persistence of vision when you move the mouse, but you can't see that with animation. Have you ever been to the cinema? 24fps... yes, 24. Do you think cinema is choppy? IMAX has 48fps.
@amysteriousviewer3772
@amysteriousviewer3772 Год назад
@@matsv201 Movies don’t look or feel choppy because there is natural motion blur to everything and you also don’t control anything in them. It’s an apples to oranges comparison. Also IMAX doesn’t “have” 48 fps, IMAX is simply a format. A movie has to be shot at that framerate to display in that framerate. If it’s shot at 24 then it will be 24 in IMAX.
@TOGSolid
@TOGSolid Год назад
@@MrPhillian Depends on the game and what sort of post-processing is going on, in my experience. It can work well, but if there are a bunch of fancy effects going on, it can be very noticeable that the smoothness is being faked.
@jabadahut50
@jabadahut50 Год назад
It really is amazing how fast game technology is improving now that we've hit diminishing returns on fidelity; we're starting to see a wider gamut of techniques being reached for to make a certain experience better, using techniques others have used for a while in different ways.
@chriszuko
@chriszuko Год назад
Things like particle FX and transparency are where the issues really show up, not just at the edges. A spinning rocket, for instance, may warp in the inner sections where the fins were in the previous frame, especially if you pause the game, since the predicted velocity stays the same but the object has stopped moving. They've solved this for Oculus spacewarp by calculating velocities per object for better prediction, but that requires the game engine to support it on a per-object basis for all materials and edge cases. We wish we could have used this on Mothergunship: Forge, but had to disable it due to the visible warping. It has the potential to be an absolute savior though, since trying to hit 72 or 90 fps on basically a phone attached to your face is a huge performance challenge.
@QuaziInc
@QuaziInc Год назад
I feel like I just had my mind blown wide open at the possibilities. This is one of my favorite ltt videos. Its hard for me to find such a technical concept so well explained. well done.
@elin4364
@elin4364 Год назад
Something worth noting is that Comrade Stinger's demo does not really do what they say it does (mostly because of how Unity is made being pretty incompatible with this sort of demo). The GPU draws the entire frame during the last frame, so work is NOT split up over several frames. Doing that would be a pretty complicated task in existing game engines like Unity.
@comradestinger
@comradestinger Год назад
This! The demo only distorts the *simulated* bad framerate from the slider. If you ran the demo with an actual bad framerate, it would just lag like normal. To actually implement it properly is much harder than what I did, in unity's case might require some severe shenanigans, or straight up engine modification.
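Structurally, the "proper" implementation Comrade Stinger describes looks something like this sketch (illustrative Python, not Unity code): one thread renders real frames as slowly as it must, publishing each finished frame together with the camera pose it was rendered from, while a separate present path runs at display rate and warps the newest frame by however far the camera has moved since.

```python
import threading, time

class AsyncReprojector:
    """Toy structure of an async-reprojection loop. The render
    thread is slow; present() is called at display rate and warps
    the latest finished frame toward the current camera."""
    def __init__(self):
        self.lock = threading.Lock()
        self.latest = None  # (frame_id, camera_pose_at_render)

    def render_loop(self, n_frames, render_time, get_camera):
        for i in range(n_frames):
            cam_at_render = get_camera()   # pose this frame is drawn from
            time.sleep(render_time)        # stand-in for slow GPU work
            with self.lock:
                self.latest = (i, cam_at_render)

    def present(self, get_camera, warp):
        """Runs every display refresh, independent of render rate."""
        with self.lock:
            snap = self.latest
        if snap is None:
            return None                    # nothing rendered yet
        frame_id, cam_then = snap
        # warp by how far the camera moved since the frame was drawn
        return warp(frame_id, get_camera() - cam_then)
```

The hard part in a real engine is everything this sketch waves away: getting the depth/color buffers to the warp pass, scheduling it with priority on the GPU, and keeping the simulation's camera pose available off the main thread.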
@TrackmaniaKaiser
@TrackmaniaKaiser Год назад
@@comradestinger Do you think that something like that could be a driver feature like the DLSS2 Stuff. Where the GPU gets some motion vectors and shifts existing objects more or less like sprites arround until a new real frame got created?
@comradestinger
@comradestinger Год назад
@@TrackmaniaKaiser I think both could work, though I lean towards it being done by the devs themselves rather than by the driver. Since games vary so much, different scenes and camera modes would benefit/suffer from the effect in different ways. To be honest, it's all very complicated.
@devzozo
@devzozo Год назад
Wonder if using DOTS and scriptable render pipeline would allow for it, can't imagine figuring all that out in an evening though. I wouldn't trust a solution that leverages Unity's undersupported APIs to be that stable though...
@leongao5120
@leongao5120 Год назад
@@comradestinger Good work man
@happysmash27
@happysmash27 Год назад
Asynchronous reprojection is great in VRChat (in PCVR on my relatively old PC), where I usually get 15 fps and often far less than that! I don't really mind the black borders much in that case, especially since the rendered view usually extends a bit past my FOV, so they only appear when things are going _extremely_ slow, like a stutter or any time I'm over 0.25 seconds per frame. So perhaps another way to make the black bars less obvious would be to simply increase the FOV of the rendered frames a little so there's more margin. It would mean lower frame rates, but it might be worth it in any case where the frame rates would be terrible anyway.
@SHINYREDBULLETS
@SHINYREDBULLETS Год назад
I'd be very interested in a followup on this in the form of interviews/queries with the likes of nvidia/amd/nintendo/steam as to whether this is something they're aware of/considering/etc! With ridiculous power-draw for graphics cards being accepted as necessary, this seems like a gigantic sidestep with machine learning assistance to amazing benefits for end-users!
@Yalden_
@Yalden_ Год назад
Holy shit, Philip MADE IT
@waybove
@waybove Год назад
I think we just witnessed one of those rare moments when an elegant solution clicks and starts a revolution
@kazioo2
@kazioo2 Год назад
This video is poorly researched. Timewarp was invented by John Carmack and described in his post "Latency Mitigation Strategies" in early 2012, more than 10 years ago. His original article already mentioned games other than VR. I remember seeing normal desktop demos many years ago, but it never gained traction despite that.
@ScorgeRudess
@ScorgeRudess Год назад
I have been in VR for almost 3 years, and as soon as meta implemented ASW and ATW, I wanted this for PC Games... I have been and still waiting for this, for years
@EritoKaio
@EritoKaio Год назад
This is pretty awesome, i can totally imagine this working together with something like DLSS in the future, exciting.
@50REN
@50REN Год назад
I have said for a very long time that when it comes to refresh rate, I don't mind lower frame rates from a visual standpoint, but the input delay is more what I love about high refresh rate gaming. I'm excited to see where this technology goes.
@ValenteXD
@ValenteXD Год назад
I remember watching 2klik's video last month and saying wow this is amazing and mind blowing, but thought I was just excited for it because I'm a programmer, guess not
@harrasika
@harrasika Год назад
I was also excited for it but thought nothing would come of it since I've only ever seen him talk about it. Now perhaps there's a chance of this actually becoming popular and coming into games.
@samudec5134
@samudec5134 Год назад
It's huge for low/mid-range setups to make games more responsive, but it's also nice for high-end machines, because you'd largely negate the impact of 1% lows and feel like you're always at your average.
@MadmanLink
@MadmanLink Год назад
You guys were so young and fresh faced back in the early VR days!
@user-hk3ej4hk7m
@user-hk3ej4hk7m Год назад
The main issue with these workarounds is that they depend on the Z buffer, they break down pretty quickly whenever you have objects superimposed like something behind glass, volumetric effects or screen space effects
@lucky-segfault4219
@lucky-segfault4219 Год назад
Ya, that sounds like it could be a big issue...
@DavidGoodman
@DavidGoodman Год назад
You technically only need the depth buffer for positional reprojection (eg. stepping side-to-side). Rotational reprojection (eg. turning your head while standing still) can be done just fine without depth, and this is how most VR reprojection works already, as well as electronic image stabilization features in phone cameras (they reproject the image to render it from a more steady perspective). It might sound like a major compromise but try doing both motions, and you'll notice that your perspective changes a lot more from the rotational movement than the positional one, which is why rotational reprojection is much more important (although having both is ideal).
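The comment above can be made concrete: for pure rotation, reprojection is just a 3×3 homography H = K·R·K⁻¹ applied to the old image, with no depth buffer involved (K is the usual pinhole intrinsics matrix). A sketch with hand-rolled matrices — the sign/direction convention for the rotation is an assumption for illustration:

```python
import math

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def yaw(theta):
    # rotation about the vertical axis (camera looks down +z)
    c, s = math.cos(theta), math.sin(theta)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def intrinsics(f, cx, cy):
    return [[f, 0, cx], [0, f, cy], [0, 0, 1]]

def inv_intrinsics(f, cx, cy):
    return [[1 / f, 0, -cx / f], [0, 1 / f, -cy / f], [0, 0, 1]]

def rotational_homography(f, cx, cy, dtheta):
    """H = K * R * K^-1: maps a pixel of the old frame to where it
    lands after the camera yaws by dtheta. No depth needed."""
    return mat_mul(intrinsics(f, cx, cy),
                   mat_mul(yaw(dtheta), inv_intrinsics(f, cx, cy)))

def apply_h(h, x, y):
    px = h[0][0] * x + h[0][1] * y + h[0][2]
    py = h[1][0] * x + h[1][1] * y + h[1][2]
    pw = h[2][0] * x + h[2][1] * y + h[2][2]
    return px / pw, py / pw   # perspective divide
```

Positional (translational) reprojection is the case that needs per-pixel depth, because how far a pixel shifts then depends on how far away its surface is.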
@splatlingsquid4595
@splatlingsquid4595 Год назад
I absolutely love the recognition that kliksphilip and his brothers have been getting. It really is an amazing idea and would make everything so much better!
@Lord.Chadsworth
@Lord.Chadsworth Год назад
Omfg, ploufe is never living down that "I own a monitor" comment. Genuinely the best thing that's happened on LTT for a while.
@digitized_fyre
@digitized_fyre Год назад
The coffee guy, the tech news guy, hi, he owns a display, Mark. Such incredible descriptions of the people and their roles
@jonasmostert3294
@jonasmostert3294 Год назад
I think one caveat here, which has not been mentioned, is that dynamic objects in the focus/center of the screen will also only be updated by whatever frame rate your GPU allows. I wonder how to handle those scenarios. Still a very worthwhile improvement for a lot of games, for sure!
@99domini99
@99domini99 Год назад
From my experience with VR, although while looking around is perfectly smooth, animated characters on the screen update slower.. But that really isn’t much of a problem. There is no input lag from your HMD, you can look around perfectly fine and the 45fps the headset makes when reprojecting is still smooth enough to track targets. The only real caveat is the input lag from your controllers. Moving your hands will feel less responsive when reprojecting than when running native framerate. I wonder how this will carry over in desktop reprojection.
@lightphobe
@lightphobe Год назад
I'm curious if you'd still get the same results if you add moving objects into the scene. Since the objects update their position at the true frame rate, I bet they would look super choppy.
@NichtDu
@NichtDu Год назад
Yeah, you're right. If you've played the Teardown lidar mod, this is kind of the same (at least the mod has the same downsides as this). Honestly though, it can't get any choppier, because the framerate stays the same. The examples were with really low framerates, but if you had your normal 120 fps and a 360 Hz monitor, this technology makes a big difference.
@R3BootYourMind
@R3BootYourMind Год назад
Some games already separate physics fps from rendering fps.
@DMitsukirules
@DMitsukirules Год назад
The thing is, you always want consistent input no matter what. The alternative in your scenario is the objects are still choppy, but your mouse movement is also sluggish
@timmbruce99
@timmbruce99 Год назад
I'm actually surprised LTT got a demo version without the moving objects (or at least didn't show them). 'Cause yes, animation still looks choppy according to the fps cap, but the perception of moving the camera is smoother than butter 0-0
@comradestinger
@comradestinger Год назад
The latest version of the demo has moving objects, so you can see for yourself. (they look laggy, moving at the true framerate, as expected) x)
@mda187
@mda187 Год назад
Glad Linus mentioned the application of this to the Steam Deck at the end. One of the best things to do for a lot of games is to cap your FPS at 30-45 on the steam deck to get more battery life with a manageable amount of compromise. In games where using this technique makes sense, it would be amazing. I hope it's not something they have to wait until version 2 to implement though.
@meowmix705
@meowmix705 Год назад
It would be the biggest of ironies if the Steam Deck featured native reprojection for its games. Valve was originally against asynchronous reprojection tech back when Oculus released it with the help of John Carmack. At the time, Valve even wrote some tidbits about how "fake frames are bad, and devs just need to optimize better". Valve eventually wised up and adopted their own version of reprojection for SteamVR (not the first time they were proven wrong; Valve initially only wanted teleport locomotion in VR and called free locomotion bad). The most recent iteration of async reprojection from Oculus/Meta meshes reprojection + game motion vector data + machine learning, called Application Spacewarp (AppSW). A recent title with a good example of it in action is Among Us VR on the Quest 2.
@jdkap201
@jdkap201 Год назад
Another technique to "double" your FPS that I know of is combining the last and the next frame into a frame in-between. To my knowledge the most common application was Bink Video on the Nintendo DS, where video sequences were stored as a 15fps stream and dynamically re-encoded on the fly with this method to make them seem like 30fps. It relies on fooling your eyes into perceiving the motion as smooth. In this specific case it allowed more space to be saved for game data. There have also been applications for stop-motion videos, especially with newer AI technology that may use this method as part of making stop-motion video smooth.
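In its most naive form, the "combine last and next frame" trick described above is just a per-pixel linear blend (a sketch for illustration; real interpolators like Bink's are motion-compensated rather than plain cross-fades, which is why they look less ghosty):

```python
def interpolate_frame(prev_frame, next_frame, t=0.5):
    """Synthesize the frame a fraction t of the way from prev to
    next by blending pixel values (rows of grayscale values here).
    t=0.5 is the halfway frame that turns 15fps into 30fps."""
    return [[(1 - t) * a + t * b for a, b in zip(pr, nr)]
            for pr, nr in zip(prev_frame, next_frame)]
```

Note the latency cost: you need the *next* frame before you can show the in-between one, which is fine for video playback but is exactly why interactive techniques prefer extrapolation/reprojection instead.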
@shawn2780
@shawn2780 Год назад
The issue with asynchronous reprojection is that with complex scenes or fast action it creates visible artifacts and weirdness. This is where AI comes in, like DLSS 3 frame generation: by using deep learning, it can insert additional frames more accurately and realistically. That's really the way I see the future going, along with AI upscaling. It has to be, otherwise we're going to need a nuclear reactor to power the future RTX 7090 or whatever.
@miikahweb
@miikahweb Год назад
Even bigger issue is that in VR games (where the camera can just move through walls if you move your head in the wrong place) the only thing async timewarp has to do is take the latest headset position and reproject there. However, in a regular game you can't just take keyboard/controller input and reproject to a new position based on that. Or your character would go through the floor, walls or obstacles in the map. Instead you would have to run full collision detections and physics simulations in order to tell where the camera is supposed to be in the reprojected frame. This not only makes it massively harder to implement compared to VR where it can be automatic. But it also increases the chance of hitting a CPU bottleneck and not gaining that much performance anyway. Then when you combine visual artifacts and other issues you start seeing why game developers haven't used their development time to implement this before.
@meowmix705
@meowmix705 Год назад
Oculus/Meta is already doing that with their 3rd generation reprojection tech - Application Space Warp (AppSW). It meshes Reprojection tech with game engine motion vector data w/ a splash of Machine Learning to generate the best looking VR Reprojection yet. The recently released Among Us VR on the Quest2 is a good example of a game using the latest AppSW techniques. All thx to John Carmack, the grand daddy of Asynchronous Time Warp (ATW; the first mainstream Asynchronous Reprojection)
@shawn2780
@shawn2780 Год назад
@@meowmix705 I don't play enough Quest 2 games to be able to speak on AppSW (I play PC through Link, and admittedly I almost always disable regular ASW because of the inherent visual ghosting). Carmack may be the GOAT... I also get a kick out of the fact that he's probably the most reluctant Meta employee ever, but he sticks around because he just loves VR that much.
@meowmix705
@meowmix705 Год назад
@@shawn2780 Yup, Carmack is the Goat. As to PC-ASW, the Rift mostly uses 1.0 of ASW (has the typical ghosting visual artifacts). ASW 2.0 is only used for a handful of games (greatly diminished the ghosting, but required depth data from the game; many games did not support it). ASW 3.0 is unfortunately a Quest exclusive (for now?), they've rebranded it as AppSW to not confuse it with PC-ASW.
@JoshsBookishVoyage
@JoshsBookishVoyage Год назад
This is interesting. I'd love to see this feature compared to actual higher fps to see whether the perceived gain renders an actual competitive advantage.
@ScorgeRudess
@ScorgeRudess Год назад
7:06 I love how editors put: "he owns a display"
@nemysisretrogaming3771
@nemysisretrogaming3771 Год назад
Async reprojection and all other implementations of it, such as motion smoothing for VR, have always had one major flaw: rendering stationary objects against a moving background. The best examples are driving and flying titles such as MSFS and American Truck Simulator. The cab/cockpit generally doesn't change distance from the camera/player view, so when the scenery goes past, the parts of the cockpit exposed to the moving background start rippling at the edges. This is one of the reasons async reprojection is not used much in VR anymore, and why motion smoothing is avoided as well. And besides, we're talking about two different technologies, DLSS vs async repro: one is designed to fill in frames and the other is an upscaler. Not really an apples-to-apples comparison!
@c6m
@c6m Год назад
Whoa 2kliksphilip getting a shoutout on LTT before any of his brothers, imagine.
@paulwolff2121
@paulwolff2121 Год назад
I was super excited when Unity released the decoupled-input-from-rendering feature. I loved developing mobile apps with Unity, but the biggest drawback was always that they killed your battery suuuper quick, even if you were only rendering the main menu. This feature let you significantly reduce the frame rate in menus when there was no input from the user, and only raise it again when the user was scrolling or anything was animating. It was possible before, but you always had to wait for the next frame for user input to be detected, and thus you had lag. Didn't think about FPS at the time. Great use of that feature!
@paulwolff2121
@paulwolff2121 Год назад
Now if you could also use something like foveated rendering in non-VR games, that could mean even more battery savings for mobile games where only a few things are animated on screen. Like only rendering the parts of the screen that are animated and keeping the static parts. Well, maybe that's already possible... haven't done a mobile game in years. :D
@StormyDoesVR
@StormyDoesVR Год назад
As a huge VR fanatic, seeing the tech that makes standalone VR possible put to use on a flat screen game is amazing!
@MichaelBlock
@MichaelBlock Год назад
This actually seems interesting to try out, some games I play, even on a 2060 may struggle to have even just a stable fps, so perhaps this kind of thing will help for those kind of situations
@klayplayz
@klayplayz Год назад
Linus: It's for free Also linus: Makes no difference
@SelecaoOfMidas
@SelecaoOfMidas Год назад
Your point is?
@shib5267
@shib5267 Год назад
@@SelecaoOfMidas Makes no difference
@OrRaino
@OrRaino Год назад
It can make games smoother for low-end PC gamers. The PC market will discourage it, because if the tech were developed further for PC gaming, high-end GPUs would become irrelevant and mid-tier GPUs would be enough.
@semmu93
@semmu93 3 месяца назад
It would have been really nice to also do a very basic shooting accuracy test with each candidate in each case, but I know the demo app itself unfortunately didn't contain such a feature. It would have been REALLY interesting to see actual, hard numbers though.
@KingMuttley
@KingMuttley Год назад
This can make a huge difference to desktop gaming for sure, mostly in the budget and aging mid-tier builds, but can you imagine what this could do for the Steam Deck and other handhelds? Not just make games smoother, but the amount of battery life this could save would be a huge advantage. Would love to see how far this tech can be pushed and one day even become as widely adopted as DLSS and FSR
@GlorifiedGremlin
@GlorifiedGremlin Год назад
This could have insane potential for handhelds. The steam deck can usually pull at least 30 fps on new triple A titles, which seems to be perfect for making the picture much smoother
@RurouTube
@RurouTube Год назад
I've actually been thinking about this kind of thing ever since motion interpolation became common in TVs, plus I'm familiar with 3D (not real-time rendering, only offline). My thought was that since the engine already has motion data, depth, etc., that should be enough for a kind of in-game motion interpolation; except not really interpolation, but extrapolation to a future frame. Even without taking controller input into account, generating that extra frame from the previous frame's data should be enough to give that extra feeling of visual smoothness (you'd end up with roughly the same latency as the original FPS). And since it runs directly within the game, you could also account for controller input, AI, physics, etc. when creating the fake frames, for an actual latency benefit: basically the game logic runs at double the rendering FPS so the extra data can be used to generate the fake frames.
For the screen-edge problem, the simple fix is to overscan the render (or just zoom the rendered image a bit) so the game has extra data to work with. Related to this is the main problem with both motion interpolation and frame generation: disocclusion, i.e. something that wasn't in view in the previous frame coming into view in the current one. How can the game fill that gap when there is no data to fill it with? Nvidia, I believe, uses AI to fill those gaps, and even with AI it can look terrible. But as people using DLSS 3 have pointed out, you don't really notice it in motion, which is actually good news for non-AI solutions: if people don't see those defects in motion, then a non-AI fill (a simple warp or similar) should be good enough in most situations.
It also shouldn't need an optical flow accelerator. Nvidia uses optical flow to recover motion for elements that aren't represented in the game's motion vectors (like shadow movement), but in practice that's not important; most people won't notice if, in those in-between fake frames, a shadow moves with the surface it falls on rather than with its own motion. For a more advanced application, I'm thinking of a hybrid approach where most things are rendered at, say, half the FPS and the in-between frames reuse the previous frame's data to lessen the rendering burden. So unlike motion interpolation or frame generation, this approach still renders the in-between frames, just more cheaply: render the disoccluded parts, and maybe decouple screen-space effects and shadows so those run at full FPS instead of half. The game then alternates between high-cost and low-cost frames. When I first thought about this, AI wasn't a thing, so I didn't include any AI steps; now that it is, some of them could probably be done better with AI. For the disocclusion problem, for example, rather than rendering the disoccluded parts normally, you could render them with flat textures as a simple guide for the AI to match to the surrounding image, which might be faster.
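The extrapolation-and-disocclusion idea described above can be sketched in a few lines. This is a hypothetical toy, not code from the video's demo or any shipping implementation; it assumes the engine hands us per-pixel motion vectors, forward-warps the previous frame along them, and reports the disoccluded pixels (holes) that some fill strategy would then have to cover:

```python
import numpy as np

def extrapolate_frame(prev_frame, motion, fill=0):
    """Forward-warp prev_frame one step along per-pixel motion vectors.

    prev_frame: (H, W) grayscale image from the last real render
    motion:     (H, W, 2) per-pixel (dy, dx) motion supplied by the engine
    Returns the warped frame and a boolean hole mask marking disoccluded
    pixels (no source pixel landed there), left at `fill` in the output.
    """
    h, w = prev_frame.shape
    out = np.full((h, w), fill, dtype=prev_frame.dtype)
    covered = np.zeros((h, w), dtype=bool)
    ys, xs = np.mgrid[0:h, 0:w]
    # Destination of every source pixel, clamped to the frame.
    ty = np.clip(ys + motion[..., 0].round().astype(int), 0, h - 1)
    tx = np.clip(xs + motion[..., 1].round().astype(int), 0, w - 1)
    out[ty, tx] = prev_frame[ys, xs]
    covered[ty, tx] = True
    return out, ~covered  # holes = disoccluded pixels needing a fill pass
```

With a uniform 2-pixel rightward motion, the warp leaves a 2-pixel-wide hole column on the left edge, which is exactly the gap that overscan rendering (or an AI/warp fill) would have to cover.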
@LutraLovegood
@LutraLovegood a year ago
Interpolation into the future is called extrapolation.
@beseakos
@beseakos a year ago
The combination and judicious use of all these frame-generation, smoothing, and async reprojection technologies will be a glorious crossing point for bringing older console titles and popular MMOs to mobile. Can't wait to just bust out my phone and play games without having to stream them.
@TacticalFunnyMan
@TacticalFunnyMan a year ago
I thought this was VR when I first clicked because VR has had this since 2017 at least. Awesome to see this finally make its way to the flatscreen :D
@marvinvogtde
@marvinvogtde a year ago
I honestly think this might be really great for ultrawide and super-ultrawide gaming, as even more of the rendered frame is already in your peripheral vision, so the suboptimal edges will be even less noticeable.
@Imevul
@Imevul a year ago
As a dude with a 32:9 monitor, I'd settle for games actually supporting my monitor's resolution AND aspect ratio. Most games I have to play in windowed mode, because even if they let me go fullscreen (and don't add any black bars), the camera zoom is usually completely fucked and/or the UI elements are not properly positioned. Heck, even a newer game such as Elden Ring just gives me two 25% black bars on each side, yet with mods it actually supports 32:9. It wouldn't have been much work to support it by default: a checkbox for disabling the black bars and vignette, plus an option to push the UI elements out to the sides, and all would be fine. But for some reason, most developers don't even bother to support newer monitors with unusual resolutions.
@huttonberries768
@huttonberries768 a year ago
@@Imevul It did support it by default, but FromSoftware disabled it intentionally because of "competitive advantage" or some bullshit.
@marvinvogtde
@marvinvogtde a year ago
@@Imevul I also run a 32:9 display and feel you, but don't be surprised; when it comes to Elden Ring, FromSoft does not care about proper PC support. As for the UI thing, I actually can't think of a game off the top of my head that supports 32:9 without also offering an option to adjust UI elements; in my experience it's very common and has been a thing even on consoles 10 years ago.
@The1Radakill
@The1Radakill a year ago
Would love to see some analytics on image scaling, i.e. software vs. hardware (NIS vs. DLSS), various resolutions, the % variance between GPU-generated resolutions and monitor native, how effective they are, etc. Plus comparisons between Nvidia, AMD, and Intel technologies for all of the above.
@leonardobosnar7194
@leonardobosnar7194 a year ago
"In the early days of VR" - damn, that one hits hard to hear 5:51
@ShadowNexis
@ShadowNexis a year ago
It would be interesting to explore the increased rendering cost of running a higher internal resolution with an increased FOV, then hiding the edges of the rendered image to eliminate the screen-edge artifacting.
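The overscan math behind that idea is straightforward. A rough sketch (my own toy helpers, not from the video or any engine API): since half the image-plane width at unit distance is tan(fov/2), rendering `scale`× the visible width needs fov' = 2·atan(scale·tan(fov/2)), and at constant pixel density that costs a corresponding number of extra pixels per side:

```python
import math

def overscan_fov(fov_deg, scale):
    """Horizontal FOV needed so the render covers `scale`x the visible width.

    Half the image-plane width at unit distance is tan(fov/2), so scaling
    the plane by `scale` gives fov' = 2 * atan(scale * tan(fov/2)).
    """
    half = math.radians(fov_deg) / 2
    return math.degrees(2 * math.atan(scale * math.tan(half)))

def extra_pixels(width, scale):
    """Extra pixels per side at the same pixel density as the visible area."""
    return int(width * (scale - 1) / 2)
```

For example, a 10% overscan of a 1920-wide frame adds 96 pixels per side, and a 90-degree FOV with 1.2x overscan grows to roughly 100 degrees; the nonlinearity means wide-FOV games pay more for the same margin.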