3D Gaussian Splatting Examples - The Future of LIVE EVENT Capture? 🎥

Reality Check VR
23K subscribers
36K views

Published: 8 Sep 2024

Comments: 68
@wolf-mk9ik · 11 months ago
That's mind-blowing. If we can get AI to develop on-the-fly meshes to handle the motion and keep the textures coherent, you could automatically convert most pan shots into a fully 3D rendition of any scene captured, allowing you to walk into a movie as you're watching it and see it from any perspective shown in the footage.
@ThymeCypher · 10 months ago
The internet is weird. I'm just letting RU-vid autoplay videos then this comes on and I recognize the neighborhood. Hello, neighbor!
@RealityCheckVR · 10 months ago
Howdy!! 🤓🙏🌴 Hope you are enjoying the cool weather! 🙌
@Kobriks1 · 11 months ago
Holy shit! The 360 video is like a braindance from Cyberpunk. Amazing.
@parlindhe7573 · 4 months ago
Great crash course and update, thanks!
@Niyabrock1 · 11 months ago
Reminds me of Minority Report technology.
@Instant_Nerf · 11 months ago
It's actually even better, but yeah. I did something similar over a year ago using LiDAR and point clouds: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-UKAguEUb2Lc.htmlfeature=shared
@oliviercandito7791 · 6 months ago
Nice info, thanks! Some of your animations on the tennis court remind me of some of the photoreal 3D shots in Fight Club :)
@JanneWolterbeek · 11 months ago
Imagine how Google Street View would look in the near future. 😮
@DLAXTOX · 4 months ago
It is like a dream
@polytrauma101 · 11 months ago
Aaaaaand subbed! ^^ Thanks for the great info!
@Niberspace · 4 months ago
Amazing indeed, though I can't figure out how I will use this technology.
@Axodus · 10 months ago
It didn't even occur to me that we could use this for footage in the future, so you could fly around a real-world location in "real time" (probably offset due to the required rendering).
@antjones2281 · 8 months ago
We are getting beyond Blade Runner now.
@Trendish_channel · 6 months ago
Hi! Thanks! How do you train the scene from a 360 video?
@AirDigitalDroneServices · 11 months ago
Tremendous. Subbed.
@RealityCheckVR · 11 months ago
Thanks! Cool Channel, subbed back! 🤘
@CharlesVanNoland · 11 months ago
Gaussian is "gow-see-ann", even though it looks like "gawsh-ee-ann". Just remember "Gau" rhymes with "Tau" (which rhymes with "cow"). The name Gauss rhymes with house.
@RealityCheckVR · 11 months ago
so it's like House-In Splats! Gaussian Splats! NICE 😎
@leandrosn962 · 11 months ago
Loved the video! I wasn't aware that we could do the training with an old GPU! I'd read on a GitHub repo that the minimum requirement was 24 GB, so I was completely disappointed hahahah. Please do the tutorial you mentioned! Subscribed =)
@RealityCheckVR · 11 months ago
I'll have some time this coming week! Which GPU are you running? The SIBR viewer was where I was getting stuck. I had to use Nerfstudio because SIBR requires CUDA Compute Capability 7.0+, whereas my 1080 Ti is only 6.1 (developer.nvidia.com/cuda-gpus).
@leandrosn962 · 11 months ago
@@RealityCheckVR I'll upgrade to a 40xx-series card; I'm currently on a 1060 6 GB. I wanted to go with a 4070, which would work well with my current setup. If I had to go with a 4090 I'd need to upgrade the power supply, the case, and most likely add water cooling... it really adds up. But I really want to dive into this new tech, so I'm uncertain about the investment.
@SakakiDash · 3 months ago
Boys we made it
@gordo8189 · 10 months ago
GOW-SAY-YAN
@ThomasAuldWildlife · 11 months ago
SUBBED!! Caleb, how do you think video-making will progress? By using multiple cameras, generating a Gaussian splat for each frame of synced video, and then somehow "playing the 3D environment forward", or some other way?
@RealityCheckVR · 11 months ago
Surely there will be advancements that change the workflow as things get better and ideas are shared. Still, as a thought exercise it's fun to think about how it might work. For example, a basketball game could have hundreds of little cameras embedded around the arena, as well as LiDAR sensors, making the process even more accurate. Each camera would be mapped out virtually so that the software understands how the images overlap and where objects sit in 3D space; basically, the COLMAP process would be done ahead of time. The system would have to account for moving objects with a simple timeline feature. It takes my PC about an hour to do 30,000 iterations on a good-sized area. I assume you wouldn't need that many iterations and could probably get a decent-looking picture after only a few if the tech is advanced enough. The static areas could all be baked in, and the processing power could be spent on the live-action movement. As long as the mainframe is powerful enough, it could hypothetically power everything in real time and offload proxy data for everyone to jump in and fly around on their cellphones. I'm sure I overlooked some things in my explanation, and maybe it would be much simpler. The point is, we can do this right now, with enough time to simply integrate it!
@jcd-k2s · 11 months ago
I think it would be great for creating content for autostereoscopic screens, or VR obviously.
@ysy69 · 10 months ago
Very cool. Would you care to walk us through the process of capturing (smartphone?) and processing the captured media? Have you made a tutorial on installing and using Nerfstudio?
@RealityCheckVR · 10 months ago
I'm going to try to make some videos showing the process ASAP! Wish I had more time to make content; unfortunately work takes over.
@mousatat7392 · 11 months ago
Are you kidding me? Nerfstudio doesn't support Gaussian splatting. How did you do that?
@RealityCheckVR · 10 months ago
I'm going to make some more videos on it soon! Any luck with it yourself?
@yurcchello · 10 months ago
This looks like a scene from a sci-fi detective story where the guys are trying to solve a crime from camera recordings.
@RealityCheckVR · 10 months ago
no doubt 😅
@Niberspace · 4 months ago
Feels like that movie Déjà Vu.
@cristinapandrea9000 · 10 months ago
Very thorough video! Caleb, how do you think bandwidth requirements for end users will change with this type of volumetric imagery? Where do you see the rendering happening, and can user devices handle it?
@RealityCheckVR · 10 months ago
Thanks for checking it out! I think we will have service providers with a master feed that users can "plug into" and control wirelessly. All future decisions are based on technical feasibility combined with "LET US HAVE ALL OF YOUR DATA!" 😅
@TwoThreeFour · 11 months ago
Imagine what the government will do with this kind of technology 🥰🥰
@andrelip1 · 11 months ago
Can you visualize that in VR?
@RealityCheckVR · 11 months ago
Yep! You can toss it into Unreal or Unity and use it with VR HMDs. There are even a few social apps that allow direct import.
@VladiDeVasca · 10 months ago
You can already move around the field wherever you want without Gaussian splatting!!!
@h.a.9880 · 10 months ago
So, how long until we get some Gaussian Splatting smut? Asking for a friend.
@RealityCheckVR · 10 months ago
4D Gaussian splats are already a thing! Just gotta make them easier to create and view now.
@Niberspace · 4 months ago
Um... any updates regarding this? My friend is nagging me for updates.
@Instant_Nerf · 11 months ago
I think what would get rid of all the fog is being able to capture full volumetric video without having to stay steady the whole time while going around the subject.
@RealityCheckVR · 10 months ago
You can clean up the scenes in Unity or Unreal, no problem.
@johnterpack3940 · 11 months ago
Does this have any relevance to games, other than maybe digitizing real settings?
@Cuckons · 11 months ago
It can be used with UE5 now, but there's a long way to go before you can efficiently make an actual game with it (collision, getting competitive results when responding to lighting, etc.).
@RealityCheckVR · 11 months ago
I believe this will be automated very soon; there is still a lot of refinement to be had, and right now we have no standardization. Exciting times for sure!
@gblargg · 11 months ago
What about feeding it movie footage, especially 3D movies?
@RealityCheckVR · 11 months ago
Currently the training as well as the viewers are coded for still images. They will need to create a timeline viewer that can update and store data points so that you can view video in real time. We have all of the building blocks to make this tech possible right now; we simply need a venue with the right person to get the first use case up and running, and others will surely follow suit. Or, more likely, some company will perfect the tech and run around installing it as fast as possible before the copycats show up.
EDIT: I realize you probably meant simply using movie footage to recreate certain areas, and YES, you can absolutely do this, especially if the footage shows off a scene without cuts and people stay still throughout. It would work great, I imagine. Exciting times!
@gblargg · 11 months ago
@@RealityCheckVR This requires a really beefy computer and GPU, right? Or not even run on a PC, but cloud computing? Trying to get an idea of how far off this is from everyday use.
@RealityCheckVR · 11 months ago
@@gblargg I run this on a 1080 Ti GPU from years ago in combination with an i7-4790K CPU. I can create a Gaussian splat in about an hour at 7k iterations.
@StefanReich · 11 months ago
@@RealityCheckVR So the process just needs to be sped up 90,000 times to work for live video 😄
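The 90,000x figure checks out as back-of-envelope arithmetic, assuming roughly an hour per splat (from the reply above) and a 25 fps frame rate for "live video" (an assumed round number, not stated in the thread):

```python
# Back-of-envelope check of the 90,000x figure: if one splat takes
# about an hour to train, and live video needs a fresh reconstruction
# every frame, the required speedup is train_time / frame_time.
train_seconds = 60 * 60      # ~1 hour per splat (from the thread)
fps = 25                     # assumed frame rate for "live video"
frame_seconds = 1 / fps
speedup = train_seconds / frame_seconds
print(f"{speedup:,.0f}x")    # 90,000x
```

At 30 fps the same arithmetic gives 108,000x, so the joke holds up either way.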
@michaelsmith2785 · 11 months ago
Gow-see-un
@RealityCheckVR · 11 months ago
no no no, it's [GOW] + [SEE] + [UHN] 😅
@KadajinGaming · 10 months ago
WOW, your face was WAY too close :D
@RealityCheckVR · 10 months ago
I'm getting closer next time! 😘😎 lol
@DLAXTOX · 4 months ago
I wonder what the sex industry will do with this!
@robertfletcher8964 · 11 months ago
Gauss is pronounced "gowse" (gow like cow).
@meowl9329 · 10 months ago
What if Google Maps used this?
@RealityCheckVR · 10 months ago
Soon it will be "Google Live Map", with satellite data showing us 4D Gaussian splats 😎
@jpjpomxyz2319 · 4 months ago
Please remove this painful background music. It is worthless and ruins the quality of the video. Thank you.
@pingpong1727 · 11 months ago
This is not AI
@RealityCheckVR · 11 months ago
How so? Everything the program is doing, from the COLMAP step to the training, is done with new AI algorithms. What does AI look like to you? 🤨
@Niberspace · 4 months ago
@@RealityCheckVR bro thought it was a bunch of if-statements lol
@kevinoboyle8939 · 10 months ago
Gaw*See*In