
Gaussian Splatting! The next big thing in 3D! 

Olli Huttunen
13K subscribers
249K views

In this video, we embark on a fascinating exploration of Gaussian Splatting, a cutting-edge technique that brings a touch of magic to computer graphics and visualization! I recently took a look at this 3D technique and found it very interesting. It is quite hard to manage while it is still in development, but fortunately there are some instructions on the net.
Specs:
These samples were rendered with an Nvidia RTX 3070 graphics card (8 GB VRAM)
PC: Asus ROG, Ryzen 7, 64 GB RAM
Check out NeRF guru Jonathan Stephens' beginner guide here:
• Getting Started With 3...
#nerf #gaussiansplatting #lumaai

Film

Published: 6 Aug 2024

Comments: 375
@IRWBRW964 · 11 months ago
3D Gaussian Splatting is actually not a NeRF technology, as there is no neural network; the splats are directly optimized through rasterization, rather than through the ray-tracing-like method of NeRFs.
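As a rough illustration of the rasterization idea this comment describes, here is a toy 2D alpha-compositing sketch in Python/NumPy. All names and parameters are illustrative only, not the paper's actual CUDA implementation; it shows just the splatting step, not the optimization loop:

```python
import numpy as np

def splat_image(means, covs, colors, opacities, depths, h, w):
    """Alpha-composite 2D Gaussians onto an h x w image, front to back."""
    ys, xs = np.mgrid[0:h, 0:w]
    img = np.zeros((h, w, 3))
    transmittance = np.ones((h, w))  # how much light still passes each pixel
    for i in np.argsort(depths):     # nearest splat first
        d = np.stack([xs - means[i][0], ys - means[i][1]], axis=-1)
        inv = np.linalg.inv(covs[i])
        # Gaussian falloff: exp(-0.5 * d^T Sigma^-1 d) per pixel
        power = -0.5 * np.einsum('...i,ij,...j->...', d, inv, d)
        alpha = opacities[i] * np.exp(power)
        img += (transmittance * alpha)[..., None] * colors[i]
        transmittance *= (1 - alpha)
    return img
```

The front-to-back ordering and the running transmittance are the core ideas real 3DGS renderers share; the actual method additionally backpropagates through this rasterization to optimize the splats' parameters.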
@spyral00 · 11 months ago
Looks like it's a new way to display point clouds, am I wrong? Still amazing, and I have to try it!

@JB-fh1bb · 11 months ago
@spyral00 Right? I thought this Gaussian splatting technique was a new way to present the point data generated by NeRF.

@malfattio2894 · 11 months ago
Wow, it looks really damn good considering.

@Blox117 · 11 months ago
So it will be faster too.

@WWG1-WGA · 10 months ago
That means we can play even more with the neurons.

@jimj2683 · 11 months ago
Imagine Google Street View built with this. It could then be used in a GTA-type game with the entire world.

@ariwirahadi8838 · 11 months ago
You forget about Flight Simulator... it is generated from real map data.

@valcaron · 11 months ago
Grand Theft Auto: Frankfurt, but everything's all blurry.

@michaking3734 · 11 months ago
I bet in the next 20-30 years.

@florianschmoldt8659 · 11 months ago
There is no good way to use splatting with interactive light and shadow, or with animation. All the lighting is fixed together with the color information. So I guess this tech won't make it into gaming.

@strawberriesandcum · 11 months ago
@valcaron And most of it is missing.

@crs11becausecrs10wastaken · 11 months ago
If scanning software is actually capturing and rendering details as fine as the leaves of plants, without all of the artifacts, then that is absolutely mind-blowing.
@filipewnunes · 11 months ago
I have spent many, many hours of my life unwrapping UVs and correcting meshes to use in my archviz projects. The amount of development in this field is insane. And we are in the first days of this. What a time to be alive.

@OlliHuttunen78 · 11 months ago
My thoughts exactly. Many things are changing very fast now. Although this does not yet create anything from scratch: for these NeRFs you still need something existing, which is turned into 3D by taking pictures of the real world. Traditional modeling certainly still has its place when creating something new.

@captainflimflam · 11 months ago
I got that reference! 😉

@loleq2137 · 11 months ago
Ah, a fellow Scholar!

@MagicPlants · 11 months ago
Well said!

@nekosan01 · 10 months ago
Photogrammetry is very old; I don't know why people only know this marketing stuff and get excited when it's much worse than RealityCapture and other apps, which don't require an expensive video card. You can also import into sculpting software to fix the mesh and project UVs very easily, unlike this garbage.
@4Gehe2 · 11 months ago
OK, I did a quick reading of the paper. This is a clever thing, but keep in mind that it doesn't preserve details so much as make them up. (The explanation is in sections 5.1 and 5.2 and figs. 2, 3, 4 of the paper.) Basically you reconstruct the environment not by analysis but by statistically informed guesses. You then analyze whether each guess was too big or too small by referencing the result against the original data. If the guess was too small, you duplicate it near the point; if the guess was too big, you divide it in two. So if you need to estimate a curve, instead of actually solving the curve you keep guessing its shape, and the process of duplicating and dividing the guesses makes you approach the solution faster. It is important to keep in mind that you don't actually get THE SOLUTION; you get an approximation of the solution based on guesses. It's basically the way you can do square roots and cube roots in your head to 2-3 decimals, by estimating upper and lower bounds and iterating. (For those who don't know: to estimate the square root of 6, you can calculate in your head that 2x2 is 4 and 3x3 is 9, so the solution is between those; then 2.5x2.5 gives 6.25, which is too much, so the solution must be less than that; 2.25x2.25 gives 5.0625... and so forth. You will never practically reach 2.449489743 because we only go to 3 decimals, but let's be honest, a 0.04% error is more than enough.) To simplify a bit: imagine you are sculpting with clay and want to replicate a shape, but you can only add or remove material instead of shaping it with your hands. If you have too much material, you cut away half of the known excess; if you added too little clay, you add another same-sized lump.
You keep repeating this until you get a close enough approximation of the thing you are replicating. What is important to keep in mind is the limitations of this. You can't replicate things accurately, for the simple reason that if you lack information on details, you can't just guess them! Your data resolution doesn't increase; you only actually know the data points that you gathered. So for historical, scientific, or engineering purposes you will not be able to get any extra information (and I hope people realize this before they try to use details from this in a court of law or something). You really can't know anything more from this than you can get from just looking at the frames as pictures.
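The bracket-and-iterate method the comment describes can be sketched directly. This is a toy illustration of the commenter's square-root analogy (function name and tolerance are made up), not the paper's actual densification code:

```python
def approx_sqrt(x, tol=1e-3):
    """Estimate sqrt(x) by bracketing: keep an upper and lower bound,
    test the midpoint, and shrink whichever side the guess fails on."""
    lo, hi = 0.0, max(1.0, x)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mid * mid < x:   # guess too small -> raise the lower bound
            lo = mid
        else:               # guess too big -> lower the upper bound
            hi = mid
    return (lo + hi) / 2
```

For example, `approx_sqrt(6)` lands within a few thousandths of 2.449..., illustrating the point: the iteration converges quickly to an approximation, but never to the exact value.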
@JimmyNuisance · 11 months ago
I fell in love with splat engines when I spent time in Dreams on the PSVR. It's fantastic for creatives; it makes it very easy to make new and unseen surfaces.

@kazioo2 · 9 months ago
That renderer went through so many changes and iterations (also after they published explanations) that I'm not really sure how much of typical splatting is still used there. There is a lot of conflicting information about it.

@wozniakowski1217 · 11 months ago
I feel like those galaxy-like ellipses with feathered edges are THE new polygons, and soon this rendering method will replace them, especially in the gaming industry. What a time to be alive.

@spyral00 · 11 months ago
That depends. Can they support animation, rigging, fluids, etc.? Voxels are great, but they still aren't the norm... Maybe it's just another great tool on the shelf.

@bricaaron3978 · 11 months ago
I would say the gaming industry would be the last area to use this method. It looks like this is a method of rendering; it has nothing to do with the generation and manipulation of 3D data.

@spyral00 · 11 months ago
@bricaaron3978 True... I would love to see a game made with NeRF point clouds rendered with this, though.

@bricaaron3978 · 11 months ago
@spyral00 Can you, in a few sentences, explain why NeRF point clouds are different from any other point cloud, so that I don't have to research it, lol?

@spyral00 · 11 months ago
@bricaaron3978 NeRF is an algorithm that generates 3D models (or point clouds) from 2D photos using neural nets. Pretty amazing stuff, but quite complex and not yet widely used. This technique seems to be just a way to display the results in a nice way, if I understand correctly. In theory one could make game environments using photos + NeRF as input and this to render them; I'm pretty sure it'd look amazing.

@TheCebulon · 11 months ago
All this time, I thought I was watching videos and wondered what was 3D about them. 🤣 Then it hit me: these ARE 3D renders. Absolutely stunning.

@Ironside451 · 11 months ago
Reminds me of that moment in Star Trek Into Darkness when they are looking at security footage and are able to move around inside the footage, just like this.

@TorQueMoD · 11 months ago
Great video! The RTX 3070 has 8GB of VRAM though, not 4. I'm super excited to see where NeRF will take us in another 5 years! It's a boon for indie developers who don't have the time or budget to create high-quality assets.
@stash. · 11 months ago
It varies; I have the 6GB 3070 model. Edit: turns out I had the 8GB version, not the 6GB as I mentioned earlier.

@GrandHighGamer · 11 months ago
@stash. 4GB would still be incredibly low (and 8GB is already pitiful for a card that cost around $800), to the point where it wouldn't make sense for it to exist at all. At that point a 3060 would be cheaper and potentially have 4x the memory. I'd imagine this was just a mistake.

@esaedvik · 10 months ago
@GrandHighGamer 8GB is perfectly fine for the use cases of 1080-1440p gaming.

@Oho_o · 11 months ago
Those Gaussian splats look like galaxies in space at 2:07... ;O

@patjackmanesq · 11 months ago
2.7k subs is a ridiculously low number for such quality videos! Great work, brother.

@o0oo888oo0o · 11 months ago
Great, the best videos about this NeRF niche that I have found so far. Keep it up!

@linuxmill · 11 months ago
Gaussian splatting has been around for many years. I used it in the late 90s. It's a method of generating implicit functions, which can then be contoured.

@MatteoMi · 11 months ago
I'm not a specialist, but I suppose this is similar to VR, which has also been around since the 80s, but the tech wasn't mature enough. I mean, maybe.

@EmileChill · 11 months ago
@linuxmill I used Autodesk 123D Catch, which isn't available anymore. I believe it was the same kind of technique, but I'm not 100% sure.

@danielalorbi · 11 months ago
Yup, the new thing here is using it to render radiance fields in real time.

@EmileChill · 11 months ago
@danielalorbi That's incredible!

@stephanedubedat5538 · 10 months ago
While the technique is not new, its application to NeRFs is.
@stash. · 11 months ago
Bringing old family photos to life will be a huge market boom.

@thenerfguru · 11 months ago
Thanks for the shout-out! You can now view the scene in the Nerfstudio viewer, which unlocked smooth animation renders.

@OlliHuttunen78 · 11 months ago
Yes, I just noticed your new video about it. Have to try it. Thanks, Jonathan!

@The-Filter · 11 months ago
Man, thank you for this video! That stuff is really next-gen! Wow! And top-notch presentation! Very relaxing and informative!

@drekenproductions · 11 months ago
Thanks for linking to the NeRF guru. It could come in handy some day if I decide to try this!

@romanograsnick · 10 months ago
Astonishing achievements have been made, which is great! I hope this leads set builders to make more models which can be scanned and recreated in 3D space, keeping those sculpting jobs relevant. Thanks!

@vadimkozlov3228 · 8 months ago
Fantastic and very professional YouTube channel. I appreciate your work.

@8eck · 11 months ago
I remember when I first tried NeRF. Since then, they have evolved to insane quality!

@pan6593 · 10 months ago
Great summary, insight and practical example - thanks!

@HandleBar3D · 11 months ago
This is gonna be huge in real estate, once it's a streamlined app on both ends.

@3dvolution · 11 months ago
It's getting better and better. That's an impressive method, thanks for sharing ;)

@ChronoWrinkle · 11 months ago
Hot damn, it should be possible to extract depth, normals, and glossiness from such a capture. This is insane!

@LaVerite-Gaming · 11 months ago
It's beautiful that the first image I ever saw rendered this way is a Captain Haddock figurine ❤

@BlenderDaily · 11 months ago
So exciting! Thanks for the explanation :)

@chosenideahandle · 11 months ago
Terve Olli! Another Finn with an awesome YouTube channel (I'm not including myself 😁)! Thanks for keeping us up to date on this cutting-edge stuff.
@michaelvicente5365 · 11 months ago
Ohhh, thanks for explaining. I saw a couple of things on Twitter and was wondering what this Gaussian splatting was about!

@MartinNebelong · 10 months ago
Great overview, and certainly exciting times! 😊

@EBDeveloper · 11 months ago
Glad I found your channel ;) Nice to meet you, Olli.

@MonsterJuiced · 11 months ago
This is fascinating! I hope there's going to be some kind of support for Blender/Unreal/Unity soon. I would love to play with this.

@Jackpadgett-gh8ht · 10 months ago
There is support for it! Volinga AI, search it up.

@MommysGoodPuppy · 10 months ago
Yesss, I can't wait for this to be utilized in VR. I assume we could render absolutely insane detail in real time, for simulating reality or having big-budget CGI movie visuals in games.

@TheABSRDST · 11 months ago
I'm convinced that this is how our vision works IRL.

@Neura1net · 11 months ago
Very cool. Thank you.

@GraveUypo · 10 months ago
These are so good that you could probably use screenshots of these models to make 3D models with old photogrammetry software.

@damsen978 · 10 months ago
This is literally what will follow photographs and images in general: you will be able to see captured moments of your family and friends in full 3D. Now we need a device that would capture these automatically at the click of a button.

@DailyFrankPeter · 11 months ago
All we need now is a scanner in every phone for taking those selfie point clouds, and we'll be in the world of tomorrow.

@marco1941 · 11 months ago
Wow, now we'll see really interesting developments in video game production, and of course in the results.

@domovoi_0 · 11 months ago
Incredible. Love and blessings!

@XRCADIA · 11 months ago
Great video, man. Thanks for sharing.

@eekseye666 · 11 months ago
Oh, I love your content! I should have subscribed the last time I came across your channel. I didn't, but I'm doing it now! :)

@tristanjohn · 11 months ago
Absolutely phenomenal!
@jamesleetrigg · 11 months ago
If you watch Two Minute Papers, there's a new radiance field technique that is over 10 times as fast and better quality, so look forward to seeing this in VR/AR.

@Barnaclebeard · 11 months ago
Can't stand to watch TMP anymore. It's nothing but paid content and catchphrases. I sure would love a channel like the old TMP.

@primenumberbuster404 · 11 months ago
@Barnaclebeard For real 😢 Many of those papers are actually not even peer-reviewed.

@Barnaclebeard · 11 months ago
@primenumberbuster404 And it's exceedingly rare that there is any analysis or insight anymore beyond "imagine what it can do two papers down the road!"

@Summanis · 10 months ago
Both this video and the TMP one are on the same paper.

@malipetek · 11 months ago
Very interesting, thanks.

@lordofthe6string · 10 months ago
This is so freaking cool. I hope one day I can make a game using this tech.

@lemiureelemiur3997 · 11 months ago
Stunning!

@luketimothy · 10 months ago
Just imagine a machine that can generate point clouds around itself at a rate of 60 per second, and a technique like this that can render that point cloud at the same 60-per-second rate. Truly 3D video. It would be amazing.

@talis1063 · 11 months ago
I'm deeply uncomfortable with how fast everything is moving right now. It feels like anything you touch could become obsolete in months.

@flameofthephoenix8395 · 11 months ago
Except for farming.

@ChainsawGutsFuck · 11 months ago
@flameofthephoenix8395 Or water. Or oxygen. Or physical existence.

@flameofthephoenix8395 · 11 months ago
@ChainsawGutsFuck I figured he was talking about careers.

@Sc0pee · 10 months ago
If you mean traditional 3D modelling for gaming/movies or 3D printing, then no, at least not for the foreseeable future, because this technique doesn't produce mesh models, which are a requirement in games and movies for dynamic lighting, animation, surfacing, interactivity, etc. And it also requires you to have the object you want in real life to work with.

@FredBarbarossa · 11 months ago
Looks really impressive.
@ponadchmurami8008 · 11 months ago
Amazing, thanks man for this video.

@MotMovie · 10 months ago
Good stuff, mate. Very interesting indeed, and great to see such an in-depth look into things with self-made examples. As a side note, the music is a bit big for this. I mean, it's not a cure for cancer (just yet), so perhaps go a bit easier on the "Life will win again, there will be a beautiful tomorrow" soundtrack :p Anyhow, cheers, will be back for more.

@Dartheomus · 11 months ago
My mom walked into the room and asked what the hell I was doing. I told her to just relax. I'm Gaussian splatting.

@jimmyf2618 · 11 months ago
This reminds me of the old "Unlimited Detail" video promising infinite rendering.

@SerigioKun · 11 months ago
It's impressive how these techniques are advancing. In the future, being able to apply this technology to solving crimes or accidents, with just a video of the event, is going to be great.

@afti03 · 10 months ago
Fascinating! Could you make a video on the most relevant use cases for this type of technology?

@bradleypout1820 · 11 months ago
Good video; you got a new sub!

@JeremyDWilliamsOfficial · 11 months ago
Nice work! Well done :)

@JeremyDWilliamsOfficial · 11 months ago
Subbed. But to be honest, you really didn't explain much about it. Perhaps make another video that explores the math and techniques, or even touches on your install and implementation process.

@Inception1338 · 10 months ago
One more time for Gauss to show the world who is the king of mathematics.

@ralfbierig · 11 months ago
Interesting, and seriously promising for VR applications!

@0ooTheMAXXoo0 · 11 months ago
Yes, apparently used in Dreams on PS4 and PSVR.

@Datdus92 · 11 months ago
You could walk through your memories in VR!

@fontende · 11 months ago
Technology from the Minority Report movie, shown 20 years ago; that's how long it takes to make.
@Eddygeek18 · 11 months ago
The next step is getting it working with animations and physics, and then you have a new game rendering method. I have always felt mesh rendering is limited and have been waiting for a new method such as this. I hope it's the one this time, since there have been quite a few duds in the past.

@0ooTheMAXXoo0 · 11 months ago
Apparently Dreams (2020) on PS4 uses this technique.

@Tattlebot · 11 months ago
Games consistently refuse to use new technologies, because teams don't have faith in leadership and don't have the skills. Games are getting less featureful and interactive. Talented writers are negligible. The result is an oversupply of unsophisticated chew toys. No incentive to upgrade from 5700 XT-type cards.

@catsnorkel · 11 months ago
Until this method can produce poly models that can properly fit into a pipeline, I really don't see it being widely used in either the games or film industries, but I can see it being used a lot in archviz, for example.

@Eddygeek18 · 11 months ago
@catsnorkel I know what you mean: GPUs are designed for polygons, and engines have very specific mechanisms for them. But I don't think it would take too much to modify existing software to use the GPU efficiently for this technology. Both use techniques the hardware is capable of, so with investment I don't think it would take Unity or Unreal much more time to integrate this tech into their engines compared with poly-based rendering pipelines. Since it uses a scattered-field type of rendering, it shouldn't be much different.

@catsnorkel · 11 months ago
@Eddygeek18 Thing is, this technique does not support dynamic lighting, and isn't even built in a way that could be modified to support it. Same with animation, surfacing, interactivity, etc. It is a really cool idea to render directly from point cloud data like this, skipping most of the render pipeline; however, the parts that are skipped over are **where the game happens**.

@tonygardner4077 · 11 months ago
Liked and subscribed... hi from New Zealand.

@renko9067 · 10 months ago
This is basically how the actual visual field works. Overlays of sensations, sounds, and smells complete the illusion of subject/object. It is the zero-dimension quantum wave field. The scene 'moves' in relation to the 'eyes' of an apparent subject.
@striangle · 11 months ago
Absolutely amazing technology! Super excited to see where the future takes us. Thanks for sharing! Side question: what is the music track on this video?

@MarinusMakesStuff · 11 months ago
Awesome!!! Though, for me, all that matters is getting a correct mesh, and I couldn't care less about textures personally. I hope mesh generation will soon also make leaps like this :)

@joonglegamer9898 · 11 months ago
Yeah, you're spot on. This is not new, though there might be new elements to it, which is great. But I won't bat an eye until they come up with a perfect, easy-to-seam, seamless UV-mapping model. We still have to make our models animatable, relying on low poly to get the most out of the CPU/GPU in any setup. So yeah, until then we can keep dreaming; it hasn't happened in 40+ years.

@Felenari · 11 months ago
Good watch. Subscribe earned. Haddock is one of my faves.

@MaxSMoke777 · 11 months ago
It's a cute way to make use of point clouds. I'm certain it'll be handy for MRIs and CT scans, but it's nowhere near as useful as an actual 3D model. You couldn't use it for video game models or 3D printing. It could be extremely useful for real-time point-cloud video conferencing, since it's so fast.

@catsnorkel · 11 months ago
Agreed. It will probably find a few niche use cases for certain effects that are layered on top of a traditional poly-based render pipeline, but it's not going to completely take over, probably ever. This is a technology developed for visualisation, and not really suitable for games or film.

@f1pitpass · 11 months ago
Great to see.

@DavidKohout · 11 months ago
This really makes me feel like I'm living in the future.

@DavidKohout · 11 months ago
This just confirms that we're living in the best of times, from the start of phone technology to this.

@endrevarga5111 · 10 months ago
Idea!
1. Make a low-poly 3D scene in Blender. It's a 3D skeleton. Use colors as object IDs.
2. Using a real-time, fast OpenGL engine, quick-render some hundreds of images, placing the camera at different locations as if photographing a real scene for 3DGS creation. Distributing the cameras should be easy using Geometry Nodes.
3. Using these images, use Runway ML or ControlNet etc. to re-skin them according to a prompt. If possible, use one image to ensure consistency.
4. Feed the re-skinned images to the 3DGS creation process to create a 3DGS scene.
Et voilà, a 3D AI-generated virtual reality is converted to 3DGS.

@angelavolkov1126 · 10 months ago
Very cool.

@IndyStry · 11 months ago
This is awesome. Is there a way to export this to an estimated polygonal model for use in 3D software?

@mankit.mp4 · 10 months ago
Hi Olli, great video, and thanks for the intro to such fascinating tech. What's your opinion on whether an Insta360 or a full-frame camera with a fisheye lens will provide a better result or workflow?

@OlliHuttunen78 · 10 months ago
Well, in the process where COLMAP is used to generate the point cloud, it doesn't like any kind of fisheye lens or rounded distortion in the images. The best way to train the model is to use source images from which all distortion has been removed. I'm not sure how Luma AI's new Interactive Scenes handle the material; it seems they can take in all sorts of wide-angle video or 360 footage. I recommend trying it: lumalabs.ai/interactive-scenes
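Since the reply above notes that the reconstruction step prefers distortion-free input, one common pre-step is to undo radial lens distortion before feeding images in. A minimal sketch of inverting a one-parameter radial model with NumPy; the coefficient `k1` and the intrinsics here are hypothetical, and a real pipeline would use calibrated values (e.g. from OpenCV's calibration tools):

```python
import numpy as np

def undistort_points(pts, k1, fx, fy, cx, cy, iters=5):
    """Invert the simple radial model x_d = x_u * (1 + k1 * r^2)
    by fixed-point iteration. pts is an (N, 2) array of pixel coords."""
    # pixel -> normalized camera coordinates
    x = (pts[:, 0] - cx) / fx
    y = (pts[:, 1] - cy) / fy
    xu, yu = x.copy(), y.copy()      # initial guess: distorted = undistorted
    for _ in range(iters):
        r2 = xu**2 + yu**2           # radius^2 of current undistorted guess
        xu = x / (1 + k1 * r2)
        yu = y / (1 + k1 * r2)
    # normalized -> pixel coordinates
    return np.stack([xu * fx + cx, yu * fy + cy], axis=1)
```

This only illustrates the geometry; in practice one would remap whole images (not just points) with a calibrated camera model before running the point-cloud reconstruction.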
@abhi4154 · 11 months ago
Very nice video.

@EveBatStudios · 10 months ago
I really hope this gets picked up and adopted quickly by companies that are training 3D generation on NeRFs. The biggest issue I'm seeing is resolution. I imagine this is what they were talking about coming in the next update with Imagine 3D. Fingers crossed; that would be insane.

@GeekyGami · 11 months ago
This point cloud technology is much older than 2020. It has been tried on and off for a decade at this point.

@NecroViolator · 11 months ago
I remember an Australian company making "infinite graphics" with something similar. They made games and other stuff. Can't remember the name, but it was many years ago. :(

@wolfzert · 11 months ago
Wow, great, one more reason to keep going with AI.

@joelmulder · 10 months ago
Once video games and 3D software rendering engines start to use this... oh boy, that's gonna be something else.

@liliangimenez4461 · 11 months ago
How big are the files used to render the scene? Could this be used as a light-field video format?

@costiqueR · 11 months ago
I tell you this: it is a game changer for the industry...

@catsnorkel · 11 months ago
Depends on the industry, though. Archviz, yes, absolutely. For games and film it will only have a minor impact, since it isn't really geared towards those use cases.

@metatechnocrat · 10 months ago
Well, one thing it'll be useful for is helping me examine images for clues to hunt down replicants.

@taureanwooley · 11 months ago
Perforated disc layering at one point, with Bézier curve translations and HDR data mining...

@imsethtwo · 11 months ago
The solution to the floating artifacts would be to just make procedural volumetric fog and use it to your advantage 😎

@triplea657aaa · 11 months ago
Gauss strikes again!
@helper_bot · 11 months ago
Exciting news!

@aksi221 · 11 months ago
Wow, this is awesome! Is it possible to get 3D objects out of this? And if so, how good are the results?

@santitabnavascues8673 · 11 months ago
I would say the 3D models out of this wouldn't be much better than those obtained by other 3D scanning methods.

@catsnorkel · 11 months ago
This technology isn't really intended for that type of use. It isn't designed to fit into a film or game pipeline with poly models. You might see certain effects here and there using it, but generally it is going to be used more for visualisation.

@GuywithThoughts · 11 months ago
Perhaps it can't be used this way, but I'm really hoping for similar technology to dramatically improve the speed and accuracy of camera tracking. It would be amazing to just take a video recorded on a phone and get back the 3D camera track data and a point cloud of the environment.

@MilesBellas · 11 months ago
The entire VFX industry is under massive disruptive growth that now prioritizes INDIVIDUALS... a huge paradigm shift.

@lolmao500 · 11 months ago
Apparently the next gen of graphics cards will all have a neural network chip.

@icegiant1000 · 11 months ago
How long before micro drones are just buzzing up and down our bike paths, sidewalks, streets and so on, grabbing HQ images, beaming them to the cloud, and by the end of the day you can do a virtual walkthrough of the local fair, or the car dealership, or a garage sale on the other side of town, or the crowd at a football game? The only thing stopping us is CPU power and storage, and that is getting solved fast. Exciting times! P.S. How long before people stay home, just send out their micro drones, and view everything in VR at home? A lot safer than getting mugged.

@Moshugaani · 10 months ago
I wonder if the high demand for VRAM could be circumvented by using some other memory to compensate, like normal RAM or part of your SSD?

@MagicPlants · 11 months ago
Thanks for showing this! I am also studying various technologies with success. We should collab sometime!

@manzell · 11 months ago
Nerfies! What a time to be alive!

@Misthema · 11 months ago
Unlimited Detail did this before it was cool. It also didn't require a high-end GPU or that much power; it ran fast with software rendering!

@georg240p · 11 months ago
Wasn't the Euclideon thing just regular point clouds? Those can't capture any view-dependent effects like reflections.

@Misthema · 11 months ago
@georg240p Probably, yeah. I just meant that this tech existed way before it was adapted in any way.

@thilakkumar008 · 11 months ago
Fantastic! I need your suggestion.

@foxy2348 · 11 months ago
Amazing. How is this rendered? In what program?

@ziomalZparafii · 11 months ago
Closer and closer to the Esper from Blade Runner.