
NVIDIA’s AI Learned On 40,000,000,000 Materials! 

Two Minute Papers
1.6M subscribers
247K views

❤️ Check out Lambda here and sign up for their GPU Cloud: lambdalabs.com/papers
📝 The paper "Real-Time Neural Appearance Models" is available here:
research.nvidia.com/labs/rtr/...
📝 My PhD thesis "Photorealistic Material Learning and Synthesis" is available here:
users.cg.tuwien.ac.at/zsolnai...
My latest paper on simulations that look almost like reality is available for free here:
rdcu.be/cWPfD
Or here is the original Nature Physics link with clickable citations:
www.nature.com/articles/s4156...
🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
Aleksandr Mashrabov, Alex Balfanz, Alex Haro, Andrew Melnychuk, Benji Rabhan, Bret Brizzee, Bryan Learn, B Shang, Christian Ahlin, Gaston Ingaramo, Geronimo Moralez, Gordon Child, Jace O'Brien, Jack Lukic, John Le, Kenneth Davis, Klaus Busse, Kyle Davis, Lukas Biewald, Martin, Matthew Valle, Michael Albrecht, Michael Tedder, Nikhil Velpanur, Owen Campbell-Moore, Owen Skarpness, Rajarshi Nigam, Ramsey Elbasheer, Richard Sundvall, Steef, Taras Bobrovytsky, Ted Johnson, Thomas Krcmar, Timothy Sum Hon Mun, Torsten Reil, Tybie Fitzhugh, Ueli Gallizzi.
If you wish to appear here or pick up other perks, click here: / twominutepapers
Thumbnail background design: Felícia Zsolnai-Fehér - felicia.hu
Károly Zsolnai-Fehér's research works: cg.tuwien.ac.at/~zsolnai/
#nvidia

Science

Published: 14 Oct 2023

Comments: 363
@partha2806 · 9 months ago
Our real world is soon going to need a graphics update 😂
@sacredgeometry · 9 months ago
That's what VR is
@dexterpoindexter3583 · 9 months ago
Great idea! Though if that's possible, _please_ vote for an ethics, tolerance & mutual respect update first?
@AMOGUS_is_SUS696 · 9 months ago
We will need 69k glasses 😂
@player0000 · 9 months ago
Can't wait for the update 😂!
@HappyBirthdayGreetings · 9 months ago
That's the dopest comment ever! It's weird because just this morning I was thinking about how most human inventions will only be limited by our biological sensory system.
@Chris.Davies · 9 months ago
I recall that when the Human Genome Project began in 1990, it was widely mocked by many scientists, because at the rate the gene sequencers of the time operated, even with hundreds of machines, the task was going to take thousands of years. But they did not understand that the law of accelerating returns leads to exponential advances, and the project was completed in 2003, just 13 years later.
@nasrimarc7050 · 9 months ago
Very interesting example 👍👍👍
@WirIez · 9 months ago
This is insane. Will we reach a point where a VR headset + insane graphics is going to compete with how the real world looks? To the point where we can't make out the difference?
@coreblaster6809 · 9 months ago
To be honest, it's not a matter of if, but when
@SyntaxDomain · 9 months ago
I think that's scheduled for one or two papers down the line. ;-)
@ge2719 · 9 months ago
No matter how good the graphics get, you're still looking through a set of lenses strapped to your face. We'd have to get to the point where we can connect to the optic nerve and input the "data" directly into the brain before it truly tricks anyone into not knowing the difference.
@LanceThumping · 9 months ago
Papers like this make you start to wonder if our display tech can keep pace.
@jasonhemphill8525 · 9 months ago
@ge2719 That's a long way off, but optics is a field that is also making huge strides. Exciting times we are living in.
@DreckbobBratpfanne · 9 months ago
When your new system isn't just comparing to the reference material but BEATS the reference material (by up to an order of magnitude, too), you know you've got some crazy stuff on your hands. This is one of the first cases where I actually couldn't believe it's not real (others are, e.g., the floating leaves on a pond simulation, Unreal Engine's tire physics, or a game demo set in real scanned houses).
@parazels83 · 9 months ago
The next step is to make these things interactive (realistic bending, breaking, deformation, etc.).
@hadensnodgrass3472 · 9 months ago
That is actually much easier; the math to compute it, I mean. Light scattering and subsurface reflection are crazy difficult problems to solve, and this paper is a big step in the right direction. That being said, it's apples to oranges: memory is the limiting factor in physics simulations, but processing power is the limiting factor for light-based visualization (ray tracing). Raster and geometry visualization are much faster, but they lack the finer details of light and are flat by comparison.
@mmheti · 9 months ago
I would say the next step is to get it into game and 3D engines, and turn HLSL shaders into prompt-based shaders. That would be awesome!
@NeovanGoth · 9 months ago
@hadensnodgrass3472 Yes, there are games with real-time path tracing, but not a single one with, say, a _good_ fluid simulation.
@ooOPizzaHeadOoo · 9 months ago
@mmheti No bro, prompt-based shaders are not a good idea. Base materials with parameter-modifying sliders are what you want. Prompting is only good for people who have no idea how to make something, but with materials in game development you have developers working, and prompting would actually be much more cumbersome to use.
@PFnove · 9 months ago
@mmheti Soon enough GPUs won't even need RT cores anymore
@footballfusionpro2 · 9 months ago
I'm still only a teenager, so seeing technology reach this level at such a young age makes me so excited to see what'll be in store in 20-30 years
@martiddy · 9 months ago
We will most likely have virtual reality with a brain-computer interface sending signals to our brains, where everything is indistinguishable from the real world.
@adryncharn1910 · 9 months ago
Same!
@luc8254 · 9 months ago
Pray for it not to be dystopian, kid
@vectoralphaSec · 9 months ago
Don't forget that you may not live that long. Kids younger than you die every day, so we can all die at any moment. Your future of seeing this isn't promised.
@sliwilly · 9 months ago
If you're still alive
@sidekick3rida · 9 months ago
Your pause between every word or two is killing me, dude!
@IncognitoWho · 3 months ago
It's the best part of the audio XD unique!
@thedofflin · 9 months ago
Man, I can tell you are genuinely stunned by this research. It is simulating absolutely everything about the material, almost in real time. Very, very cool
@phpn99 · 9 months ago
Boy, that voiceover is starting to test my patience
@IncognitoWho · 3 months ago
I love it XDDD
@TheMostGreedyAlgorithm · 9 months ago
I'm curious how much variation in material it could handle. For example, can we change a base color without retraining the model?
@jefferson-silva · 9 months ago
There is a link to the paper in the video description.
@bclaus0 · 9 months ago
Hi Károly, great video as always. I was wondering if you could go into more depth on how artists can make use of this technology. The demo is great, but what exactly is involved in making that demo? Currently, plugging some textures into a physically based shader and tweaking some values is the norm for making a material in any render engine. But how do materials like these work exactly?
@jhwblender · 6 months ago
Thank you for showing some snippets of the paper and the graphics from under the hood! I've really missed that.
@isalutfi · 9 months ago
Great! Thank you for sharing this incredible graphic!
@armartin0003 · 9 months ago
I wonder how many lifetimes it would take me to look at that many materials, let alone comprehend them and use them as reference for my work.
@FritzSchober · 9 months ago
I was a teen when computer graphics looked like a game of Tetris. This is mind-blowing.
@Krackatoa · 9 months ago
When I was going to animation school in 2006, my teacher said real-time ray tracing wasn't going to happen without some kind of technical marvel. This is an order of magnitude beyond that. If you had told me this would happen so soon, I wouldn't have believed you for a second.
@Nexariuss · 9 months ago
Is this channel also available with a normal voice?
@touristtam · 9 months ago
No, this is what is to be expected, unfortunately. I almost wish he would just use an auto-generated voice without the off-tempo pauses...
@Randalandradenunes · 9 months ago
Just amazing how much progress we are seeing these days.
@alexl266 · 9 months ago
Almost lost my paper on this one... amazing!
@zohanthegreat7391 · 9 months ago
OK, this actually looks like real life. I literally don't know what to tell you
@EmergentStardust · 9 months ago
Unbelievable! This feels eons ahead of just a few years ago
@leirex_1 · 9 months ago
This looks absolutely jaw-dropping!
@demonsynth · 9 months ago
Imagine walking through your favorite places, but looking at them through AR/XR lenses with filters utilizing NNs based on this technique. A sharper, clearer, less noisy reality. So many variations. Thermal overlay? Magnetic flux?
@stephenrodwell · 9 months ago
Excellent content! 🙏🏼
@lowmagnet · 9 months ago
Love the ACM logo on the knife :)
@morn1415 · 9 months ago
Holy moly! That's the Utah Teapot taken to the next level...
@PereVS23 · 9 months ago
Wow, the real-time results are just crazy. I really hope in a few years they can get it to look 100% clean in real time. Imagine that level of detail in video games 🤤🤤🤤🤤
@mememanfresh · 9 months ago
You mean in a few weeks? All it needs is denoising, like ray tracing already has
@Odz86 · 9 months ago
The more of your videos I see, the more I'm convinced we live in a simulation :D
@DontYouPick · 9 months ago
What I didn't understand is whether these materials use the same old hand-crafted textures or whether there is some generation in them too
@SicrosEye · 9 months ago
I am a 3D artist and I am a bit bummed he left that explanation out.
@SimonMenke · 9 months ago
The cadence of your voice seems to be off. Did you use ML to provide the voice-over?
@hidgik · 9 months ago
Two Minute Papers recommendations showed up on my YouTube 2 minutes ago. I think that asteroid will hit the Earth now.
@jeraldehlert7903 · 9 months ago
That means in another 5-10 years at most, a consumer-grade GPU will be able to render this in real time. I can't wait.
@touristtam · 9 months ago
Not if ML/DL is taking precedence over RT.
@Lell19862010 · 9 months ago
What does it use to determine what has to be reflected?
@mrlightwriter · 9 months ago
I can't wait for these techniques to be added to Blender!
@himan12345678 · 9 months ago
It'll keep with the trend of only supporting Nvidia at the expense of all other hardware support...
@mrlightwriter · 9 months ago
@himan12345678 Well, to be honest, the Blender Foundation is beginning to support path tracing on AMD graphics cards, and on Intel graphics cards as well.
@Slav4o911 · 9 months ago
@himan12345678 Other hardware does not have tensor cores. Only Nvidia is working seriously on AI; the others have dropped the ball for some reason, or just can't compete. Once your company has AI hardware to design chips for your company, in theory it gets harder and harder for everyone else to compete, even if the others are not far behind... but in this case the others are far behind. I have a suspicion Nvidia already uses their own AI to design some parts of their chips, or at least to aid their designers. At the moment Nvidia is becoming unstoppable; they already have the most advanced AI chips (more advanced than even what Google has).
@pandoraeeris7860 · 9 months ago
What a time to be alive aaand two more papers down the line aaand hold on to your papers aaand go little AI!
@VenkatKamavaram · 9 months ago
Dear Doc, what are the real-world applications for this kind of simulation, apart from creating virtual worlds?
@ChaosResearchParty · 9 months ago
Could this be utilized for creating procedural maps for materials in video games? Are these shaders compatible with standard PBR shaders from video game engines?
@Slav4o911 · 9 months ago
It's probably for creating textures for a given object without using photographs and other textures. For example, with Stable Diffusion you can make photos without being a photographer and without having or needing a real camera. You just pick the desired material, train it for this object, and then you have the right texture for that object. Something like a "procedural" texture, but better, because the primary model was trained on real materials. In other words, why create a new texture every time for every game, when there could be a super-powerful AI that has learned all of the possible textures, and then you can use your GPU to train a texture for your specific object (with a text prompt, adding dust and similar things)? I think that's how this model works.
@mm-rj3vo · 9 months ago
I can't wait for a VR experience that incorporates graphics that are impossible to discern from reality
@jareddias7932 · 6 months ago
What if this is it... 😉
@DFrizzle · 9 months ago
Your cadence gets more powerful with every paper.
@Uhfgood · 9 months ago
This is cool; given enough samples, they can generate reality in realtime ;-) -- I still want to see someone using AI to restore old films. I'm talking 30's cowboy B-movies, which, after much travelling, often get scratched, torn, and respliced; then, to make matters worse, these old films are transferred to video at poor quality. AI could really fix these to make them look like they did when they were new, probably even replace missing footage. I've seen some AI upscalers, and they end up just looking weird, with faces all distorted and such. If you can take real-world lighting from a video and apply it to video game imagery to make it look realistic, you could certainly take textures, lighting, and actual actors' faces from a multitude of video sources to generate a clean new image.
@ArIyan_yt · 9 months ago
We will really need a dynamic range and color correction update in the real world rn
@laz001 · 9 months ago
OK - this is the most impressive paper I've seen in a long time... wowwowowow
@eladwarshawsky7587 · 9 months ago
On my way to read the paper. I hope I can contribute later on, wish me luck
@vandel_ · 9 months ago
0:36 I had to turn up my video quality to see the fingerprints
@lamhkak47 · 9 months ago
Those fingerprints and smudges are going to be visually impressive and annoying simultaneously, until the visual impression slowly fades off; then it'd be just a nuisance
@jeffg4686 · 9 months ago
I had a thought last week that we'll just "talk the games" - no code at all, pretty soon. The "IDE" will simply be an LLM prompt-type tool where you input desired functionality and characteristics, and get 10 different options of games to choose from. You choose the one you like, then it gives you 10 more options similar to that one. You could iterate even more like that, and then do some adjustments with more chat-type prompting. Zero code. Zero DCC work.
@jeffg4686 · 9 months ago
@psyker4321 - prepare for the day. It's gonna mean a better day for all. Much less work all around.
@zeekjones1 · 9 months ago
The real-time noise just looks like an old digital camera, rather than CG.
@JuliusUnique · 9 months ago
what a time to be alive!
@gentleandkind · 9 months ago
Never once asked to squeeze my paper tight. Károly Zsolnai-Fehér has changed...
@VR_and_Non_VR_Gameplays · 9 months ago
Imagine a teapot simulator with such graphics. True next-gen.
@BryanBortz · 9 months ago
Why is this considered compression when other models are not?
@carvalhoribeiro · 9 months ago
Thanks for sharing this
@berkeokur99 · 9 months ago
I wonder when we can run this kind of simulation in real time on our phones
@swordofkings128 · 9 months ago
These scenes aren't even much of a challenge for a powerful GPU. It'd be nice if they showed a real stress test, like a forest with thousands of really detailed trees, or materials with lots of SSS or transmission. Not saying this isn't cool, but how does this method scale?
@LabGecko · 9 months ago
Things that look something like that are easy, sure; even if they take days to pre-render, it doesn't matter to the gamer playing. Things that look _exactly_ like that, though? Nothing on a GPU does that in games yet. Read the paper: the physics are far too much to process at 60 fps on mass-market GPUs, but because of research like this, that won't be true for long.
@ginogarcia8730 · 9 months ago
Man, when will we get a program where you easily prompt a whole movie and it spits out a created 3D movie, haha
@dprezzz1561 · 9 months ago
Great video. It's amazing how neural networks are shaping information technology nowadays. Of course, we only hear the buzzword AI, but deep down, neural nets are responsible for a lot of it. Thank you, Károly! WT2BA!!!
@AricRastley · 9 months ago
Dude, the Matrix is going to be rad
@ProjectOniricDEV · 9 months ago
Well, so is that something that will be present in, like, Unreal Engine 6? I mean, how long until we see games starting to use these techniques?
@TheStabbedGaiusJuliusCaesar · 9 months ago
Is it possible to get our hands on this AI software and use it on old VHS videos? Would be fucking epic if so.
@niki9881 · 9 months ago
My eyeballs need this upgrade
@IvanSchoeman · 9 months ago
I don't know what exactly is going on here. Does this method reproject textures with shading baked in for every frame, instead of ray tracing the scene?
@commentatorboy · 9 months ago
What a time to be alive!!!
@shayneweyker · 9 months ago
Curious as to what frame rate the new method could handle on consumer graphics cards while producing images as good as the final image seen a bit after movement stops.
@edwardcieplehowicz9712 · 9 months ago
We are going to need a totem like in Inception
@exoticspeedefy7916 · 9 months ago
Would be nice in real-time gaming. It would probably need a huge amount of VRAM, though
@memorabiliatemporarium2747 · 9 months ago
What about the objects' interiors? You know, the collision simulations, the physics?
@icefire5799 · 9 months ago
We're gonna cross the uncanny valley with this one, guys
@globalvillage423 · 9 months ago
It really looks amazing.
@lloydfromfar · 9 months ago
Amazing progress every day! :O
@Andytlp · 9 months ago
The first game developers to use these will be making fat stacks
@hd-be7di · 9 months ago
Neural BRDF... omg. It doesn't even have to simulate anything... just give me the shader parameters and apply it as a material.
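A note on the term: a neural BRDF, as used in work like this, replaces a hand-authored shader with a small neural network that maps shading inputs (view and light directions, plus a learned per-texel latent code) directly to RGB reflectance. Below is a minimal PyTorch sketch of that idea; the layer sizes, the 32-dimensional latent code, and all names are illustrative assumptions, not the architecture from the NVIDIA paper.

```python
# Minimal sketch of a "neural BRDF": a tiny MLP standing in for a shader graph.
# All sizes and names are illustrative assumptions, not the paper's architecture.
import torch
import torch.nn as nn

class NeuralBRDF(nn.Module):
    def __init__(self, latent_dim=32, hidden=64):
        super().__init__()
        # Input: per-texel latent code + view and light directions (3 floats each)
        self.net = nn.Sequential(
            nn.Linear(latent_dim + 6, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 3),  # RGB reflectance
            nn.Softplus(),         # reflectance must be non-negative
        )

    def forward(self, latent, wi, wo):
        # latent: (N, latent_dim) codes looked up from a texture of latents;
        # wi, wo: (N, 3) unit light/view directions in the local shading frame
        return self.net(torch.cat([latent, wi, wo], dim=-1))

# One batched query shades many texels at once, which is part of what makes
# this kind of model friendly to GPU tensor cores.
brdf = NeuralBRDF()
n = 4096
latent = torch.randn(n, 32)
wi = torch.nn.functional.normalize(torch.randn(n, 3), dim=-1)
wo = torch.nn.functional.normalize(torch.randn(n, 3), dim=-1)
rgb = brdf(latent, wi, wo)  # (4096, 3) reflectance values
print(rgb.shape)
```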
@testsubject318no6 · 9 months ago
That is the most awesome cheese slicer ever
@chaserivera1623 · 9 months ago
Nvidia's AI research is happening at incredible speed. I can see a future where they implement all of this into their GPUs so we can restore old, outdated games instead of relying on mods or remasters.
@vectoralphaSec · 9 months ago
Hopefully Nvidia uses all this research and development for Nintendo's next-generation Switch 2.
@NeoKailthas · 9 months ago
Amazing. It's happening!! It's happening!!!
@korenn9381 · 9 months ago
You mean to say the teapot was not left to soak in beer overnight?
@ivoetzelt · 9 months ago
Excuse my question, but how is this different from traditional rendering, as in Blender Cycles for example? I can create models + materials with the shown characteristics and they will render pretty much exactly as shown here in the "mind-blowing real-time view". In some cases it's even possible to achieve the same visual results in Unreal Engine Lumen, which truly is real time. Refraction is still bad in Lumen, but the teapot would be entirely possible in Lumen, not even to speak of the cheese grater, which in my opinion looks super easy to render. Am I missing something here? Not trying to hate, just honestly wondering 🤔
@eddyrose3254 · 9 months ago
These have the capability to be rendered in real time, meaning you could render millions of these before you render a Blender one
@MonsterCreations662 · 9 months ago
This is amazing; that's how well Nvidia is doing!
@ryanchenoweth5673 · 9 months ago
What a fucking time to be alive!! 👏👏👏👏
@oisiaa · 9 months ago
Can't wait for video games in 2030!
@Dimencia · 9 months ago
When you showed it vs. the reference, I thought, aha, there are its limitations... until I actually read the labels for which one was the reference. This is crazy. But even the reference simulation is amazing; I haven't seen anything that detailed before in computer graphics. I guess that's because it's too expensive and takes too long to do, so far.
@RandyJames22 · 9 months ago
5:05 Where's the pinch button??
@dguisinger · 9 months ago
OK... but how do you catalog and generate training data for 40 billion materials in the first place? That number is unfathomable to me
@getsideways7257 · 9 months ago
Paint me impressed. I would be even more impressed if that were demonstrated in HDR. Speaking of light transport while being restricted to mere SDR is just not serious...
@eclipse4419 · 9 months ago
Crazy, just crazy... wow
@Greedygoblingames · 9 months ago
At this rate, virtual worlds will be indistinguishable from reality in the next 10 years!
@Bartskol · 9 months ago
Looks like we have upgrades in software, hardware, data transfer between devices, and generative content and synthetic data; VR technology is getting cheaper and better, and LLMs are reading and interpreting brain waves. There is no other way than us having multiple lives. One more step would be increasing our brain activity and making the simulation very slow, but for us to experience it very fast, so it seems that you live 90 years but you have only played your game for a few seconds; then the next simulation plays automatically with a randomly generated world.
@zrakonthekrakon494 · 9 months ago
How exactly does time dilation in a simulation work? If time were dilated in a simulation, imagine how much more work we could get done…
@punithaiu · 9 months ago
Real-time path tracing is almost within arm's reach... as you say, what a time to be alive 🔥
@wowforeal · 9 months ago
what a time to be aliiiive
@bohorquez92 · 9 months ago
even faster and saves on gas, crazy
@firestarter4247 · 1 month ago
You're like the David Attenborough of computer graphics research.
@infographie · 9 months ago
Excellent.
@dinkledankle · 9 months ago
Lambda is maxed out 98% of the time. Good luck nabbing an instance to do anything on.
@P.Aether · 8 months ago
Realism does not equal beauty, you alien robot
@mizoik9893 · 9 months ago
So AI can now do PBR
@msidrusbA · 9 months ago
The Matrix is looking not so far off on the horizon... the fact that we made a neural network that can simulate intensely realistic 3D objects is insane. The world is moving fast, and all I can do is watch and accept the reality we are thrust into. All hail our future AI overlords.
@Slav4o911 · 9 months ago
The neural network in this case is used only for the materials, not to design the object itself.
@Auziuwu · 9 months ago
My question is how Nvidia got ahold of 40 billion materials
@iqbal_pradana · 9 months ago
This is more realistic than real life
@KkommA88 · 9 months ago
I wanna use it on MY PC, not some rented stuff. I don't care if it takes longer; results count!
@hughjassstudios9688 · 9 months ago
We need more local runtimes! It doesn't matter if it doesn't fit in today's GPU VRAM; it will certainly always fit tomorrow's.
@Deathend · 9 months ago
Oh shit. It's finally happening.
@nicholaspostlethwaite9554 · 9 months ago
This looks great, but being a research paper means ordinary potential users need more, or extra, information. How does this (or similar) actually work from a user point of view? If it will run on a 4090, then it is potentially capable of being used on a home desktop. Would I need a real object and particular, good cameras to gather the images it needs to make the textures? Does it have to make the object as well, photogrammetry-like? Is it making a texture from a real object that can then be used on any 3D model, like we think of PBR textures now? Would the software, the AI stuff, likely come as part of, in this case, a new NVidia graphics card? So we install it, then hook up a camera or recorded images, point this AI at an item and say "make the materials on this item", then use those materials on something else we may be modelling? It is great that you show us this stuff, but it is at a disconnect with hands-on use for ordinary non-scientists.
@technewseveryweek8332 · 9 months ago
He said it's not based on photos, but on ground-up simulation
@nicholaspostlethwaite9554 · 9 months ago
@technewseveryweek8332 Um, thanks, but what does that mean? Not made with procedural nodes, presumably? This ties in with the problem of converting info into user-level understanding. Where is it getting the data from which to 'invent' the materials put on the mesh object?
@Slav4o911 · 9 months ago
Nvidia's big supercomputer has made the initial materials, i.e. "the ground truth"; then you take your object, tell it you want a "blue ceramic material" on it, and your RTX 4090 basically bakes this material onto your particular 3D object (this takes a few hours). Then you use this "baked" material on your object and the AI recreates it in your game or 3D application. But it's not a normal "texture"; it's a whole material with the bump maps and other things integrated into it... and it also seems to be ray-tracing compatible. In other words, this works like some super-advanced DLSS, where you use the materials already simulated by Nvidia in your game and thus make your object more realistic, just because the initial materials were already simulated. Right now there is a difference between the ground truth from your game and from the Nvidia supercomputer, and that's why DLSS is not perfect.
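If the baking workflow described in the comment above is roughly right, the core step is distillation: sample the expensive ground-truth material from many directions and fit a small network to those samples by regression. Here is a hedged sketch under that assumption; `reference_brdf`, the hyperparameters, and every name are hypothetical stand-ins rather than NVIDIA's actual pipeline, and `model` can be a tiny MLP like the neural-BRDF sketch earlier in the thread.

```python
# Sketch of the "baking" step described above: distill a reference material
# into a small neural model by regression. `reference_brdf` is a hypothetical
# stand-in for the ground-truth simulation; everything here is an assumption,
# not NVIDIA's actual pipeline.
import torch
import torch.nn.functional as F

def random_directions(n):
    # Random unit vectors used to sample view/light directions
    return F.normalize(torch.randn(n, 3), dim=-1)

def bake(model, reference_brdf, latents, steps=10_000, batch=8192):
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(steps):
        # Pick random texels and directions, query the ground truth...
        idx = torch.randint(len(latents), (batch,))
        wi, wo = random_directions(batch), random_directions(batch)
        with torch.no_grad():
            target = reference_brdf(idx, wi, wo)  # "ground truth" RGB samples
        # ...and regress the small network onto those samples.
        pred = model(latents[idx], wi, wo)
        loss = F.mse_loss(pred, target)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model  # now cheap enough to evaluate per-pixel at render time
```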