Whoa, this is really impressive. I wonder if this could be used for skeletal meshes as well one day, like in animations where hands can unfortunately clip through the body, etc.
Great work! There are many applications where these stability and non-intersection guarantees would be very welcome. Any chance we will see a public GitHub repo with the code?
Our research was just published this month. It could be applied to products sooner than to games; since current engines already provide nice visuals, intersections may not be that crucial for games? Not sure what game developers and players think about it.
@@minchenli2292 VR applications would probably make more use of this kind of algorithm, since precise collisions are preferred there. Also, crunching simulation time down in an offline renderer is always a good thing.
@@minchenli2292 It will be very important for robotics simulation, such as in Isaac Sim. A lot of tasks in robotics require the robot to be able to practice virtually at very high speed and with very accurate, tiny, realistic contacts, not approximations for 'visual' effect. Congrats!
I really enjoyed this paper, but now I'm curious: will this actually come to Blender as an add-on? If so, will it be free or paid? I'm really interested in this tech.
@@minchenli2292 Wow, are you actually going to make it available? OMG, can it make it to Houdini too? If this simulator actually has a connection to reality, I'm just going to cry lmao... I've always dreamed about actually simulating things like sound, temperature, air, metals, cloth, etc.
I have a question regarding the visualization of models. Here and in other SIGGRAPH papers, when we see cloth simulation, how do you make the visualization? Is there a library for cloth, textures, and hair on which you can directly apply and test mathematical models or equations, or do you have to create the cloth each time using OpenGL and programming and then apply the models on it? Please answer this question. Also, is there a tutorial where I could learn that? I wish to try to implement SIGGRAPH papers on my own, hence the question. Thanks!
For simulation, what we mainly develop is the algorithm that predicts the geometry at different times. The rendering is usually done using software like Houdini or Blender.
Very impressive! Does it work on inside-out "eversion/retraction" deformations too, without exploding? For recreating organic effects like snail eyes, bloodworm mouths, insect molting, or soft-robot eversion, like this project: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-n5e118traDc.html
2019: "You can't just set incredible forces in a simulation and expect interesting results. The objects will just bug through the walls and everywhere." 2020: "Hold my beer!"
2:00 Damn, when I watched this I was worried the entire time that there would be geometry going through itself ^^ That is very stressful (pun intended).
Whoa, this is amazing. It would be wonderful if it could be implemented in Blender. I don't know much about code or anything, so I don't know what would or wouldn't work.