2nd test of virtual cinematography tools for filmmakers. Shot in real-time in Unity. #VirtualProduction #Unity Assets used: Malbers 'Realistic Elephants' model & Unity demo forest environment (BOTD). Music by Lawrence Whitehead.
I started my raytracing experiments with a DOS app called “Vivid” over 30 years ago. It would take hours to render a single frame. The volume of data that must go into a short like this is mind-boggling. Very impressive. Soon, we won’t be able to tell reality from fiction. I’ll take the red pill.
Thanks Mark. Yes, the environments and elephant models here were created by some of the best 3D artists working at Unity and in games. We're just trying to frame and shoot these assets in a more filmic way with a virtual film-style camera. It's certainly a new world of possibility, and it's evolving fast! Exciting to think you'll have this level of detail in VR in a couple of years.
Cheers! And thanks for pointing out the ear clipping; it's a simple rendering error we hadn't spotted in a late-night upload. One problem with real-time is that there's less time to catch human errors. Easily fixed though :-)