The Unreal Engine 5 cinema camera makes everything look fake. In this video I take a look at what we can do to get more photorealistic renders from Unreal Engine. Download Virtual Lenses: www.joshuamkerr.com/virtualglass
This reminds me so much of the guy who made a working camera in Blender. I love the solution: "build it as in real life" and bam, you get a realistic result, what a surprise :D
@@user-is8nn1sb1n Nope, because you simulate the bending of light rays as it happens in real life, and that affects how the elements are perceived by the camera: not just the colors, but the different "layers" like background, midground and foreground, plus you get accurate light effects when rays hit a lens. All of that is impossible to get in post (for video, at least).
Oh cool. I only saw this after I asked if this would be possible in Blender. Is this camera build something that can be purchased as an add-on or something?
@@eladbari Damn, if only we were on something almost like a search engine where you type into a search bar... such a shame. But since you apparently can't do this yourself, "Make Your Renders Unnecessarily Complicated" and "Achieving True Photorealism With Lens Simulation" are the two videos I ASSUME people are referring to. P.S. Copy and paste those titles into the magical search bar.
@@eladbari It's just a YouTube video. They saw it years ago and would have to search for it as well, just like you. It's not an add-on because it's completely impractical. Fun experiment, yes, but in the added render time you could learn all about lenses, then about post-processing, then create a realistic result that way, and still have time left.
"Hey look, I made a system to get lens aberrations!" "But we already have that, that's what all the options and sliders etc. are for?" "Yes, but I made a "virtual lens" system that has the same effects, with more steps, and it renders slower too!" "This is beneficial to me HOW exactly?"
@@pieppy6058 "incredibly fast". ... so just to get a standard 24fps video, it takes 360 seconds, that's 6 MINUTES, per second of 24fps video. 360 minutes (6 hours) to render 1 minute lol. a 10 minute video would take 60 hours, that's two and a half days ... "incredibly fast" 🤣🤣🤣
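The arithmetic in that complaint checks out; here's the same calculation as a small Python sketch. The 15-seconds-per-frame figure is inferred from the quoted 360 seconds per second of 24 fps footage, not stated anywhere officially:

```python
# Render-time arithmetic from the comment above, assuming the quoted
# figure of 15 s per frame (360 s per second of 24 fps footage).
SECONDS_PER_FRAME = 15
FPS = 24

def render_time_hours(footage_minutes: float) -> float:
    """Total render time in hours for a clip of the given length."""
    frames = footage_minutes * 60 * FPS
    return frames * SECONDS_PER_FRAME / 3600

print(render_time_hours(1))   # 1 minute of footage -> 6.0 hours
print(render_time_hours(10))  # 10 minutes -> 60.0 hours (~2.5 days)
```

The numbers match the comment: 6 hours per minute of footage, 60 hours (about two and a half days) for a 10-minute video.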
This could work for a still image, but the excessive light-ray bouncing will spike your render time and produce heavy noise, especially for animations. These tasks are usually handled in post-production for a reason. Unreal Engine's advantage is real-time rendering, but this will make it slower than traditional software and compositing.
It is an increase in render time for sure but still very possible to render animations. The big leap that will make this approach more viable is the introduction of a good denoiser, which I believe is coming in 5.5. I'm putting this together with the future in mind, not the present.
@@JoshuaMKerr It's possible, but refractive materials significantly increase render time, especially if they cover the entire camera view instead of just a part of the frame. Even small speckles in reflective objects can cause this issue. Adding a refractive object over a virtual camera will make rendering animations very challenging. It's a cool idea and technique, though!
@@cgartperson plot viability over time and eventually this approach will cross the line of trade-offs. I think what's important is that this is the correct approach for realism (simulation). All it needs now is optimization and/or time.
@@NOMONEYTV Once we have realtime raytracers with hundreds of bounces, glossy reflections, refraction calculations, etc. By the time we have the technology to do this instantly, these types of rendering systems will be outdated and probably replaced by neural rendering that fakes all of this. A denoiser isn't going to fix this approach. These pathtracing engines have existed for many years; do you think none of the PhDs who developed these programs ever thought to put a virtual glass in front of a camera? There are so many reasons why it's not done this way; doubling down on the argument and saying 'this is the future of vfx' is spreading ignorance.
@@cgartperson If you have a given frame budget, with a focus on a natural-looking render, and this fits within that, why not use it? It might cause a 10x slowdown, but that could still be viable if you want the upsides. Getting good distorted bokeh is virtually impossible with post-processing, and the process of ensuring consistency when applying lens effects in post can be a nightmare. This guarantees that when modelling a lens, you're going to get consistent distortion, vignetting, bloom and other optical effects. Full lens modelling IS the future, the process just isn't optimised yet. It's like when people were shunning raytracing in video games just because the first few RTX implementations were slow, and now Lumen is used everywhere.
This is AWESOME! I will definitely buy those, thanks for making one free to test. Anamorphic is definitely wanted as well. What is your graphics card, and does it slow down your render time?
Chromatic aberration and lens dirt effects are the first things I always remove in games settings. My eyes don't have those things. But it is surely possible to recreate eye lens effects no? Some fatigue, dryness, floating dirt in the eyes or on the surface, astigmatism..! Now that would be real 😉 Still amazing virtual camera lens you have there to transform light paths! Crazy tools for ultimate cinematics!
As a filmmaker / cinematographer, this is BRILLIANT. Such a smart, creative solution. If we're making virtual worlds, why not use a virtual camera with a truly virtual lens? Love it.
Joshua, I admire your approach and the result looks phenomenal. I have a few questions as a "conservative CGI artist" who is used to DaVinci and Fusion and all that "2D" filter stuff. Why not a Post Process Material? It might be less performance-hungry and can do the same effects. Yes, it is much harder to do properly and you need to know blueprinting or sometimes even C++. How many samples would you need for this to render without fireflies? So it all boils down to: how much is performance affected by this approach? Keep doing great stuff man!
@@JoshuaMKerr, I wish you had compared it to an image that included the post-processing that can be added within UE5. You can approximate much of what you are doing with pretty minimal overhead. Of course, the through-the-lens shot will look more real than the unfiltered 'perfect' Cinema camera. That's why the other stuff exists.
i wonder how far you could control for this with just a post-process material and render targets. like others have said, it looks like this will affect render times by a lot and create a much grainier image
That is so cool. When I did that in Modo about 10 years ago it took forever to render and I didn't even have the lens dirt and grit. It wasn't manageable for anything but still shots.
That's all great, but it just doesn't make sense. Using such glass won't help to calculate a scene in real-time in Unreal Engine, and since UE is only used for backdrops in movies, it doesn't really make sense either. In Blender, 3DS Max, Cinema 4D, Houdini, and others, there's a function to adjust the lens, as if the image passes through glass. It's not a post-effect; it's simply lens emulation, and it works without an add-on. Another issue is whether you want to simulate a specific lens or glass with unique characteristics, but in that case, you can't be sure that your material settings in a 3D program correspond 100% to the physical properties of the real object. So, again, the idea doesn't lead anywhere. What would really be useful and worth doing would be to create LUTs for old films, which could, if possible, not just reduce the colors of the final result but also emulate the chemical reactions in film, producing the "real color".
What always amuses me is how hard we strive to make the thing you're looking at look like it's shot through a lens even when it isn't. Like, games where when you're in the rain the drops are running down your eyeballs, or you get lens flares, even when the character isn't wearing glasses.
Agreed: In all seriousness, the amount of science that goes into making high end glass for cine is far beyond looking up a couple "simple" lens configs on the web or in a book for this or that focal length, creating a glass like virtual material (which is an entire OTHER ball of scientific wax), and slapping them in front of the virtual cam...Optical Engineers most of us are not...I'll take my chances using VFX programs and compositing things together for filmic pieces; and, anyway, gaming doesn't "need" such photorealistic effects, which would likely take away from the feeling of "presence" that is unique to that medium.
Excellent work, I thought about that myself every time I try and render a cinematic out of unreal. Glad you found the answer!! Definitely going to purchase this.
You can see there is a lot more noise that takes much longer to go away when using the lens, but I am sure denoising will be strong enough to remove it in the future.
Looks cool, but it's going to be hell for sampling! You'll get a much grainier image, or spend much more time rendering. The best place to do this stuff is in a projection shader (or whatever name unreal uses for that step), so the bokeh and lens roughness can be importance sampled. Also if the engine treats camera rays as special (many do), it would preserve that classification.
That's probably true. I'm more interested in the possibilities across the next two versions of Unreal: adaptive sampling in 5.4, and an excellent new denoiser coming in 5.5.
Can you model specific classic lenses, like Cooke Panchros, Zeiss super speeds, Super Baltars, and since you mentioned anamorphic, are you going with the panavisions?
Honestly, I don’t see why the lens manufacturers themselves don’t provide digital maps for their lenses that allow for this. It would greatly help in the virtual production process.
The difficulty is getting access to the lens design: the curvature of each lens element, and how many groups and air gaps there are. These all need to be modeled, and without that info you just have to make it up.
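For illustration, the lens-design data described above (per-surface curvature, spacing, and glass) is conventionally tabulated as a "lens prescription". A rough sketch of that structure; the field names and glass values here are hypothetical, not taken from any real patent or product:

```python
# Hypothetical sketch of a lens prescription: one entry per optical
# surface, giving curvature, distance to the next surface, and the
# medium behind it (ior = 1.0 means an air gap between groups).
from dataclasses import dataclass

@dataclass
class LensSurface:
    radius_mm: float     # curvature radius of the surface
    thickness_mm: float  # distance to the next surface
    ior: float           # index of refraction of the medium behind it

# A toy two-element doublet: crown-glass element, air gap, flint-glass
# element, then exit into air. Values are illustrative only.
doublet = [
    LensSurface(radius_mm=50.0,   thickness_mm=4.0, ior=1.517),  # crown glass
    LensSurface(radius_mm=-40.0,  thickness_mm=1.0, ior=1.0),    # air gap
    LensSurface(radius_mm=-45.0,  thickness_mm=3.0, ior=1.620),  # flint glass
    LensSurface(radius_mm=-120.0, thickness_mm=0.0, ior=1.0),    # exit to air
]
```

Without the manufacturer's real table of radii, thicknesses and glass types, every one of those numbers has to be guessed, which is exactly the problem the comment describes.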
Octane Render has all the things you have created just built in: custom aperture shapes, bokeh bias, all sorts of lens distortions, exponential glow, spectral dispersion glare, chromatic aberration, ACES tonemapping, etc. I have never tested Octane in UE, but since it's free, it's likely to have fewer features. If those features exist in Unreal, you will find them in the Octane camera settings and the Octane render post settings. Render settings are global; camera settings override the global ones individually. But anyway, creating all of this from scratch is crazy awesome. Kudos!
What he's doing is a little on the nerdy cinematographer side, so I can see how some would say it's not needed considering UE already has camera effects. What's cool about it is that it opens the door to a process similar to what cinematographers use: choosing vintage, flawed or just hyper-specific glass to achieve a certain look. And it's not just increased chromatic aberration or a certain bokeh shape; it's usually a mix of things that create a "secret sauce", a complex mix of variables (not just specs) that shape a unique look. What you get, and what cinematographers tend to look for, are those unexpected effects that are natural artifacts of the lens. Someone could engineer these types of lenses for other users to download, like a Helios 44-2 Russian lens or an Ultra Panavision 70 anamorphic from the 1970s.
Nice work! I've seen this method used in Blender; someone used real lens patents to make such lenses. Is there any way to achieve something similar using Lumen?
Love it - we've done this in the past in Lightwave, Blender Cycles, etc. So cool that Unreal can do it as well, and I love the range of effects you're getting. Can't wait to have a look!
Very interesting! I have one of my favorite lenses sitting on my desk (the Helios 44-2 58mm f/2.0) Might that be one of your reference lenses for the swirly bokeh? :D
Hey Jon, I love your channel! Yes, I have that exact lens sitting on my desk right now alongside the 44-M; I'm very heavily inspired by those lenses. In fact, I'm trying to make a virtual copy of the 44-2 right now.
I just bought the pack! What setting in the blueprints or lens components makes them specifically compatible with 16:9? Can we adjust something to our liking so they work with other filmbacks?
They were modelled to cover a 16:9 digital film sensor. They don't automatically scale for different filmbacks at the moment. Definitely something I'd like to implement.
I like how artificial imperfections seem more "natural" to us when consuming entertainment. Just like 60 FPS movies aren't a thing because we are too used to the lower-FPS formats, and the higher-FPS ones feel like a bad TV show. There are a ton of examples like this, like lens flares added to games/movies artificially, even though that's something that appears on a camera and doesn't happen in an eye. So basically "realism" is just recreating something that we are used to, not really getting close to the ground truth when it comes to light.
There are about three people (that I know of) who are doing awesome things like this but their work wasn't part of my process of learning and building this, and I'm creating for different software.
Good for him. Maybe this guy's not in a position to give this away for free. Either way, it was a ton of work, and looks like it has real value. If you don't want to buy it, that's fine. But don't complain that it's not free. Go make your own
It's not his job to market the other guy's stuff. He made his own thing with hours and hours of work. He's not obligated to do anything. Actually, it's more on the consumer to do their own research and find free alternatives, if anything.
Amazing! So simple an idea but it took a brilliant mind to come up with it. Great job! Can't wait to play around with the 85mm and then I'm sure I'll spring for the others.
All of these things you describe as being (potentially) great are fine, as long as you are making Interactive Movies/Stories. They are in fact garbage if you are trying to make Video Games or Simulations, in which the player/user is _not_ looking through a camera, but only out of his own eyeballs.
Not everyone's vision is crystal-clear 20/20. We have floaters, focus issues, near-sightedness, far-sightedness, uneven strength in each eyeball... so the next step would be to model a pair of eyeballs to look through.
Actually, a lot of new games that people think look very realistic are achieving that effect in part by employing visual artifacts associated with cameras and imperfect lenses. Look at a game like Unrecord for example.
@@JoshuaMKerr My apologies... I didn't realize that you weren't speaking with respect to video game development. P.S. I had been watching videos on video game development and commented too hastily.
I’m really curious to know how this looks both in motion and in games (if any of these fx are light enough on render power that can be done in realtime).
I don't understand? Why use a realtime rendering system just to make it no longer real time? Why not use a traditional renderer and get better results? This makes no sense
Hate to break this to you, but all these effects can be achieved a lot more efficiently in compositing. What would be really cool would be to figure out how to achieve them with a post process material that runs in realtime.
What is your render time with them vs. without? Most people would render plates and cryptomattes and handle the real camera-lens look in post, so what would be the real benefit?
Do you by any chance have plans to sell the lenses on the Unreal Marketplace? I'm having trouble processing payment through your website because of some system maintenance.
I will be looking into that, but I'm not sure how long the process takes to get approved. Not sure about the system maintenance issue; you could try clearing your cache.
@@JoshuaMKerr It turned out to be a maintenance issue on my bank's side; your website works perfectly fine :) Thanks for the tip on your Marketplace plans. Oh, and, yeah, both the idea and the execution are mindblowing, can't wait to try 'em out :)
@@chelo111 whether it is in editor or rendered is irrelevant - it's cinematic preview, if anything it's just a lower quality preview of the final render.
@@y0blue Thank you bro, I like that. I just wanna see it in a real render, I wanna see the asset in action, if you know what I mean... that's all. Josh is a real one, we all know that.
Also, in VFX we delens plates in Nuke... often in CG we don't render with lens distortion; we apply the lens distortion in Nuke. Also, in VFX we do NOT use bloom. You might want to try manual focus in your camera settings. Your bloom looks horrific, and a comp sup in VFX would slay you. Your example of bloom looks like someone licked the lens LOL. You might want to rent an Arri, Sony or Red camera.
Yes, it's possible for sure because it's all physically based. But as it worked with the cine camera focus, I decided it wasn't worth the extra work. How are you these days mate?
How does this affect how you approach virtual production? Do the artifacts and virtual focal lengths make it harder to realistically composite your live-action footage?
Yes, I'd like to hear more about other lenses. This is going to help big time with what I want to do in future; I'm just worried about rendering time. So I'm saving up to get this sorted out and take advantage of it. More on lenses, and other lenses in fact. Cheers!
As a VFX artist, we use cameras that the production shot with. You can get camera data out of Nuke... There are also cameras with exact specs and lens packages you can get from the cinematographers guild and the Academy... No offense, but your home-made cameras are trash.
This is so amazing! I have been trying to recreate a lens in Adobe Substance; to some degree I made it work, but it was not perfect. This takes it to another level. It's so bizarre and unbelievable to know that a lens has been virtually created and has working functions just like in real life. Awesome job!
Love what you've done here Josh! But I have a question. It may be just me, but I think the focus should still be able to become tack sharp, like real cinema lenses, but that's not what I'm seeing. Also, in your examples did you show the frames with Path Tracing in mid-render? Or is that the final rendered frame (for example the frame at 5:49). If it is mid-render it would be nice to see the final render. If it is the final render it's super noisy. Maybe I'm just not understanding the examples? Thanks, Gary 🙃
Hi man, they're all mid-render. I show final rendered images on the site with before-and-after examples. The lenses can absolutely become sharp, but my focus (pun intended) in this project was to try to get the aberrations first and foremost, and a lot of that is driven by the materials on the lenses and the aperture planes.
The lenses themselves were built and tested in Blender, so they absolutely work. But the shaders were built in UE5, and that's where a bit of the magic happens.
Nice work, I've been looking for something to match my DSLR camera or iPhone 15 Pro shots with Unreal, but I think the whole issue is in the sensor size and lens.
How do I actually buy this? The website lets me add it to the cart, but there's no cart area. After it says it's been added to the cart, there's no cart icon and nothing takes me to it.
Big yes for anamorphic! I'm using lens distortion and other lensing effects in UE atm, so having those all be in the actual "glass" would be awesome. It's a shame these only work in PT, but it makes sense. Really just means that for hero shots I'd use "glass" PT'd shots and others I'd use the rasterized stuff with the in-camera add-ons.
Great job! Just bought my set ;-) And of course we want anamorphics ... Can these effects also be used during live virtual production as for example with LED walls or does it only work with post editing? Maybe this is a stupid question. But I'm not very experienced with virtual production yet ...
It seems to me that if you were shooting with a VR wall, you would want a pristine image, because a REAL camera lens is going to be capturing the scene and you wouldn't want to double up on imperfections.
The extra bloom look reminds me of an old Black Pro-Mist filter on vintage glass. This is very impressive, Joshua! My main question is how this would or would not work with the Octane renderer in Unreal. You, more than anyone, have championed and pioneered Octane with Unreal, and I really hope your new lenses can work with it. Or does Octane have enough customization in the attributes to replicate the amazing camera lens look your virtual lenses achieve? Would love to hear your thoughts!
BTW, I forgot to comment on how excellent your included video tutorial was. (I don't normally do any kind of path tracing, so just going through the motions with your tutorial was very helpful.) (And of course, I cut back on the roughness and dirt a bit hehe)
I love this. It may not be efficient for workflow, but it takes the finickiness and mysticism out of getting from point A to point B when you're searching for a look, don't know how to get there, and don't have real-life vintage cameras and lenses to use for reference. You also get an incredible result in a way that removes several layers of abstraction. This is such a great tool, even if it's just used for learning, or as a keyframe diagnostic/reference when you send renders down the pipeline.
Really cool. Just a couple of weeks/months ago I saw someone do something similar inside of Blender. What I am missing here in the demonstration is an image with actually usable noise levels; how many samples do you need for this? Maybe the same thing could also be achieved (and much, much faster) by training a neural network for style transfer. I mean, it would be easy enough to generate the training data...
This is terrific. I mostly deal with long environment shots, so anamorphic would be amazing. How does this affect longer and wider shots? One thing I struggle with is how the cine camera gives everything that weird effect that makes it look miniature. Great vid and thanks!
Glad you like the idea, I think the easiest thing would be for you to test the 85 and see how it handles. I think if depth of field is too shallow you can definitely get a bit of that miniature effect. But I haven't seen that with these yet.
Great video, when Cyan created their masterpiece 'Riven' they used real-world textures to map onto their geometry, they are currently remaking this in Unreal and the textures don't look anywhere as realistic as the original. Great advice here, love it.
In game development you'd usually achieve this using shaders. Either by adjusting/rewriting the standard surface shaders or adding different shaders for certain objects. The standard surface shader in unreal does have a cartoon look to it.
@Zircron45 just finished a very similar project and made a video detailing exactly how to create custom lenses in Blender, more specifically how to recreate real-life lenses straight from their patented specs. Both of you were probably working on these lenses at the same time, and I'm curious how differently you two approached it.
It's an amazing idea. I'm about to use it on a feature film, or a mini-series of feature-film length, and later on anything else. It's a great approach. It looks good, and even the rendering time is good in my case. So even if I end up making more versions after seeing the screenplay, revising after a render and re-rendering, I'm sure this is the way to go.
What a treat. To have access to tools that bring realism to UE. Currently, my iMac Pro isn't powerful enough to run UE without waiting eons for it to even open. Someday, I hope to get a faster computer to start working with UE. I would really like to use it for my narrative projects. Thanks for the demonstration.
Rendering through a plane with fine bump map would at least take care of the bloom effect, that's probably the coolest to have. Of course, accurate refractions are more important and there is probably a benefit to having exact lens replicas, but I could also just do 2 flat planes of virtual glass with artificially set refraction indexes that add up to 1. Wouldn't be scientifically accurate, but I wouldn't have to do the complex math either. Wouldn't have to worry about the lens breathing. There are ways of doing quick patch solutions to get 99% there with minimal effort. 6 minutes to wire the shaders instead of 6 months. Will go try it in Maya and Blender. Cool tricks still. Cinema buffs would probably want that.
This is impressive. Well bloody done! I don't have a use for them yet, but if I ever do, I'll be supporting you. And I'll be telling everyone about this.
Bought them and they look amazing! Although I'm using an HDRIBackdrop, and for some reason it seems as if that object is disabled when the lens is active. If I hide the lens or all 3 glasses in the Details pane, then the sky comes back along with my skylight. Really weird behavior. Wondering if you know why this is happening? It's probably something to do with the post-process FX or materials when using the lenses. My first thought was translucency, but I didn't see anything that affected it.
I threw in an emissive double-sided sphere as a replacement for the HDRI backdrop for now, and it seems to have more or less the same effect as my skybox for this specific dark scene, but I would like to know if there is a solution to this.
absolute genius! just bought the set, can't wait to test them out. I feel like this is a product you might have some success with in applying for a Megagrant.
How does the lens array affect rendering time? Convergence? Edit: you confirmed in another thread that it is slow. I wonder though, can paths through a lens array be precomputed? Instead of casting rays from the camera, precompute how those rays interact with the lens, and then just cast the ray from there. Easier said than done, of course, as modern ray tracing is way, way more sophisticated than just shooting a ray. If that stuff is interesting to you, you should read the Metropolis light transport paper.
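The core operation such a precomputed lens table would tabulate is plain Snell's-law refraction at each element surface. A minimal 2D sketch, purely illustrative and unrelated to how Unreal's path tracer actually works internally:

```python
# Snell's law in 2D: the angle a ray bends by when crossing from a
# medium with index n1 into a medium with index n2. A lens lookup
# table would store the result of chaining this across every surface.
import math

def refract(angle_in: float, n1: float, n2: float):
    """Refracted angle in radians, or None on total internal reflection."""
    s = (n1 / n2) * math.sin(angle_in)
    if abs(s) > 1.0:
        return None  # total internal reflection: no transmitted ray
    return math.asin(s)

# A ray hitting glass (n ~ 1.5) at 30 degrees bends toward the normal:
print(math.degrees(refract(math.radians(30), 1.0, 1.5)))  # ~19.47 degrees
```

Chaining this per surface is cheap; the hard part the comment alludes to is that a path tracer also needs the spatial intersection points, dispersion per wavelength, and Fresnel reflectance, which is why precomputing it is easier said than done.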
Joshua, this is so cool! As you may or may not know I made a video a while back about real life anamorphic lenses in Unreal Engine 5, but this is just taking that to next level creativity - I love it. Also as mentioned, it reminds me of the camera video from Blender haha. Shame it's not for Lumen which is what I use 99% of the time. 🤔
You should market this for the specific camera/lens. If you had this set up for a Canon Rebel with a Canon EF 100-400mm f/4.5-5.6L IS USM lens and a kit of calibration tests like an led point source system, black, white, and gray cards, color correction cards, and matching virtual objects, I would buy that.
Randomly ran into this and know it's far beyond my knowledge. I hope my amazement at what you as a single person have done here is warranted. Amazing.
Grainy image, you mean due to the path tracer? or something else? (I bumped up my Directional Light / Intensity to 70 Lux to better see the Japanese Komainu, but if you want to decrease grain appearance, you just have to let it keep rendering until the progress bar is completed)
This is INCREDIBLE!!! I really love seeing the push in technology to go beyond the old limits of 3D work: going from trying desperately to get a human-looking shape on screen to being able to virtually replicate how a modern camera lens affects small details that drastically change the image.
Can we use lenses like that to get rid of aliasing? I've been thinking for a long time that it might be a better solution than digital anti-aliasing techniques, because real-life lenses help us a lot with removing sharp-edge pixelation.
The absolute simplest form of anti-aliasing is to just render at a multiple of your target resolution and then resize it down by averaging pixels together. It works perfectly, but is very slow (4x the dimensions, 16x the work needed). All other techniques have been invented to get similar results with less computation. In a real camera, it's not the lens that reduces aliasing, but a special dedicated filter in front of the sensor that does a very slight blur to remove spatial frequencies higher than what the sensor will be able to capture. It's basically the equivalent of adding a 24kHz low-pass filter between your microphone and Analogue-to-Digital Converter when recording audio at a 48kHz sample-rate, because anything higher than that will add aliasing. The same principle applies to the camera, except the filtering is being done in space, rather than in time.
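The "render big, average down" supersampling described above can be sketched in a few lines of NumPy. This is a toy illustration of the principle, not production anti-aliasing code:

```python
# Supersampling anti-aliasing (SSAA) in its simplest form: render at
# 4x the target dimensions, then average each 4x4 block of pixels
# into one output pixel (16 samples per pixel, hence ~16x the work).
import numpy as np

def downsample_4x(img: np.ndarray) -> np.ndarray:
    """Average each 4x4 pixel block of an (H, W, C) image into one pixel."""
    h, w = img.shape[:2]
    return img.reshape(h // 4, 4, w // 4, 4, -1).mean(axis=(1, 3))

hi_res = np.random.rand(32, 32, 3)   # stand-in for a 4x-size render
lo_res = downsample_4x(hi_res)
print(lo_res.shape)  # (8, 8, 3)
```

The block average is exactly the "slight blur" role the comment attributes to a camera's optical low-pass filter: it removes spatial frequencies higher than what the output resolution can represent before the signal is sampled down.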