I kind of prefer this educational style over the over-charismatic, jumpy, forced fast-forward style that many creators use in their Blender videos. Keep up the concise and precise explanations. 👌🏻
Sadly everyone is on that jumpy, time-saving one-minute video train. I like detailed videos that explain things to me like an adult, or like a TED talk lol.
Your videos are killer. It's hard to find great Blender content this in-depth, aimed at an audience that already has a good understanding of the underlying fundamentals of 3D/Blender.
Thank you for explaining this. No one (I know) can explain this kind of technical data better than you. I'm not sure if anyone else is even bothered or knows how to explain it. This is a great service to the Blender community.
I have noticed this problem since 2014 and had been trying to learn computer science and programming to create a color system ever since, but because of the war in Gaza in 2014/2016/2018 I unfortunately wasn't able to. Now that I've left Gaza and my circumstances have improved, I discovered that someone beat me to it 😂😂😂 Sorry for my bad English
Chris, let me take the time to give you some feedback here in the comments. I'm putting it under your latest video, but it applies to all your content so far. It has been an absolute joy to go through your material up to this point. Even though I'm experienced enough in Blender not to "learn new stuff" per se, I absolutely love to watch this kind of thought-through, well-paced, clear, and quality-driven content. Many years ago I spent my time in Cinema 4D and used Maxwell Render as a plugin. With a photographer background, I really appreciated the artist-friendly approach that Maxwell allows: model to scale, describe the material properties (Maxwell gives you great control over Fresnel and falloff), set up your camera as you would a real one, and just press render. No tricks, just beautiful light (if you wait long enough 😇). Since I moved to Blender years ago, while I like Cycles, I've always missed that 'chef's kiss' detail in the plausible natural lighting that I feel I got from Maxwell. While I haven't jumped into Blender 4 yet, from what I'm seeing of AgX, I feel it might bridge a big part of what I've been missing. The larger dynamic range and the natural desaturation in the highlights will do wonders for photorealistic rendering. It is an absolute crime that you only have around 6900 subs. But PLEASE stick around and keep doing what you do. I would really love to see more from you using Blender 4 with its new technologies going forward. I love the look of your renders and would really like to see more videos focusing on the balance between model quality, texture complexity, and lighting setup. It breaks my heart that "commercial success" on RU-vid doesn't follow the quality of the content, but it's hard to change. I hope the artist in you can find some amount of consolation in the fact that there is at least a small community here that really appreciates this type of content. I will try to get my act together and contribute too.
I’m going to share this video with colleagues who use Blender. Will do my part to support your channel 👍
At 17:22, if you use the ACES from Blender, the OCIO config shipped with Blender, it's not the real ACES; it doesn't match (it's too desaturated). The one in Blender is mapped through the view transform inside a Rec.709 profile. I use the real ACES 1.2 via the OCIO environment variable, and it's closer to standard sRGB but with better control if the indirect light isn't clamped (ok, the six-color problem is still there, but it can be solved in comp). I'm not so happy that the Blender devs chose AgX, because the main reason we don't use Blender in a studio is the lack of industry standards, and here, I think, is the worst mistake they've made. AgX is nice to the eye directly, and ok, there's less work to do in compositing and no more six-color problem, but if you light your scene correctly and keep to the rule that no color in real life has a saturation above a value of 0.95, all is good. The same problem is present with USD: it's imported, not referenced using the USD layer system that is the basis of USD's purpose, so here again the devs close the door on using Blender in studios... And I say this because I like Blender a lot and want to use it in a studio, but choices like this push Blender to the side of the road again. Maybe I'm wrong and AgX will become an industry standard, but I think not, because it's not only a color system, it's also a LUT, and in VFX / 3D movies and series we need to match color directly against a Macbeth chart, not spend time estimating it because a LUT (look profile) is baked into your render and you need to know which one. Great video on the subject, thx.
I'm not sure what you mean by "not real ACES". Presuming you are referring to the "ACES Linear" colour space, that is ACES 2065-1 (AP0 primaries), not ACEScg (AP1 primaries). If you are loading ACES Linear output from Blender as ACEScg, then it will look desaturated, as they use different primaries and AP0 has a larger gamut. I do agree adding ACEScg as an input space within the Blender default OCIO would be a good idea, though the beauty of OCIO is that you can modify it to match your needs as an individual and as a studio. Studios often don't use the default OCIO setups anyway; they set OCIO via environment variables. As for VFX and Macbeth charts, it doesn't clash with ACES at all. AgX is not a colour space; it is simply a view transform. The view transform doesn't need to be applied to your output from Blender: export linear. The scene data will be the same, and you can still use ACES interchange colour spaces if you wish, then use AgX as your view transform in Nuke or whatever software you composite in. It won't affect matching Macbeth charts at all, as you are matching the colour **before** the view transform is applied to your image data, whilst it is still linear. It's best not to think of AgX as an ACES alternative or competitor. ACES is an encoding system that happens to have a view transform within it. That view transform isn't ACES (as much as people think it is). You can still use ACES in your pipeline and then opt not to use ACES's pre-packaged view transform. AgX is a better alternative to the ACES ODT, not the whole encoding system. Hope this helps.
@@llennoco It's exactly that (AP0 vs AP1, and of course the Rec.709 profile forced in the Blender 3D viewport). I use ACEScg in Blender for everything (color wheels and co), so I have AP1 through my whole pipeline. It doesn't solve the hardcoded Rec.709 profile in the viewport (background image in camera), but in the Blender compositor it's ok, and in Nuke and DaVinci too. I'm working only with EXR files in AP1 linear across all my software: less gamut, but consistent everywhere. And yes, AgX is better to the eye directly, but it's not well incorporated into different software packages, so it's a bit tricky right now to build a pipeline with it.
Every video of yours is incredible, they are the right amount of beginner friendly info to in-depth info without being overboard. The many examples you provide in various scenarios and demonstrating what's happening and why this/that is better... just amazing. Thank you so much for making this content and I look forward to every video you make!
I have never seen such a clear and well-demonstrated explanation of what gamuts and view transforms are. And seeing the shift between them really makes it so apparent. Having ALL of that color data there, even if desaturated, is mind-blowing to see. It is so clearly all there.
Thank you for this awesome video! Quick question. In your Blender viewport there's a sort of "display linking" button enabled on the upper right. I never saw it before. What is it for?
Yes, it's a fantastic add-on I can't live without now. It makes it so that when you switch from one viewport to another, they are matched up. It's frustrating to go from one viewport to another and have a completely different camera and shading. Someone wrote an add-on to sync them up. I did a quick video on it with a link to the add-on. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-jSnLLPQhDsc.html
All of these look absolutely terrible and unrealistic. I like the way AgX goes to white at overbrights, but that doesn't explain why the reflections of the light sources in the balls also went to white. The reflections were not overexposed, so they should show pure saturated colors; that's what happens in the real world: a neon sign photographed head-on gets desaturated and white, but its reflection in a tabletop is extremely saturated. None of these methods show a physically correct result, and that is worrisome, because it points to an inherent problem with the renderer. Also, the ACES comparison is pointless: it's just a color space the data is written to. If it looks bad, then your color transformation from ACES to sRGB for your display is flawed. Or the data in ACES is flawed, but that again means the renderer is the problem, not ACES :/
Funny, the Filmic examples retained more saturation than the AgX ones. AgX looked as if someone just reduced the saturation and turned down the low mid-tones a touch. At least Filmic will still be an option. Leave them all in so people can pick whatever they want!
Yes, the way it interacts with color is different than filmic. One of the 'looks' specific to AgX is 'Punchy' (which I didn't really show) that restores the saturation and contrast. And I agree, use whatever works for you.
Thank you for your videos. Why can’t they just have ACES and AgX built in? ACES is the standard for a VFX workflow. It’s frustrating to manually install ACES for every Blender version I have.
I agree, I don't know why they didn't do that, but I suspect there might be some technicalities under the hood for proper support. Just a guess. I'm hoping that by the time the spectral Cycles is up and running they'll do that.
This is gonna be a game changer. Blender renders always had that "Blender look" (not in a good way) because of Filmic, versus other industry software like Maya or 3ds Max which use ACES. Now content created with Blender (even by beginners) will have a much more pro look by default, apparently even better than the competition. This will literally change the colors of how Blender art is looked upon, for the better!
After the view transform is done, what style of transformation does Blender use to convert from the luminosity-adjusted full-CIE space to the display space? E.g. relative colorimetric?
This level of understanding of how technology works is just truly fascinating to listen to, even for someone who is only a fan of 3D and tech. What great content.
Wouldn't it not matter if you export using linear gamma with a 32-bit float file like DPX? To my understanding after watching your video, AgX is just a way to transform gamma?
The view transform is separate from gamma correction. Linear floating-point formats like EXR don't save with the view transform applied; view transforms reduce all the dynamic range down to under 1.0, and you don't want that to happen with a float format. So the view transform is only for renders being saved in 8- or 16-bit integer formats.
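To make that distinction concrete, here's a toy Python sketch. The curve is a simple Reinhard-style stand-in I'm using purely for illustration; the real AgX transform is far more sophisticated. The point is only that a view transform squeezes open-domain scene values into 0..1, which is exactly what you don't want baked into a float EXR:

```python
# Toy stand-in for a view transform (NOT the real AgX curve):
# compresses open-domain scene-linear values, which can be
# arbitrarily large, into the closed 0..1 display domain.
def toy_view_transform(value):
    return value / (1.0 + value)

scene_linear = [0.18, 1.0, 4.0, 50.0]   # values above 1.0 are fine in EXR

# What a float EXR stores: the raw scene-linear data, untouched.
exr_data = scene_linear

# What an 8/16-bit integer format needs: everything inside 0..1.
display_ready = [toy_view_transform(v) for v in scene_linear]

assert max(display_ready) < 1.0          # closed domain for display
assert max(exr_data) == 50.0             # dynamic range preserved in EXR
```

If the transform were baked into the EXR, that 50.0 highlight would be stored as roughly 0.98 and the extra dynamic range would be unrecoverable in comp.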
Nice! I thought this was going to be a difficult video, but you explained it so well! And thanks for talking about ACES too. I thought it was the gold standard already.
I personally hope they decide to support the ACES pipeline fully in the future. But for now it isn't. You can install ACES yourself, however, since Blender uses OpenColorIO.
This was my question through the whole thing! It's doubly confusing as Unreal's "filmic" is an ACES transform (with a *lot* of knobs to tweak), so I was really surprised to see that Blender was using an SDR transform.
There's been research and development done in that direction. There was actually a working build earlier this year that I briefly tested, but it's not ready for release, as a lot of internal 'hive mind' work is going on among a series of developers to figure out the best way to handle various issues. So I don't think there's any definitive timeframe for it to happen. However, AgX was a necessary step to allow for spectral rendering to be introduced downstream.
I use ACES but I don't know how it would compare to AgX. I think ACES is still better, but AgX now has a proper implementation in Blender that makes it a viable option for people who don't understand ACES. It would be good to do a comparison between them, though, since ACES loses the 'looks' feature after you install it, so a pros and cons list could be made too.
Yeah, I'm not sure of the decision to not also add ACES as an option. Cycles is moving towards becoming a spectral renderer and it may be that they're holding off until that's done. Just a guess.
AgX is a much better color transform, close to how an ARRI film camera captures color and exposure on film material. I made a couple of tests that show serious issues with the ACES protocol. This confirms a color problem with ACES similar to Filmic's in the video. In my tests, ACES has issues with color shifts in the blue range and collapsing hues, while AgX transforms the results perfectly and consistently. In a client project, I was able to solve color issues just by switching from Filmic to AgX. It fixed color temperature issues like a charm. The AgX Beauty, the Filmic Beast and the ACES Monster (HUE + Exposure) ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-7-zFWruNJ7U.html
There are several people working on it, but I'm not sure if it's been made an official goal. There was a fairly functional build earlier this year I tested out.
Without this demonstration I don't think I would have figured out the difference on my own. Thank you for the showcase of the Filmic vs AgX difference.
Glad they are adding a better view transform. One thing I noticed when trying to manipulate my renders in photo-editing software was the lack of dynamic range and the excess saturation. It worked to my advantage since I wasn't making anything meant to be subtle, but I can only imagine how annoying that could be for professionals making high-quality renders for commercial applications.
I just had a render client provide me with a file needing AgX and it's the first time I used it. Definitely retained some of the blown-out sky in his render and slightly muted things but an effect helped with that. You did a phenomenal job in explaining the sRGB problem as for many of us that's all we know, and how things like AgX move beyond that limitation. Great video, subbed.
Incredible samples for understanding Standard, Filmic, and AgX... thanks! Just one question: when Blender 4.0 gets released, will AgX be set by default?
Yes, AgX is the new default, but Filmic is still there for older files or if you want to use it. Blender uses OpenColorIO, which means view transforms and color spaces can be plugged into it.
Fantastic explanation and examples. I've been using AgX in Blender, Substance Painter, and Fusion for a few weeks. It really is extremely good in practice. Great to learn Blender is including it in 4.0, but it's very easy to install right now, so I'd advise people to chuck it in and have a play around.
Looks like switching to AgX will be super simple in Blender. I have NO IDEA why they make it so complicated in Max, where working in ACES is an absolute pain.
Hi Christopher. As a long-in-the-tooth CGI guy who has switched over to Blender... Kudos on having hands-down the best channel on the technical side of Cycles materials, lighting, and colour management, bar none! Do you have any solutions for getting Blender AgX into Resolve for Blackmagic Wide Gamut or ACES?
Wow, this was so informative. I've always struggled to understand the view transform but now see how much my renders could benefit from it. I'm excited to try AgX now. Thanks so much!
Saying that the display devices are getting a few more options, such as the wider-gamut P3, will confuse a lot of noobs and is in a way misleading. Why not just say how it actually is: P3 is the color space of P3 displays, so all the Apple users with them will finally be able to see correct color on their displays? Right now I bet it sounds as if it's a cool new option for seeing more color to all the people who don't understand color management. It's NOT an 'option' for a display device. It's a setting that tells Blender what the color space of the actual display device is.
What the... I'm speechless... AgX looks so much better than Filmic. I didn't even know that issue existed. That might be one of the actual reasons why Blender renders from hobbyists sometimes seem to look off and unrealistic, but this is huge!
PS: lovely model in that product shot. I would make the meniscus a bit more prominent and add bubbles. Did you model the interior correctly? There is a special method of modeling needed to render this "correctly".
Just a few words on ACES… ACES takes the display primaries into account; you need to use the correct ODT. So a MacBook Pro with Display P3 as the color gamut should use the Display P3 ODT. Also, the color saturation shifts around white are intentional, to mimic film stock's response to light: that is the RRT in ACES. All color management only works if the screen is set to the same primaries as the view transform (or ODT in ACES). Otherwise it all falls short…
I made some tests in the past comparing Filmic, AgX, and ACES in coordination with Troy Sobotka. As you figured out too, ACES has issues in the blue range, with color shifts towards purple and collapsing hues under extreme lighting situations. In my tests, I animated the color of an area light through the entire hue spectrum at high exposure (intensity). ACES is not able to color-transform this properly, while AgX handles it perfectly, smoothly and consistently.
@@christopher3d475 Interesting. I wasn't aware of this. I also hope they implement Photon Tracing for caustics. Do you know the three-part series "Unlock Better Color in Blender" by Riley Brown?
Yeah, I tested out a build of the spectral version of Cycles earlier this year, but I haven't followed it closely, so I don't know where it's at. But it's able to handle tough color situations better than direct RGB rendering. Weta Digital has their own in-house renderer called Manuka that's a spectral renderer; they do this because it allows them to match real-world camera color response quite closely. I hope spectral Cycles sees the light of day. I've seen a bit of one of his videos; I should probably watch all 3 parts. @@3dvfxprofessor
Just wanted to mention that as of ACES 1.3 this is resolved using the ACES Gamut Compression fix. You can grab that fix in several floating configs out there, including my custom RS config. ACES 2 will ship with this directly resolved. @3dvfxprofessor
This is a fantastic video, thank you for going through and explaining everything from the top too! I learned a ton from this video, it was a great watch ^^
Compiling my top 20 go-to RU-vid TUTs - you're def way up in there, C3D! - Many thanks for your education:) There's a buzz I get when someone like yourself explains something in a way that even I can understand; I love learning and understanding how things work. Shortcuts/add-ons are great if you know your stuff, i.e. in a production environment they can save time. But if you're learning, I say don't get too distracted with all the unnecessary add-ons. Instead, do the spade work in vanilla Blender and understand how to get there, problem solving, etc. In the long run you will save more energy and time than any add-on can provide.
Dude, you've done an amazing job at explaining everything 😀! I'm always scared of all these color transforms and stuff because they always go over my head and seem very, very complicated, but you explained it very nicely. Though I didn't understand everything, I learned a lot 😀! Thank you so much for the amazing video and for sharing the knowledge! Blender 4.0 is very exciting and amazing :D
I knew AgX was a big deal. Never liked ACES; it made my lighting look weird and the colors weren't exactly the ones I wanted. AgX handles blown-out colors A LOT BETTER than ACES, and keeps the color you've selected the same without making it look weird. AgX is superior; we should all use it for our renders.
Top notch explanation and step by step comparisons for semi-laymen like myself >> it seems AgX gives you a 'log profile' effect so that you have all the colour information to allow for freedom of post processing decisions >> which will make for some fantastic 'next level' image renders from all the super talented folk out there... excellent work👌💯👀🎯😎🌟
@@sanjacobs6261 Hard to implement, I think. And there's a bunch of patchwork because of its shaky basis, which is understandable, I think, considering everyone is only now properly learning color theory, or the lack of it.
I have NEVER understood color better than in this video. Thank you! A few follow-up questions: Is the recommendation to use AgX permanently now? What about people who are stuck with an sRGB monitor for now, like me? Should I even bother with DCI-P3? Or is simply using AgX sufficient?
AgX is now recommended (and it's the new file default), but Filmic is still there if you want to use it. You can use P3, but if your monitor isn't P3-gamut capable, it just won't display it quite right, so stick with sRGB.
While AgX is indeed a great improvement and fixes some issues with ACES (issues I never encountered personally, even with daily use), the workflow presented here is wrong and is oriented at hobby users, not professionals. Here you show how to use AgX as a LUT. You never, ever give a direct render to a client, and any self-respecting artist will never send a render straight from the software to the client. So what you see in the render window should never be the final product. You should always render in the largest color space available, like Linear ACES, save as 16- or 32-bit float EXR (depends on you), comp in ACES, grade in ACES, then conform to Rec.709 or sRGB and deliver. I have worked with Blender for close to 20 years, and with many 3D and comp packages on the market too. My workflow, since the implementation of ACES, is to render everything in Linear ACES, composite in Nuke/Fusion in ACES, grade in Resolve in ACES, then conform and export to an sRGB or Rec.709 color space afterwards. You get minimal loss in gamut information, proper color transforms in compositing, proper blends, highlights, proper DOF blurs, etc. And if you want fewer headaches, download OCIO, set environment variables on your system to use ACES by default for all 3D and comp programs, and you are set. Plus, something very, very important: monitor calibration. Unless you work on a calibrated screen, you can ignore anything you read here or see in the video above.
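The environment-variable setup mentioned above relies on the standard `OCIO` variable that OCIO-aware applications read at startup. Here's a small Python sketch of that idea (normally you'd just put one `export OCIO=...` line in your shell profile; the config path below is a hypothetical example, not a real install location):

```python
import os

# Equivalent of `export OCIO=...` in your shell profile. OCIO-aware
# apps (Blender, Nuke, Fusion, ...) launched from this environment
# pick up the shared config instead of their bundled defaults.
# The path is a made-up example; point it at your actual ACES config.
os.environ["OCIO"] = os.path.expanduser("~/color/aces_1.2/config.ocio")

config_path = os.environ["OCIO"]
print("Using OCIO config:", config_path)
print("Config file exists:", os.path.isfile(config_path))
```

Setting it once at the system level is what keeps every package in the pipeline interpreting color the same way.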
0:04 For reference, this isn't Cycles-exclusive, but a general update to the view transform functionality that works with other render engines, including Eevee, too lol
Man, the naming of these in Blender is confusing at first but makes sense if you mentally translate them into their actual terms. Display Device - Color Space. View Transform - Tonemapper. Why is Blender like this, just name them normally for goodness sake. The "Display Device" I can understand as it's effectively asking you to input the color space of your display. But “View Transform”? That means something completely different to what it’s asking of you. A “View Transform” is the transformation matrix of a given view, not a tonemapper. You could argue it’s referring to “transforming” the color range or something along those lines, but that’s grasping at straws. Why haven’t they just given it a normal name like “Tonemapper”... Oh, Blender that’s why. This is coming from someone who uses Blender professionally, but man is Blender just Blender at times. At least they didn’t call the subdivision modifier “turbosmooth”.
It's OCIO terminology, Blender uses OpenColorIO pretty much as-is (with a few modifications like view-name-based look option filtering), so it's only natural to use OCIO's terms as well. It doesn't make sense to call display devices "color space", in OCIO's context, everything you can tag your texture as, including "sRGB", "Non-Color", "Linear Rec.709", "Linear FilmLight E-Gamut" etc., are all colorspaces. It's important to make clear we are talking about your monitor here. Regarding "Tonemapper", it is another over-used term that shouldn't be used anymore. What is "tone"? Are we mapping "tone" with Filmic/AgX/TCAMv2/OpenDRT etc.? Can you find me a standard CIE definition for "tone"? If no, how can we map "tone" without knowing what it is? The terminology is messy currently, ACES people call it by three names, ODT, RRT, Output Transform; FilmLight calls it DRT; On AgX side we call it "Image Formation" (implying the RGB data straight out of Cycles/EEVEE is not image, it needs to go through the view transform to form the image); On OpenColorIO we call it "View Transform"; ARRI etc. camera makers call it "Color Science" (such a dumb name BTW), FujiFilm calls it "Film Simulation" (Fuji's Eterna simulation seems very very bad BTW). I do hope people can have an agreed upon term, but currently you just have to look out for all these terms and know that they refer to the same thing.
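As a concrete (and heavily simplified) illustration of how the two settings in the comments above differ, here's a toy sketch. Both curves are invented stand-ins for illustration only, not the real AgX or sRGB math; the point is the order of operations: the view transform forms an image from scene-linear data, and the display-device setting only encodes that image for a particular monitor:

```python
def image_formation(x):
    # "View Transform" in Blender / OCIO terms ("image formation" in
    # AgX terms): forms a displayable image from open-domain
    # scene-linear data. Stand-in curve, NOT the real AgX.
    return x / (1.0 + x)

def display_encoding(x):
    # "Display Device" in Blender: encodes the formed image for a
    # specific monitor. Stand-in plain 2.2 gamma, not exact sRGB.
    return x ** (1.0 / 2.2)

scene_linear = 3.5                      # straight out of Cycles/EEVEE
formed = image_formation(scene_linear)  # image is formed here
encoded = display_encoding(formed)      # then encoded for the monitor
assert 0.0 <= encoded <= 1.0
```

Swapping the view transform (Filmic vs AgX) changes `image_formation`; changing the display device (sRGB vs Display P3) changes only `display_encoding`.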
With AgX, can Blender finally support HDR? Imagine owning an HDR monitor and working in the viewport in full HDR, HDR10, Dolby Vision and that stuff. I know Blender can render an HDR picture, but it requires a lot of setup, so it needs work.
4.0 does support HDR display on capable monitors, but not with a view transform enabled. It works on my MacBook Pro. It doesn't work with a view transform because those are, by design, moving all the lighting data into the closed domain of a typical 8- or 16-bit color depth.