I once wanted to make an animation where the main camera looks into the viewfinder of a photo camera (and sees the whole scene through it), but I could not manage it. With these ray portals, I might finally be able to do it.
The hottest feature would be if they reverted the Principled BSDF node changes. The dropdown menus are driving me up the wall. It's literally clicking just for the sake of clicking.
Thing I wanna add: the Khronos PBR Neutral isn't necessarily a replacement for AgX. What this transform is good at is representing the actual sRGB of the materials in the final render. Product renderings, for example, will benefit from it. The Blender developer website has some good examples.
@@DECODEDVFX Blender deprecated the entire Python API around "legacy add-ons", and "extensions" (as they call them now) installed from online repositories are the only official route. Installing extensions from disk is also hidden behind a tiny arrow dropdown menu in 4.2. Not to mention the braindead renaming of functions and variables in the Python API in 4.2. It's a hellscape for add-on devs currently. Changing asset_lib to asset_library, to asset_library_ref, to now finally arrive at asset_library_reference. Breaking every add-on release after release is insanity.
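One common way add-on devs paper over renames like the asset_lib → asset_library_ref → asset_library_reference churn described above is a small compatibility shim that probes attribute names newest-first. This is just a sketch; the helper and the fake workspace class below are illustrative stand-ins, not part of the actual bpy API:

```python
def get_compat_attr(obj, names, default=None):
    """Return the value of the first attribute that exists on obj.

    Try names in order (newest API name first) so the add-on keeps
    working across Blender versions that renamed the property.
    """
    for name in names:
        if hasattr(obj, name):
            return getattr(obj, name)
    return default


# Hypothetical usage with a stand-in for a 4.2-era workspace object:
class _FakeWorkspace:
    asset_library_reference = "LOCAL"


ws = _FakeWorkspace()
ref = get_compat_attr(ws, ["asset_library_reference", "asset_library_ref", "asset_lib"])
```

In a real add-on you would pass the live `bpy` object instead of `_FakeWorkspace`; the point is only that one lookup survives all three renames.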
Very nice video! I love this kind of video, where I feel like going and trying out all these features myself; that feeling is really tough to pin down. Keep it going, and good luck with your classes!
A lot of nice features, but we need HDR in the viewport. Right now the checkbox is non-interactive, and the Apple-specific implementation messed this up, so please revert the Apple-only support and give us HDR.
I know Khronos wasn't in mind when the scene was made, but there's some crazy compression in the highlights; I still feel like AgX with some grading looks pretty solid. Khronos, from what I gather, is meant for more poppy, stylized rendering to counter the realism of AgX. It looks good, but it feels like it's meant for mograph, commercial, and non-PBR projects because of how precise the colors have to be in those scenarios, right? Since it's tied directly to sRGB, it sort of feels like a more advanced replacement for Standard?
@@DECODEDVFX That's fair, I wasn't sure if it was just me or what. Thanks for your experimentation and explanations on these topics; it really helps sort things out. Random question, sorry in advance, but I gotta know: I get that your course is more along the lines of ArchViz, but are you possibly contemplating adding a paid extension to your course about EEVEE Next and optimizing for feature animation?
I have actually never managed to force it into muscle memory that "I" brings down a stupid list... I always get surprised/annoyed, even for the thousand-and-first time, that I'll be animating and get stopped in place by a list every time. I'd prefer a pair of keys to be able to quickly do both... however, if we could choose which keyframe type that "universal"/general key inserts on a single keystroke, that would be lovely.
Now this update really seems to be great, like the compositing timer and the tone mapper. I kind of stopped using Blender around the time Geometry Nodes landed a while ago, but the newer, smaller updates seem to bring back the fun of working with it :D Oh, and I had missed the new thickness option in the Material Output.
@@DECODEDVFX I don't always add things like Weighted Normal until later in my workflow. I just think being able to pin the top and/or bottom would be nice. Maybe you don't.
@@DECODEDVFX Any modifier you have at the bottom will naturally stay at the bottom when you add a new one, since new modifiers are added as the first element in the list. But EVERYONE wants the SubDiv modifier as the TOP-most modifier. Even if they add other modifiers that change the object underneath, they ALWAYS want SubD to stay the top-most modifier.
....I see it like this: the low-hanging fruit comes first, and a lot of groundwork must be tackled first... otherwise back-and-forth development wastes time and stalls Blender's whole progress while more and more devs get tangled up in ongoing repetitive tasks... Hope that makes sense to you... P.S. It's mostly one dev (Filmic, etc.) who handles the overall experimental development work. He's well informed on that topic and has worked for the institute since the beginning, with some breaks, but he's back in the game with some great implementations that help people like commercial product content creators right now, and he didn't force the whole team to stop or hold other features (e.g. EEVEE) back...
Read The Hitchhiker's Guide to Colour by Sobotka, who made AgX; TL;DR, it's superior to ACES as a display transform. Then read up on Chris Brejon about how to use ACES the system, and how the choice of rendering primaries doesn't matter, or rather how it can make everything worse. Also, if you want to use it, just download ACES 2.1 and set up an environment variable named OCIO pointing to the config. Every program that supports OpenColorIO will read that config from there, and you have "ACES". But which one do you want? AP0 or AP1 primaries, i.e. ACES2065/ACEScct or ACEScg? What's with spf13? TL;DR: saving an image as a 32-bit EXR will always just save the pure incoming light energy per pixel. Also, changing the scene color space to anything other than a linear one, i.e. using ACES primaries to render, only has an effect on renderers that render the R, G, and B components separately; it has no effect on, or worsens the results of, spectral renderers such as Renderman, Octane, Guerilla, etc. How you interpret that data, and with what IDT and ODT for your display, doesn't matter. Remember: an EXR has to be rendered to be viewed.
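The OCIO environment-variable setup mentioned above can be sketched in a few lines; OpenColorIO expects the variable to point at the config file itself. The path below is a made-up example, not a real install location:

```python
import os

# Hypothetical path; point this at the ACES OCIO config you downloaded.
ACES_CONFIG = "/opt/aces-1.3/config.ocio"

# Set the variable in this process's environment. Any OpenColorIO-aware
# application launched from here (or from a shell where the same variable
# is exported) will read that config instead of its built-in default.
os.environ["OCIO"] = ACES_CONFIG
```

The same effect from a shell is `export OCIO=/opt/aces-1.3/config.ocio` before launching the application.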
@@DECODEDVFX Read Chris Brejon's blog. Also The Hitchhiker's Guide to Colour by Sobotka (the guy who made AgX and, previously, Filmic). It gives you a better understanding of why it literally doesn't matter.
@@sebbosebbo9794 The guy who made AgX and Filmic (Sobotka) is in no way, shape, or form affiliated with the Blender Foundation, and never was at any point a Blender dev or a contributor to its development. Blender is simply incorporating other open-source tools out there, just like the PBR Neutral "color space" made by the Khronos Group. I am confident in saying the Blender Foundation has no idea what they are doing anymore, if they ever did. Just remember their Principled BSDF was coded by a 17-19 year old and is not energy conserving; he only recently revisited it after three years at uni, and it's now v2 and a little better. There's just so much wrongful confidence and blatant ignorance in your other statements that it's not worth getting into.
@@monarchfilmspx0955 It's only free for you because other people are paying. And it's supposedly open source, yet it's gatekept by full-time employees who take grants of millions, plus donations. 😂 Ignorance is bliss, I guess.
I'm most excited about Khronos PBR. AgX was cool & all but felt undersaturated. This looks like it's taking the best of both AgX & Filmic worlds and smashing it together into something better.
You mean the course trailer? Yeah I was very pressed for time and this is a very noisy scene at the best of times, so I had to render out a very noisy sequence. You'd be impressed if you saw the state of the frames before denoising.
I think the Ray Portal could be useful for things like TV screens, or windows in spaceships and other vehicles; places where you would've used a pre-rendered frame sequence as an image texture. This way you can animate what's going on in-scene, and off to the side, so any changes to the timing can be done right then and there. Imagine a character in a video call with another character, for example. You could create both scenes on one set with 2 rooms and use the Ray Portal to put both sides of the conversation onto the screens of both characters. Or if you had a moving shot of a building and there's an animation happening on a large LED screen and you wanted a certain action to happen at a specific framing, you could take care of that all in one scene.
@@Frigus3D-Art Yes, it would certainly have a performance hit, as it would be rendering two scenes simultaneously, though it probably wouldn't be much different from rendering one scene separately and then using it as an image-sequence texture in the other scene, once you tally the two render times together. Since the background of whatever is seen on the screen would be flat anyway, you could use an image-texture backdrop for the character on the screen, which would save on render time.
Yes. I was going to use the example of a scene I made a year or two ago with a bank of TVs showing CCTV images. At the time I had to pre-render each feed, but now I could just make each screen a portal showing the same scene from different angles.
Definitely some great improvements, as always. Every time I hear reference to the Pin To Bottom feature I think of a comment I saw where someone was complaining about the devs and saying what they need is a Pin To Top feature, completely oblivious to the fact that whatever modifier you have at the top stays at the top. 🤣