To Blur Spinning Things 

Sphynx_Owner
2K subscribers
5K views

Published: 14 Oct 2024

Comments: 37
@possiblyzslot838 2 months ago
Incredible! As an intermediate developer, I never thought about how to consistently make propellers/wheels blur
@drinkspartypack 1 month ago
This is incredible, you are so talented.
@elviokill 2 months ago
I wish to be this good at something one day. Excellent work!
@Splarkszter 2 months ago
"Practice makes the master"
@bunnybreaker 2 months ago
Love to see the progress on this. Keep up the good work 👍🏽
@alesjelovcan6810 2 months ago
Awesome work! Love it! Also the structure of the video - first outlining all the problems... it's a drama, and the solution comes as a twist to the story when I thought all hope was lost. I do wish that the last step with the enveloping mesh somehow worked behind the curtain with zero user setup. Good luck :)
@sphynx_owner8224 2 months ago
Thank you!! That would have been nice. For what it is, I would consider it minimal work, as the mesh you need to make can stay rather simple. As said in the video, you could also use a simple cylinder (which you can generate on the spot) instead, at the cost of accuracy and cleanliness.
@stratos2 2 months ago
Interesting implementation. I wish it didn't cause the artifacting over a complex background; I personally find that quite distracting. It's definitely a hard problem to solve, made even harder for me because I usually work in sandbox games, in which any post-processing needs to account for anything the player might build, and as such needs to deal with fast-rotating things whose size or rotation axis is not known and cannot be pre-defined in the engine.
@sphynx_owner8224 2 months ago
Thank you for your feedback. I am not sure which of the (many) artifacts you refer to, and there is still a lot of room for polishing in my current implementation, so it may be something I will be able to fix.
@stratos2 2 months ago
@sphynx_owner8224 I'm specifically talking about the issue where it has to infer the background behind the mesh from pixels around it, which, when the propeller spins over the Godot logos, causes the white lines of the logo to be dragged out in the wrong direction. (Perhaps this could be remedied by just not rendering the spinning mesh at all above a certain speed and rendering only the blur?)
@sphynx_owner8224 2 months ago
Remember that I am using the screen texture to generate the blur around the mesh. If the mesh is not rendered in the first place, it cannot be blurred.
@sphynx_owner8224 2 months ago
This is one of those issues that are inherent to the nature of post-process motion blur in general, so for what it is, it's as good as it can get in that regard without extra work that makes the effect more complicated.
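For readers skimming the thread, here is a minimal sketch of the kind of screen-texture sampling being discussed: a Godot 4 canvas_item shader that smears the already-rendered screen along a screen-space direction. The uniform names (blur_velocity, sample_count) are illustrative placeholders, not names from the actual addon.

```gdshader
shader_type canvas_item;

// Read the already-rendered screen (Godot 4 screen-texture hint).
uniform sampler2D screen_tex : hint_screen_texture, filter_linear_mipmap;
// Hypothetical screen-space velocity of the blurred region, in UV units per frame.
uniform vec2 blur_velocity = vec2(0.05, 0.0);
uniform int sample_count = 16;

void fragment() {
	vec3 accum = vec3(0.0);
	for (int i = 0; i < sample_count; i++) {
		// Spread samples symmetrically around the current pixel along the velocity.
		float t = float(i) / float(sample_count - 1) - 0.5;
		accum += texture(screen_tex, SCREEN_UV + blur_velocity * t).rgb;
	}
	COLOR = vec4(accum / float(sample_count), 1.0);
}
```

Because the only input is the rendered screen, anything that never made it into that texture (for example, a mesh skipped above some speed threshold) cannot contribute to the blur, which is exactly the limitation pointed out above.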
@stratos2 1 month ago
I had to come here again because I thought of something that might fix general rotational blur. My original thought was: "What if we knew the rotation axis of any object, then we could correctly sample in a circle without the sampling deviation", which I realize you already covered in this video. I did extend this thought, though, to how it could be pulled off for any general frame. My general idea would be to have a second full-frame buffer, similar to a normal buffer, but with every pixel's r, g and b values indicating the worldspace rotation axis. Thinking about that, it would require two buffers: one for the rotation vector and one with a vector pointing towards the rotation axis. Maybe it could be compressed into one rgba buffer by using a quaternion, though I am not familiar enough with them to decide if that is possible. So this approach may need two extra buffers. Of course, the second issue is how to find the rotation axis to populate the buffer with. My first thought was using the position of our point on the last frame and this frame to compute the rotation axis, but that would be a screenspace rotation axis, so kinda useless. However, the CPU doesn't send mesh data to the GPU again every frame for an object that rotated; instead it sends a quaternion to encode the rotation, and the vertex shader does the actual rotating. So if we write a custom vertex shader that captures this information, possibly also for the previous frame, and put it into a struct that we pass on to the fragment shader, we could create the rotation-axis and rotation-axis-offset buffers with a bit of extra computation. This is for sure the most expensive option out of all the methods provided, but it would have the benefit of being generally applicable without artist-authored blur meshes. What do you think of this? Might this work?
@sphynx_owner8224 1 month ago
You would still need geometry to be rasterized around the space the mesh sweeps through, which would still require some encapsulating mesh. Also, I have no access to the vertex shader or to any additional buffers from GDShader, so I cannot write to custom buffers from surface materials.
@sphynx_owner8224 1 month ago
I love your passion, though. This is a very impressive suggestion.
@stratos2 1 month ago
@sphynx_owner8224 Yeah. The motion blur would be a post-processing effect sampling the final image according to the information in the rotation buffers. Finding enough space for two extra buffers and fitting a custom vertex shader onto every single object in the scene could be rather difficult; likely the simplest way to accomplish it would be to make the camera render two more unlit passes which render the needed information to the two buffers. That part of it would be possible in Unity, I believe. Passing extra information from the vertex shader may get a bit annoying, as one might run out of keywords. Maybe a global buffer would work, though this would likely be very engine-dependent and complex. But I think it has potential.
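As a rough illustration of the extra-unlit-pass idea (not something the video or the addon does), the sketch below shows how a Godot 4 spatial material could encode a per-object worldspace rotation axis as a color. The instance uniform is hypothetical and would have to be updated from a script each frame, and, as noted earlier in the thread, GDShader surface materials cannot write to arbitrary extra buffers, so a real version would need a dedicated pass or render target of its own.

```gdshader
shader_type spatial;
render_mode unshaded;

// Hypothetical per-object data a script would update every frame: a unit-length
// worldspace rotation axis. The per-frame angle/offset would need additional
// channels or a second pass, as discussed above.
instance uniform vec3 rotation_axis = vec3(0.0, 1.0, 0.0);

void fragment() {
	// Remap the axis from [-1, 1] to [0, 1] so it survives as a color; a later
	// post-process pass would decode this "axis image" and sample along arcs
	// instead of straight lines.
	ALBEDO = rotation_axis * 0.5 + 0.5;
}
```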
@stratos2 1 month ago
@sphynx_owner8224 I thought about it a bit more. If it is possible to define custom buffers, one could get away without any textures. One would use one of the few free slots in the v2f struct, maybe TEXCOORD2 (varyings in Godot, I think), which would contain an index representing the currently drawn model. This index into the custom buffer would correspond to the float4x4 rotation/position etc. matrix of the model, to be accessed in the blur shader. If processing of the matrix is needed for faster per-pixel execution, this could also be done in the vertex shader before committing the transformation matrix to the buffer. This way the matrix is only stored once per object, not once per pixel, which makes a whole lot more sense memory-wise. Injecting the custom shader may be possible by using Camera.RenderWithShader() in Unity; I'm not entirely sure that can include vertex shaders, but it does look like it. I'm currently stuck deep in other projects, but maybe this approach could be something you can implement? Otherwise I'll probably look at this idea again eventually™
@Polygarden 2 months ago
Loved watching the video. These are some great experiments! I have never done motion blur in a shader, only used velocity vectors in fluids. I wonder if it would work if you buffered 2 past frames and interpolated non-linearly between 3 frames; perhaps you could get a curve that way?
@sphynx_owner8224 2 months ago
The whole idea is that the blur interpolates between the current frame and the last. Accumulative approaches are not unheard of, but they are not part of the goal with this one.
@xymaryai8283 2 months ago
Oh, I had an idea: would this work with techniques like Asynchronous Reprojection/Timewarp, AKA rendering screen-space rotations between world-space frames? There's an Async Repro demo that Linus Tech Tips did a showcase of, with an improved world-space transformation approximation that has interesting behaviour. But even without that, accurately blurring world space as you move through it while being able to turn the camera at higher framerates would make it a lot more comfortable, if the motion blurring can be done efficiently enough.
@holleey 2 months ago
Cool. So do you think this could be usable in 2D games, too? And if so, would you recommend it?
@harelshakedgolan4726 2 months ago
Awesome explanation and tutorial, fire video 🔥🔥🔥
@fuzzyhenry2048 2 months ago
4:55 The car's front hood is ghosting in front of it. That might be a problem with the normal motion blur.
@sphynx_owner8224 2 months ago
Yep, that's an issue I have had for a while, and I need to fix it.
@Limofeus 2 months ago
8:24 Hmm, I wonder: since you know that the error will always accumulate outwards (due to the way sampling takes into account linear velocity more than angular), maybe you can just nudge the result towards the center? I'm not sure, but I think the error is linearly proportional to spin speed (or some other more/less simple proportion), and by figuring out this proportion you could know how much you need to nudge the result towards the rotation axis for it to be accurate. Although I might be completely wrong here...
@sphynx_owner8224 2 months ago
That is a valid suggestion; I also had it in mind. You are suggesting I account for the error when objects are rotating, so I wouldn't even need this dedicated radial motion blur mesh in the first place. The question is, how do you know where the center of rotation is in the first place, if all you do is work with a 2D image of velocity-value splotches? How do you know that the next velocity is not on a different object entirely? Also, how would you handle cases where the rotation axis is not facing the camera but is viewed at an angle? Or the object is not fully visible? Or how do you even differentiate objects in the first place? You would have to infer a lot heuristically about the nature of the environment and the objects in it to then try to extract a rotation axis, using nothing but the depth, normal, and per-pixel velocity images. I don't think it's impossible, but from what I see and from my experimentations it is very impractical for me to pursue at the moment, and I can imagine something like this would also render the practicality of the effect itself obsolete, as you now have to do a lot of extra processing just to account for a few rotating elements in your view. Also note that at 8:24 I'm showcasing an environment where the velocity values are tangent to the pixel's path, which is not how Godot's velocities work; I talk about that in the video too.
@Limofeus 2 months ago
@sphynx_owner8224 I see, I guess you're right. Maybe there's a way to calculate a more accurate pixel path, but it would probably be very hard, if not impossible, to do in screen space. The other way is to use the CPU to get more info on the rotation and then send it to the GPU, but at that point your approach with helper meshes would be more efficient.
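A quick way to quantify the outward drift discussed above, assuming samples are taken along the straight tangential velocity rather than along the arc: a point at radius r from the axis with angular speed ω has tangential speed ωr, so extrapolating along that velocity for a time t places a sample at a distance

\[
\sqrt{r^{2} + (\omega r t)^{2}} = r\sqrt{1 + (\omega t)^{2}} \ge r
\]

from the axis. Every sample lands on or outside the true circle, and the overshoot grows with spin speed, which matches the nudge-it-back-toward-the-axis intuition; the catch, as the reply above explains, is that the axis itself is not available in screen space.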
@Thomas_Lo 2 months ago
Once again, may I ask, will you be at GodotCon?
@sphynx_owner8224 2 months ago
As interesting as it sounds, I don't know how possible it is for me.
@Thomas_Lo 2 months ago
@sphynx_owner8224 Let's get in touch and see what's possible.
@sphynx_owner8224 2 months ago
@Thomas_Lo No problem. You can find me in the Godot Effects and Shaders Discord server: discord.gg/MhfCkhV42T
@Shack263 2 months ago
Thank you!!
@jujuteuxOfficial 2 months ago
Do you reckon it's possible to use this to make your own pre-rendered animated sprites? So you get the best of both worlds: the effect runs in an empty world/green screen and creates the animated sprite from it in real time, and then what you see is the pre-rendered sprite animation, which won't have the deformation the shader has when rendered over things.
@sphynx_owner8224 2 months ago
I don't know. You would definitely lose a lot of the information because it's no longer in 3D.
@jujuteuxOfficial 2 months ago
@sphynx_owner8224 I mean, you could always have 4 variations at 22.5° steps from front to sides, and have the result be a mix of the two nearest with some parallax stuff.
@Stylpe 2 months ago
To solve the lighting issue, would it be possible to separately perform the blur step on the normals buffer before lighting is computed, and also rotate the sampled normals? I don't know the Godot pipeline at all, so I'm just thinking out loud.
@sphynx_owner8224 2 months ago
Maybe there is an approach that would yield a better result than the current one using what you are suggesting.
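If one were to try rotating the sampled normals as suggested, the per-sample rotation itself is straightforward. Below is a sketch in Godot 4's shading language using Rodrigues' rotation formula; the instance uniforms (rotation_axis, delta_angle) are hypothetical stand-ins for whatever data the technique would actually supply, and the fragment function only illustrates the math rather than a working integration with the blur.

```gdshader
shader_type spatial;

// Hypothetical per-object data a script would set each frame: a unit-length
// worldspace rotation axis and the angle rotated since the previous frame.
instance uniform vec3 rotation_axis = vec3(0.0, 1.0, 0.0);
instance uniform float delta_angle = 0.0;

// Rodrigues' rotation formula: rotate v around the unit-length axis by angle (radians).
vec3 rotate_about_axis(vec3 v, vec3 axis, float angle) {
	float c = cos(angle);
	float s = sin(angle);
	return v * c + cross(axis, v) * s + axis * dot(axis, v) * (1.0 - c);
}

void fragment() {
	// As a stand-in for blurring the normals buffer, rotate this pixel's
	// view-space normal halfway through the frame's rotation before lighting sees it.
	vec3 axis_view = normalize((VIEW_MATRIX * vec4(rotation_axis, 0.0)).xyz);
	NORMAL = normalize(rotate_about_axis(NORMAL, axis_view, 0.5 * delta_angle));
}
```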