Thanks for the amazing tutorial! May I ask, is it possible to make the camera change direction at a specific point, and how can I tell which tick I am at in a given moment?
Just a heads-up: it looks like you didn't link the texture baking video in your description, nor any other tutorials, which presumably you meant to, because you have a "Tutorials" subheading in the description with nothing under it.
Damn, I had to do a project in graphics programming and did not hear about anything said here. I even went (probably too quickly) through books and did my project, and only got here because I was curious about gimbal lock. This makes me afraid of all the things I miss trying to keep up with everybody instead of properly learning. Great video anyway.
I appreciate your feedback. There are a variety of ways to move around a 3D space in ThreeJS. For this video I tried to reduce the amount of code required to understand how importing the model works. ThreeJS has some interactive examples available that demonstrate ways to move around a scene. I have also linked a separate video below where the camera moves around this 3D scene along a path. Scroll along spline path in ThreeJS: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-58k6PLYnOuI.htmlsi=9GM3hwXM9PzfO_NR ThreeJS example: threejs.org/examples/?q=controls#misc_controls_pointerlock
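As a rough illustration of the idea behind moving a camera along a path (this is not the code from the linked video; the function name and waypoints are hypothetical, and a real ThreeJS scene would typically use a smooth curve such as THREE.CatmullRomCurve3 instead of straight segments), a position can be computed from a single parameter t between 0 and 1:

```javascript
// Hypothetical sketch: sample a position along a polyline path at parameter t (0..1).
// Waypoints are [x, y, z] arrays; consecutive waypoints are linearly interpolated.
function pointAlongPath(waypoints, t) {
  const segments = waypoints.length - 1;
  const clamped = Math.min(Math.max(t, 0), 1);
  const scaled = clamped * segments;                 // which segment, and how far into it
  const i = Math.min(Math.floor(scaled), segments - 1);
  const local = scaled - i;                          // 0..1 within segment i
  const a = waypoints[i];
  const b = waypoints[i + 1];
  return a.map((av, axis) => av + (b[axis] - av) * local); // linear interpolation
}

// Usage idea: camera.position could be set from pointAlongPath(path, scrollProgress).
```

Driving t from scroll input, rather than from time, is what turns this into a "scroll along a path" effect.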
When exporting from Blender as a glb/gltf, on the window where you name your exported file there should be a drop-down on the right marked "Include". In this drop-down, "Punctual lights" must be selected, with a checkmark in the box. After confirming this, I would attempt to view the exported scene on www.threeJS.org/editor. If the lights do not appear in the inspector on threeJS.org/editor, there may be an incompatibility between how the lights are written to the file by Blender and how they are read by ThreeJS. In that case the lights may need to be recreated. This can sometimes be done on threeJS.org/editor and then exported into a new object.
I believe the Raycaster from ThreeJS, loaded from node_modules\three\src\core, is being used in the file RayCastHelper.js on the line: const raycaster = new THREE.Raycaster(); There may be more raycaster functionality in that file that could be beneficial. three\src\core\Raycaster.js: github.com/mrdoob/three.js/blob/dev/src/core/Raycaster.js RayCastHelper.js: github.com/ClassOutside/Clicking_Objects_ThreeJS/blob/main/src/helpers/RayCastHelper.js
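As a conceptual illustration of the kind of test a raycaster performs internally (this is not ThreeJS's implementation; THREE.Raycaster handles intersection math for you), a ray can be checked against a sphere with basic vector algebra:

```javascript
// Simplified sketch of one intersection test a raycaster performs:
// does a ray starting at `origin` travelling along unit vector `dir` hit a sphere?
// origin, dir, and center are [x, y, z] arrays; dir is assumed normalized.
function rayHitsSphere(origin, dir, center, radius) {
  const oc = [origin[0] - center[0], origin[1] - center[1], origin[2] - center[2]];
  const b = oc[0] * dir[0] + oc[1] * dir[1] + oc[2] * dir[2];        // projection of oc onto dir
  const c = oc[0] ** 2 + oc[1] ** 2 + oc[2] ** 2 - radius * radius;  // squared distance minus r^2
  // Quadratic t^2 + 2bt + c = 0 must have a real, non-negative root:
  return b * b - c >= 0 && (c <= 0 || b < 0);
}
```

A real raycaster repeats tests like this (against triangles and bounding volumes) for every candidate object and returns the nearest hit.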
I appreciate you sharing your experience. I have not personally identified any viruses on my machine related to this software. Antivirus software sometimes operates on the idea of "better safe than sorry", and this can lead to safe programs being flagged as potentially malicious. While I believe the software I downloaded is safe, I cannot personally guarantee that it, or your individual download of it, is safe. If you choose not to use this software, I hope you are able to find an alternative route to the solution you are searching for.
@@classoutside I wasn't talking about the GitHub link itself, but about the link on that page where you can download the software, specifically the download link for the Windows version.
Thank you for clarifying; I must have misread your comment. If the link to GitHub.com were being stopped by your antivirus, that would be surprising: GitHub is a commonly used site owned by Microsoft and should be accessible. If it is the download link on the GitHub page, I am less familiar with that link. According to the GitHub description, you can compile the project yourself with some guidance provided in the README, so you could try building the repository yourself if you would like access to the software. Note that this video is intended to help fix holes in meshes that result from using the Instant Meshes program, not holes that already exist in the mesh in Blender. Is this the solution you are looking for?
Thank you for sharing your experience. The features displayed in this video should be implementable in React Three Fiber, though the repository may need some adjustments to properly conform.
Hi there, thank you for posting these tutorials! I have followed three of them but am now getting stuck. My armature does not follow the same moves as the one I downloaded; it only moves around in the T-pose like a scarecrow dancing. I can see the armature I downloaded dancing, and it looks as if my armature wants to follow, but nothing except the hands moves and the rest remains in the T-pose. Any idea what I may have gotten wrong?
Hello. If the downloaded armature appears to move and the mesh does not, it may require some troubleshooting to solve. One thing you can try is setting the rest pose; I have another video linked below that describes this process. Another thing worth checking is the weights. When you apply the armature to the mesh with "Parent with Automatic Weights", the system attempts to weigh how much influence the bones' movements should have on the vertices of the mesh. It is possible that the automatic weights did not synchronize the bones with the intended vertices. If this is the case, you can try making modifications in Weight Paint mode, found near Object Mode and Edit Mode in the top-left drop-down of the 3D Viewport. I hope these suggestions may help you towards a solution! Set the rest position: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-suEtfwP2Dwg.html
@@classoutside Thanks for taking the time to reply. I ended up fixing the issue by uploading my model to Mixamo, then downloading the armature with the animation set to my model. Then I imported the armature as you did in your video, and that did the trick. I am not sure why, but downloading the standard Mixamo armature and animation would not work; it only worked once Mixamo had mapped its animations onto my armature.
It is good to hear that you found a solution. I believe Mixamo attempts to set bone placements and weights with their software, similar to what can be done in Blender. Perhaps their solution worked better for your mesh in this case compared to Blender's.
Well, actually, normalized quaternions may be used to represent spacetime, relativistic calculations, and transforms between reference frames, where w is then the time component of spacetime. But this is usually not a good way to do those calculations: almost certainly too slow AND too low precision in the needed context, compared to simpler functions that calculate the same thing without relying on quaternions or complex numbers.
For almost all rotations, Euler rotation will not return the smallest possible angle. For small angles, quaternions suffer too great a precision loss. The matrix of an axis-angle rotation is identical to quaternions, but the axis-angle form needs more memory and multiplications, usually not worth the gained precision unless you do fewer than two rotations in a chain sequence and precision really matters. The Second Life wiki has a function that, reasonably efficiently, first calculates the angle and then only does Euler rotation if the angle is small enough that the precision of the result matters more than rotating around a single axis.
I appreciate hearing your input. Taking the shortest path, or smallest possible angle, when rotating is often achieved with spherical linear interpolation (SLERP), which is more common with quaternions than with Euler rotation. As precision needs grow, it can be important to ensure calculations take those needs into account. I believe that for 3D art and animation, quaternions or Euler values may be precise enough for most applications; I would have to research further to identify where either begins to lose precision. Computational efficiency can be very important. I was not aware of the Second Life function, thank you for sharing.
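For reference, here is a minimal SLERP sketch in plain JavaScript, with quaternions written as [x, y, z, w] arrays. ThreeJS provides this built in via THREE.Quaternion's slerp method, so this is only to illustrate the shortest-arc behavior discussed above:

```javascript
// Minimal spherical linear interpolation between unit quaternions qa and qb.
function slerp(qa, qb, t) {
  let dot = qa[0] * qb[0] + qa[1] * qb[1] + qa[2] * qb[2] + qa[3] * qb[3];
  let b = qb;
  if (dot < 0) {            // negate one endpoint to take the shorter arc
    b = qb.map(v => -v);
    dot = -dot;
  }
  if (dot > 0.9995) {       // nearly parallel: fall back to normalized lerp
    const out = qa.map((v, i) => v + t * (b[i] - v));
    const len = Math.hypot(out[0], out[1], out[2], out[3]);
    return out.map(v => v / len);
  }
  const theta = Math.acos(dot);          // angle between the two quaternions
  const s = Math.sin(theta);
  const w0 = Math.sin((1 - t) * theta) / s;
  const w1 = Math.sin(t * theta) / s;
  return qa.map((v, i) => w0 * v + w1 * b[i]);
}
```

The nearly-parallel fallback is where the precision concern mentioned above shows up: sin(theta) approaches zero for very small angles, so implementations switch to a normalized linear interpolation there.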
When animating in Unreal you may be able to mark keyframes agnostic of rotation type (quaternion vs Euler) and instead rotate around the world axis with other controls. I would recommend trying this first and monitoring the output. If you were to use quaternion values to represent the keyframes in the rotation, you may need to use 2-3 keyframes. While under some circumstances it can be argued that quaternions can represent a 360-degree range of values, a single quaternion cannot distinguish a full 0-to-360 sweep from no rotation at all, so intermediate keyframes may be needed to traverse a full turn. I hope this helps.
Thank you for sharing. Effectively communicating concepts with others can be challenging. Pronunciation rules can be relaxed in some settings and still lead to effectively discussing the same topics. I have met some people who find it very challenging to understand speech when pronunciation is improper. This can be especially true when trying to listen to and understand a language that is new to someone.
I appreciate your compliment and feedback. I will consider this for future videos and try to find ways to make the code being discussed more visually clear for those viewing.
What direction counts as forward can depend on interpretation, and possibly on how the user chooses to scroll. In the mainView.js file, the camera.lookAt value determines the initial direction the camera is pointing, before any scroll occurs. It is currently set to point towards the final point in the path used (0.99). To face the other direction you may try setting the value to (0.01). Depending on how you intend the user to scroll, adjustments may also need to be made to PositionAlongPathMethods.js mainView.js: github.com/ClassOutside/ThreeJS_Camera_Follow_Path/blob/main/src/views/mainView.js positionAlongPathMethod.js: github.com/ClassOutside/ThreeJS_Camera_Follow_Path/blob/main/src/positionAlongPathTools/PositionAlongPathMethods.js
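As a hypothetical sketch of the scrolling side of this (the function name and numbers are illustrative, not the repository's actual code), reversing the travel direction can be thought of as flipping the sign of the parameter update driven by scroll input:

```javascript
// Hypothetical helper: advance a path parameter t (0..1) from scroll input.
// Setting reverse to true makes the camera travel the path the other way.
function advanceAlongPath(t, scrollDelta, speed = 0.001, reverse = false) {
  const dir = reverse ? -1 : 1;
  let next = t + dir * scrollDelta * speed;
  next = ((next % 1) + 1) % 1;   // wrap around so a closed path loops
  return next;
}
```

Pairing a reversed parameter update with the flipped lookAt target (0.99 vs 0.01) keeps the camera facing the direction of travel.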
Hello. The armature appearing differently in different modes can occur for a variety of reasons. Sometimes troubleshooting can be destructive, so if you have made animations or set up desired poses, it may be best to make a copy of your blend file first. One thing you could try is applying any transforms (movements, rotations, scales) that have occurred to the armature. To do this, in Object Mode select the armature, press Ctrl + A, and choose All Transforms. Another option worth trying is to set the rest pose. I will share a link to another video where I describe how to perform this process: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-suEtfwP2Dwg.htmlsi=MbhkcpO0BmTaVkYh
Hi there, I’ve created a 3D model of a steel cupboard using Blender and exported it as .glb using the Cycles engine, to use it in r3f. The problem is that whenever I try to add an AmbientLight or HemisphereLight the model is not lit up; it is lit only if I use a DirectionalLight. What could have possibly gone wrong? Any tips or tricks to fix this?
Hello. Lighting across different rendering engines can be challenging. Blender uses the Cycles engine to calculate what the scene should look like. Other 3D software, like game engines or React Three Fiber (r3f), may use their own rendering engines to calculate how the scene should look. For example, r3f uses a WebGL renderer, which calculates differently than Blender's Cycles. This often results in the scene appearing visually different, including strange lighting.

To troubleshoot this, I would start by taking a look at the material in Blender. Texture maps like specular, metallic, roughness, and normal can all affect how light interacts with an object. If you have any of these applied, one or more of them may cause the light to appear strange in r3f. Complex materials exported from Blender are not always properly read by other programs.

One way to see how your model and its material may appear when rendered by WebGL is to use the website threejs.org/editor/. On this site you can load your 3D model along with its materials, add lights, and see what the object will look like under different conditions. You can also inspect the materials tab and review whether all of your material maps have been loaded. There may also be values you can adjust, along with adding your desired lights, to test how the material will react and display in WebGL. I hope this explanation may help you towards finding a solution that fits your needs.
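To illustrate why the metallic map in particular can matter here: in typical WebGL-style shading models, ambient light contributes only through the material's diffuse response, and metallic surfaces have almost no diffuse response, so a fully metallic material can appear to ignore an ambient light while still catching a directional light through its specular highlight. The sketch below is a deliberately crude simplification for illustration, not r3f's or ThreeJS's actual shader:

```javascript
// Crude shading sketch: brightness = ambient + diffuse + specular terms.
// normal and lightDir are normalized [x, y, z] arrays; metallic is 0..1.
function shade(normal, lightDir, ambientIntensity, directionalIntensity, metallic) {
  const ndotl = Math.max(
    normal[0] * lightDir[0] + normal[1] * lightDir[1] + normal[2] * lightDir[2], 0);
  const diffuseResponse = 1 - metallic;                    // metal ~kills diffuse reflection
  const ambient = ambientIntensity * diffuseResponse;      // ambient only reaches the diffuse term
  const diffuse = directionalIntensity * diffuseResponse * ndotl;
  const specular = directionalIntensity * Math.pow(ndotl, 8) * (0.04 + 0.96 * metallic);
  return ambient + diffuse + specular;
}
```

With metallic at 1, the ambient term drops to zero while the directional specular term remains, which matches the symptom described: only the DirectionalLight appears to light the model. Lowering the metallic value in the material is therefore one thing worth testing.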
This was the video that finally made me "get it" for gimbal locks. Maybe it's just the fact that I have watched several videos on gimbal lock but walked away with a half-baked understanding of it. I think what made it add up was the way you put it: the gimbals have to make a weird detour to get to an angle that requires rotation in that "lost" degree of freedom, and when the gimbal lock occurs any rotation applied to that axis goes to the "stable platform", hence losing the prized orientation it contained.
I am glad this video could be supportive in growing your understanding of gimbal locks! That detour concept can be tricky to comprehend at first, in combination with that "lost" degree of freedom. I appreciate hearing your experience, thank you for sharing it! :)
Hello. I appreciate you pointing this out. I believe the problem may stem from the terminal command 'npm run start-dev'. 'start-dev' is a script found in the package.json file and represents a series of commands. It was written in a way that is readable by Windows machines, but not macOS or Unix machines. I have added a new script, 'start-dev-unix', which should run on macOS or Unix systems. To use it, you would need to either pull the change or download the updated repository, then run 'npm run start-dev-unix' in a terminal window. I hope this may help. Also, thank you for pointing out that the spline always appears closed. It would take some further investigating and testing to confirm why, but I believe at least part of it may be due to this curve.closed value being hardcoded to true: github.com/ClassOutside/ThreeJS_Camera_Follow_Path/blob/main/src/curveTools/CurveMethods.js
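As a hypothetical illustration of the kind of difference involved (the repository's actual script contents may differ), Windows-style scripts often set environment variables with `set`, which Unix shells do not understand, while the Unix form prefixes the variable directly:

```json
{
  "scripts": {
    "start-dev": "set NODE_ENV=development && webpack serve",
    "start-dev-unix": "NODE_ENV=development webpack serve"
  }
}
```

Tools like the cross-env package exist to write a single script that works on both platforms, which can avoid maintaining two variants.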
Hello. If you are trying to add an armature to a mesh, the process may be different from what is shown in this video; this video is about removing an armature from a mesh. One thing I try when the mesh does not follow the bones being moved is to check which mode I am in. In the top left of the 3D Viewport there should be a drop-down menu with options like "Object Mode", "Edit Mode", and "Pose Mode". To see the effect of a bone's movement on the mesh, first select a bone, or the armature, then switch to "Pose Mode" in that drop-down. If the armature is applied effectively, the mesh should move with the bone(s).
You saved me a lot of pain. I was trying to arrange the UVs manually, which took me so many hours and still didn't turn out well. This was the perfect solution that I should have just researched earlier, d'oh. I'm new to Blender and I have no idea how texture baking works.
Hey, hope all is well. I’ve become a fan of your work, especially the use of 3D. I’m currently working on a project that blends 3D with 2D characters. I like to do my character animation in 3D as a reference; this comes in handy with action scenes and complex camera movements too. My question is: can you animate multiple animations in Blender using the Action Editor? Tutorials I’m watching show this as a possibility. It looks like each animation would be a separate action, then edited together in the NLA editor. Would all these animations have to be done with the character in place, or does it matter? Hope this makes sense. Any feedback would be amazing, as I’m so stuck. Thank you and take care.
Hello. I am glad to hear we both share an interest in 3D, and I am happy my work has left a positive impression on you. Blender can support multiple animations, and these can be set up in the Action Editor. The Non-Linear Animation (NLA) editor should be helpful for blending and sequencing these animations together. For your question on whether the character should be in place: when creating an animation, it is common to begin from a static, common pose like a T-pose or A-pose, with new animations created relative to that pose and then added. It sounds like you may be on the right path. I hope this may assist you towards your solution and project!
For the normal map, and other maps, to be copied, you could try setting the diffuse node to the normal map file and then following the same process. The maps (normal, roughness, diffuse, etc.) should all be image files, like a PNG or JPEG. In the Shader Editor view you should be able to change the image connected to the diffuse or color part of the mesh. If you change this to another map you would like, following the steps of this video should copy it to the lower-poly mesh as the color map. Then, in the Image Editor, you should be able to view the image and save it to a file. Afterwards you could repeat this process until all maps are saved. Finally, in the lower-poly model's Shader Editor view, you could import the maps into image texture nodes and connect them to your material. I hope this may help!
Hello! The GLTFLoader in ThreeJS should be capable of importing either GLB or GLTF files. There are other loaders available in ThreeJS if a different format suits your needs. I believe glTF is a common format, which is why I chose to use it.
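As a small hypothetical sketch (the helper function and its mapping are illustrative, though GLTFLoader, FBXLoader, and OBJLoader are real ThreeJS loader classes), the appropriate loader can be chosen from the file extension; GLTFLoader covers both the binary .glb and the JSON-based .gltf variants:

```javascript
// Hypothetical helper: map a model file extension to a ThreeJS loader class name.
function loaderNameFor(path) {
  const ext = path.split('.').pop().toLowerCase();
  if (ext === 'glb' || ext === 'gltf') return 'GLTFLoader'; // one loader handles both
  if (ext === 'fbx') return 'FBXLoader';
  if (ext === 'obj') return 'OBJLoader';
  return null; // unknown format
}
```

In a real project the string would instead select an actual loader instance imported from three's examples/jsm/loaders directory.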