I know this is a late comment, but your model is massive, probably around 600 to 1000 polygons. I worked with the PS1 way back in the day. Most hero models (that's the character the player plays) had 200 to 600 polygons; Crash Bandicoot had 612 and Lara Croft just 260. This was because of the way the PS1 executed instructions. The system specs say 10,000 polygons per frame, but the actual number was more like 3,000 because of the system caches and RAM. It wasn't just textures and polygons we were sending, it was data sets on rasters, to tell the PS1 how many polygons each model had, where it was, and how far away it was from the player character. The PS1 didn't have a z-buffer, so we had to program our own, and it also lacked floating-point math, hence the wobbly graphics. All those instructions had to be executed before anything was rendered. Notice the specs say *in hardware*: that means if the hardware was doing nothing but making polygons. Clever marketing trick from Sony.

All the system information had to be written in machine code, just like any computer ever, but the game code was written in C. The PS1 stores data from the disc into the caches on the board, then the CPU draws from those to organise and execute instruction sets. To draw frames, the PS1 had a frame buffer that the CPU would call on to draw the polygons on the screen. The buffer was limited, however, and writing the code to call on those instructions, textures and game code was a nightmare. It was fast in hardware but slow in software. We got around a lot of these problems by reducing the amount called on the screen at a time, freeing space in the hardware to make the games run at a smooth 30fps. Making low-poly models leaves more polys for the environment, sound and effects, so Lara Croft was low poly to make the environments more traversable, whereas Crash Bandicoot needed to be higher in polygons because he took up much of the screen.

Blender can't replicate the effects of PS1 graphics because it was hardware limitations that made the games appear that way. A good example is emulated games: same game, more likely dumped straight from the disc, but the effects of z-buffer wobble and dithering are not present. I suppose if there is a way to tell Blender to randomly skip some frames in a render, you might get closer to the original. We dumped loads of frames when making in-game cutscenes, and you can't even tell. Resident Evil has tons of them, which is weird because all the backgrounds are prerendered images!
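The wobble described above comes down to the lack of sub-pixel precision: projected vertex positions get quantized to whole screen pixels, so a smoothly moving vertex sits still and then pops a full pixel at once. A minimal sketch of that idea (my own illustration in Python, not anything from the PS1 SDK; screen size and rounding scheme are assumptions):

```python
def snap_vertex(x, y, screen_w=320, screen_h=240):
    """Quantize a projected vertex to integer screen coordinates,
    mimicking the PS1's lack of sub-pixel precision."""
    return (round(x), round(y))

# A vertex sliding smoothly from x=10.2 to x=10.6 on screen...
positions = [10.2, 10.4, 10.6]
snapped = [snap_vertex(x, 50.0)[0] for x in positions]
print(snapped)  # the vertex holds still, then jumps a whole pixel: [10, 10, 11]
```

People replicating the "PS1 look" in modern engines usually do exactly this snapping in a vertex shader, which is why emulators running at native resolution still wobble while Blender renders do not.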
Dude, this is INSANE!! I'm still new to the whole 3D modeling thing, so I'm worried this might be out of my league for now, but I'll def be saving and rewatching repeatedly over the coming months!! My goal is to make an animated Broadway-style musical, which requires some *intense and intimate* facial emotions, but I'm scared of 3D face rigs lmao
This is going to be a bad question, but when making models, are we relying on the vertex lighting and colours to shade the model, as opposed to selecting shade flat/shade smooth?
Hey Sickly! First off, I want to express my gratitude for your tutorials. I think, as many others have pointed out, they're very well structured and it's clear you have real intentions to actually *teach* your viewers, not just brush over the steps for a quick video to get views. That said, I have to ask: is there a new series in the works? You replied about a year ago to another comment that you messed this one up and that you had a better system. If you can't or won't make a new series, I completely understand, but, in that case, are there any other rigging resources you would recommend to beginners? I'm trying to rig my own characters and I'm really struggling to find resources with the same level of quality as your tutorials. There're plenty of tutorials out there but most of them don't go into detail as to *why* they're telling you to do anything. Again, I really appreciate the work you put into your videos, and even if you feel you messed up, I know you have the best intentions.
I have had a full character tutorial series in the works for a while, covering everything from modeling up to rigging. Unfortunately, I've been finding myself burnt out from working on it recently, despite only having a little more left to do before I've got all the footage I need. I do have intentions to make this series, however I am taking a break from Blender for the time being so I can come back refreshed.
I mean, you could, but since bullets typically travel in a straight line and don't really swing, you'd be better off just having a shape key on a trail that's part of the bullet object, if you want something simple.
My right eye goes the opposite direction it should on the X axis, which makes my goober look drunk! Any ideas on where I might've gone wrong? Everything else seems to be working great so far, thank you so much for this 😭
I'm working on a geometry node setup that adds as close to a real lighting system as I could make, where the end user doesn't have to touch geometry nodes at all: just apply and set up the modifier, then apply the shader. I hope to release it soon. Edit: it comes with tools like box fill, where you just grab, rotate and scale a box into place and set a color, along with point and spot lights and whatever else I'm still coming up with.
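For anyone wondering what "baking" a light into vertex colors amounts to under the hood, here's a rough sketch of the kind of math such a modifier would do (my own guess in Python, not the actual node setup; the linear falloff and radius parameter are assumptions):

```python
import math

def point_light_weight(vertex, light_pos, radius):
    """Linear distance falloff: 1.0 at the light, fading to 0.0 at the radius edge."""
    d = math.dist(vertex, light_pos)
    return max(0.0, 1.0 - d / radius)

# Weight each vertex by the light's falloff; the weight would then
# scale that vertex's painted color.
verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (5.0, 0.0, 0.0)]
light = (0.0, 0.0, 0.0)
weights = [point_light_weight(v, light, radius=2.0) for v in verts]
print(weights)  # [1.0, 0.5, 0.0]
```

Since the result lives in vertex colors, the "lighting" costs nothing at render time, which is the whole appeal of this style of setup.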
Ah yes, a random and very specific short that I have no knowledge about has yet again showed up in my recommendations. I didn't understand a single word, nor the purpose of it. Cool.
@@DenerWitt To some extent this is inevitable, but it is much less painful to do it in Rizom, Modo, Maya, etc. Blender can be pumped up with addons, but their compatibility and support are questionable. I usually praise Blender, but UV unwrapping is obviously one of its weakest parts.
@@TheSicklyWizard In Rizom's terms, what's missing: constraints, auto-stacking of similar islands, grouping for more contextual packing, island orienting, etc.
I keep coming back to this video like it's gospel, thanks for these videos! I've got a question though: how would I export from Blender to Unity while keeping the vertex-painted shadows?
I'm not too sure how Unity works with its shading system, but I do know that the vertex colors are saved. Not sure how to access or use them in Unity's system though.
@@TheSicklyWizard You were right about vertex colors being saved. I just had to make a custom shader using Unity's Shader Graph, with a Multiply between the Vertex Color node and a Sample Texture 2D node!
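Numerically, that Multiply node just scales each channel of the sampled texel by the interpolated vertex color, so a dark painted vertex darkens the texture under it. A sketch of the per-pixel math (illustrative Python, not Unity's actual shader code):

```python
def multiply_blend(texel, vertex_color):
    """Per-channel multiply, the same operation as Shader Graph's Multiply node."""
    return tuple(t * c for t, c in zip(texel, vertex_color))

# A mid-grey painted shadow (0.5) halves the texture's brightness.
texel = (0.8, 0.6, 0.4)
shadow = (0.5, 0.5, 0.5)
print(multiply_blend(texel, shadow))  # (0.4, 0.3, 0.2)
```

White vertex color (1, 1, 1) leaves the texture untouched, which is why unpainted areas look normal with this shader.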
This video is so much more useful than I initially thought. One of the best. I took what I needed to get started, and every time I come back I learn something new that I can implement to take things to the next level.
@@theshuman100 Kind of? It does allow for proportional editing of the UVs, but in a more controlled way. You can pin selected vertices and adjust others, and the live unwrap in the UV editor will adjust proportionally to your changes when you move the pinned vertices.
What's the point of art? What's the point of hobbies? Why do we even have fun at all if it has no purpose? Monkey do funni stretch, no one needs any other reason.
I will preface by saying this is a technique I learned from an actual game dev, and they were using it because they wanted direct control over the motion trail of a sword swing in-game. Conversely, I'd argue it's actually less expensive to do it this way, as the trail is a baked effect rather than a procedurally generated particle system, which is computationally more expensive in game, at least as I understand it. The problem with this approach, however, is that it's not dynamic and requires you to actively animate the motion trail yourself. This is, perhaps, an antiquated way to approach the problem, but it is a solution that works and does have a use case.
I just tried this and it's actually really helpful. For example, Align Rotation sometimes doesn't work and you want to straighten the UVs of your mesh. You can pin the top and bottom corner points, scale on the X axis and press 0, and it aligns the UV perfectly without deforming it. It works just like the pin tool from Photoshop and it's really handy.