I teach Virtual Production and performance capture at Drexel University.
Before teaching, I worked in feature film VFX on matchmoving (SynthEyes, PFTrack, MatchMover Pro, etc.), compositing (Nuke, AE, Shake), pipeline programming & scripting (C, C++, Python), 3D (Maya, C4D), photogrammetry, and motion capture (Vicon, OptiTrack, MotionBuilder). After Epic's Unreal Engine Fellowships in 2021 and 2023, most of what I do now involves Unreal Engine for in-camera VFX, nDisplay, live greenscreen VP, and of course mocap/MetaHumans. That doesn't mean that I've left my MotionBuilder / Maya / Nuke / Houdini roots behind. :-)
I'm particularly passionate about the power and potential these technologies have for communications and interaction in general, and I hope this channel helps others get up to speed with these tools more easily and quickly.
Excellent video going through the steps in detail. I have 8 characters. This worked fine on 6 of them, but the same steps didn't work on the other 2. Somehow I changed something that caused all of them to lose facial motion, even though all have FaceValue set. Another character's body froze up when setting up the face. Diving into debugging it. Is there any distinction between the MetaHumans (sizes, sex) that affects the head/body differently? They all use the same 'base' and the same 'head' skeleton? And each blueprint is per character?
Hey Scott! Each MetaHuman should have its own individual blueprint that brings everything together, but there are some shared Anim Blueprints within that structure that control how animation data from different sources is combined to produce the final result. I'm traveling at the moment but can plan a Zoom call to help on Thursday if that's not too late for you. (You can ping me on LinkedIn if you'd like to plan a Zoom.)
The install and license acceptance for Xcode is required for Unreal to run on a Mac, but then you can go into Unreal Engine->Preferences and change the "Source Code Editor" parameter in the "General: Source Code" section to choose Visual Studio Code instead.
Great video series, thanks! Can you help with an issue, please? The materials I made in the new Material Designer look great in the viewport, but when rendering with the Movie Render Queue all materials are missing and render black, with the following error message: "Detected 2 read-only materials with missing Niagara usage flag required to work properly with cloner." In other posts I've read to open the materials in the main UE material editor and make sure that "Use with Niagara mesh particles" is checked in the material's Niagara usage settings... but mine are already checked!? Any advice appreciated. Many thanks!
Hi sir, I'm working on a simple animation project in Unreal Engine 5.4. It's a 10-15 minute documentary-style video for RU-vid in 1080p, featuring MetaHumans and city-life scenes. I'm sourcing assets from Sketchfab and will do color grading in DaVinci Resolve 19. Since I'm not creating games, I'm focused on keeping Unreal's footprint small and optimizing performance for a smoother viewport and faster rendering. Could you make a video covering all possible ways to optimize Unreal Engine for animation projects like mine? It would be incredibly helpful to learn how to manage MetaHumans, streamline rendering, and keep the project lightweight. Thanks so much in advance.
I think this is a "glitch" in this still-experimental toolset. I work around it by selecting the text once I have it the way I like it, then going to the Actor menu and using the "Convert to Static Mesh" command. This makes a static mesh asset with all the selected materials applied, which can take the place of the AvaText actor for saving & rendering.
Here’s how to render from a Rundown playlist… UE5.4 Motion Design Rundown: Render Selected Pages ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-v9ZdYx_Mlp0.html
Hmm... this is confusing. The green axis arrow is pointing to the left (horizontal). He makes the rectangle wider (horizontal) by upping the X value. What? Then a few moments later he states that "x is our front to back in motion graphics." I've never heard anybody say that X would ever be front-to-back (depth), and it doesn't make sense. In 2D animation, Y is up, X is left-right, and Z would be depth. I'm confused :D
When Unreal was first launched, it used a "floor plan" arrangement, where X & Y were along the floor and Z was up. At some point in its history, the default camera position moved to facing down the X axis... so here we are now, adding 2D motion graphics with the default camera facing down the X axis, so left/right is the Y axis and up/down is Z 😛 (but we'll still label the local values for the shapes as X & Y). Have fun! 🤣
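If it helps, here's a tiny sketch of that mapping as plain Python. The helper name is my own invention (not a UE API); it just shows how a 2D "design" coordinate (x right, y up) lands on Unreal's world axes when the camera looks down +X:

```python
# Hypothetical illustration, not an Unreal API: Unreal world axes are
# X = forward/back (depth), Y = left/right, Z = up/down when the
# default camera faces down the +X axis.
def screen_to_unreal(screen_x, screen_y, depth=0.0):
    """Map a 2D motion-graphics coordinate onto Unreal world axes."""
    return {
        "X": depth,     # toward/away from the camera
        "Y": screen_x,  # left/right on screen
        "Z": screen_y,  # up/down on screen
    }

print(screen_to_unreal(100, 50))
# → {'X': 0.0, 'Y': 100, 'Z': 50}
```

So "making the rectangle wider" in screen terms really is a Y-axis change in world space, even though the shape's local property is still labeled X.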
Great videos, thank you! This takes me a huge step forward. I just tried a bit for myself and I get a strange problem. Maybe someone can point me in the right direction. I animated a Text actor which is masked with a rectangle and a Mask modifier. In my scene viewport everything works just fine, but as soon as I add the scene to my rundown the text isn't visible at all, neither in preview nor in broadcast. Any ideas?
Technically, yes... I haven't tried it though. I think OffWorldLive's Spout plugin (or maybe NDI?) could be used. You would have to set up a Motion Design output channel that goes to a RenderTarget asset, then have OWL Spout or NDI relay that render target out of Unreal for other tools (including OBS) to pick up.
Thanks for making this tutorial available. I've published here on RU-vid my version of the project: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-R7dTOL51aqg.html
Hey PixelProf! Thank you for your amazing tutorials on this awesome tool! Are there any online communities (Discord or otherwise) where you find regular conversation about the live video aspects of Motion Design? I've been having a hard time finding people who are looking into this and testing/using it regularly.
I've been using Unreal on my M1 8 GB Mac mini for over a year. I wish I had more memory, but it doesn't prevent me from having a web browser, Photoshop, and several social media apps open at the same time. It's funny that the Unity game engine, which is specifically ported for the M1 chip, is so much slower than the other apps on my Mac. I suspect I have the least viable setup in the Apple line-up that supports Unreal.
I was going to get Unreal and use this/the Fortnite editor on my Mac, but so far my research has been telling me that I have to have a PC to use Unreal. Can I use Unreal WITHOUT having to download any third-party converter like this?
@@kaimitchell2954 It's not a yes, and it's not a no, but a maybe. For example, on the Epic Games UE Marketplace, there are some 2950 plugins offered. Select MacOS and the number drops to 1333. This is a recurring theme when you operate a Mac. My tiny M1 Mac mini runs circles around the $2,000 PC tower my roommate has. I can crank out an AI image in seconds; his takes minutes. But I can't play the latest game because it wasn't ported to MacOS. Earlier today, I discovered Niagara Particles for UI. I can't use it; it's only compiled for Windows. Also, I'm not exactly sure what you meant by third-party converter. If you mean a system emulator like SoftWindows, they sort of work, but they slow everything down. Since there is a Mac version of Unreal, there is certainly no direct advantage. I hope this helps.
@@kaimitchell2954 UEFN is currently Windows only as far as I know. You can install the full Unreal Engine on a Mac as shown in this video to build custom apps/games for MacOS and iOS, but this version can’t be used to build content into the Fortnite ecosystem.
We have a similar setup in our lab and I need to calibrate the Vicon Vantage cameras again. What should I select in the Set Volume Origin step, the same Active Wand? Please reply.
Yup. In the Calibration tab's Origin section, click the wrench to see a dropdown control with options for which calibration object to use. Should be able to select "Active Wand v2" as an option.
@@designmechanical If you look at where each LED light is on the wand: if there is one LED per light position, that's probably the original. The one we have has two LEDs per light position; that's the V2. Also, the back of the V2 has a sticker that says ACTIVE WAND V2 on it. There's an image of this sticker you can compare against on page two of the manual: help.vicon.com/download/attachments/12386675/Active%20Wand%202.0%20User%20Guide%20Jan%202024.pdf
Hello, thank you very much for this tutorial. I have a question: I use Take Recorder for facial expressions. Is it possible to add a sequence animation of just the face combined with a body animation? I can only find tutorials that combine two body animations. Thanks in advance!
Thanks for the effort! My issue is a little more complicated. I have a sequence that is playing on loop. Every time the pawn passes through the collision box, I want to stop the looping animation and play a different looping animation. I want one of 4 random looping sequences to trigger each time you overlap the box.
@kitworkz... This should be doable by inserting a "Stop" node at the start of the execution flow (to stop playback of the existing loop), then using a "MultiGate" node to randomize the playback of the next sequence. (One MultiGate output per sequence; set the MultiGate to randomize.)
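Since Blueprints are visual, here's the same logic sketched as plain Python so the flow is easy to follow. The sequence names are placeholders, not real assets; the comments map each step back to the Blueprint nodes:

```python
import random

# Placeholder names standing in for your 4 looping Level Sequences.
SEQUENCES = ["LoopA", "LoopB", "LoopC", "LoopD"]
current = None  # whichever loop is playing right now


def on_overlap():
    """Fires each time the pawn enters the collision box."""
    global current
    # "Stop" node: halt playback of the existing loop first.
    current = None
    # Randomized "MultiGate": one output per sequence, picked at random,
    # each wired to Play on a different Level Sequence.
    current = random.choice(SEQUENCES)
    return current


print(on_overlap())  # one of LoopA..LoopD, chosen at random
```

The key ordering detail is the same in the Blueprint: the Stop must sit before the MultiGate in the execution flow, otherwise the old loop keeps playing underneath the new one.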
@@PixelProf Awesome, thanks! I've tried a couple of Stop nodes, but I'll give this a shot! If this works you are the true GOAT. I haven't found it anywhere!
I've only ever used this for visuals and haven't tried anything for audio. I'll check into it to confirm, but I think this is intended as a visual tool, with the expectation that audio would be handled by another connection to the meeting.
@@hannerikruger7216 Correct, all audio work is done outside of Unreal. For example, when I'm using this for an on-site display to bring Zoom engagement to the "in-person" folks for a hybrid event, another computer is connected to Zoom to get the audio into the venue's audio pipeline. Likewise, if the plugin is being used to feed graphics into the Zoom meeting, audio is delivered to the meeting through a different client connection. (The Zoom output can be pinned or screen-shared to be the primary visual without interrupting audio from the other connection.)
@@PixelProf Yea I just saw this morning! Personally I have found creating my own mocap is a more creatively empowering endeavor for me but these free mocaps are definitely useful. I'm glad to have found your channel Professor Pixel! I've been picking up quite a few things from you. Thanks for all your hard work ❤️
Thank you for teaching. I want to ask: I have bad eyesight and I use a two-monitor setup for Blender and C4D. Can I also use two monitors for Unreal? I want to keep the whole workspace on monitor 1 and the tools on monitor 2. Please, can you show in a video how I do this? Sorry, my English is bad.
Is there any way to make this blueprint reusable? I'm making a collider version of this and I want to change the sequence on the duplicated collider, so that the same sequence doesn't play every time.
Just two more SIGGRAPH interviews to edit. Will be back to Unreal this week. Sorry if this is “noise” in the Unreal content. Anything in particular you’d like to see a tutorial covering?
@@PixelProf Hi Nick, I'm more interested in the pipeline process: Vicon >> Unreal, Live Link, nDisplay setup, and more information on the camera lens calibration (lens distortion, color correction). Also, is the calibration for a prime lens? What happens with a zoom lens? Does Vicon take into consideration the distortion and nodal-point shift of the lens during a focus or zoom change? Sorry if I'm asking a lot.
Hey! These videos are awesome, I've been watching them all. Do you know if it's possible to play a Motion Design sequence using a blueprint? This would work wonders for me! Thanks :)