I'm right there with you at the end: "this is crazy... this is crazy..." That natural camera movement, real-time lighting, and the DOF... the result really is astounding! My only question was whether having characters run off the suit in real time was possible, and you're saying "it should be easy." That's all just mind-blowing! Congrats!
As soon as I unplug my controller from the USB port of my computer, I lose the connection. I do NOT have a dongle. My rig is one base station and one controller. Any support on how to fix this?
Came here looking for a step-by-step explanation of how to actually hook everything up. Some good info, but it's lacking in how to actually make any of this happen. Would love a step-by-step tutorial on how to link the Vive to Unreal, get it recognized, and then set it all up to function.
Damn. Nice one Matt... so I guess you could wear a VR headset while operating the camera so that you could also be immersed in a VR scene with the mocap characters?
Wow, you just blew my mind, love it 😍🎬📽 I see you have an Atomos monitor on your rig; imagine using an HDMI transmitter to send your PC signal to the Atomos monitor 🎬📽😍 That would be so awesome :) keep up the great work 😉
Hi buddy, amazing video! Would it work with a third controller for the Oculus Quest, doing the tracking with the same software, since the Quest is wireless?
Does anybody know if I need a Vive Pro for this, or does the regular Vive work as well? Or is it worth spending more money to get the Pro for some reason?
Great video! Thanks for posting. I'm thinking about assembling a similar setup for real-time chroma key compositing in my studio in Brazil. I would like to know whether the old HTC Vive is stable enough to work with. Thanks so much!
No. This needs room (Lighthouse) tracking. Tracking on the Rift S is done through its front cameras, which is very limited, so room tracking will be nearly impossible with the Rift S.
Hi Matt, just a simple question: do you think there is a way to connect the old HTC Vive? I noticed everyone is using the Pro; do you think there is a way to use the old one?
Wow! I'm really excited to try, but an HTC Vive is ridiculously expensive in Brazil xD Does anyone know if it's possible to get high quality with any other VR headset?
Hi, thanks for these tutorials, but I have one question: is this camera suitable for working with UE4?

Digital Video Camera:
- Brand: Andoer
- Color: Black
- Chipset: Novatek 96660
- Sensor: Panasonic 34110 14MP CMOS sensor
- Lens: f = 7.36mm, F3.2
- Screen: 3-inch capacitive touchscreen
- Video resolution: 4K (2880 * 2160) 24fps (interpolation); (2560 * 1440) 30fps; (1920 * 1080) 60/30fps; (1280 * 720) 60/30fps
- Photo resolution: 48MP (9212 * 5184) (interpolation); 36MP (7936 * 4480) (interpolation); 24MP (6340 * 3600) (interpolation); 16MP (4640 * 3480)
- Format: JPEG (photo); MP4 (video)
- Continuous photo: 5 photos
- Self-timer: Single / 2s / 5s / 10s
- Loop recording: OFF / 3min / 5min / 10min
- Functions: WiFi, anti-shake, 16X zoom, face detection, motion detection, date stamp
- White balance: Auto / Daylight / Cloudy / Tungsten / Fluorescent
- Color: Colorful / White / Black / Sepia
- ISO: Auto / 100 / 200 / 400
- USB port: 2.0
- Exposure: 2.0
- Auto power off: OFF / 3min / 5min / 10min
- TV mode: NTSC / PAL
- DC: 5V/1A
- Languages: English/French/Spanish/Portuguese/German/Italian/Chinese/Russian/Japanese
- Battery: 3.7V 2500mAh 9.5Wh lithium polymer, 45.5g, one cell (included)
- Memory card: supports up to 64GB external card (not included)
- OS support: Windows XP/Vista/7/8, Mac 10.2 or above
- Item size: 11.8 * 5.5 * 6.5cm / 4.6 * 2.2 * 2.6in
- Item weight: 326g / 11.5oz (with battery)

Thanks for your help.
Hey Matt, thank you for all the videos you are making!!! I am just getting started with my Rokoko suit and am setting up a virtual camera rig like the one in one of your videos. How do you connect the little monitor on the shoulder rig to show what Unreal is showing? What kind of little battery-powered monitor are you using on the shoulder rig? Thank you!
Thank you for showing how to wire it up in Unreal! How much time do you think is needed to figure out how to input a live camera feed into Unreal, do a key there, and also record the camera movement and match it to the video? (Well, you could just use a slate to sync the raw video and a video feed recorded within Unreal, I guess, but maybe there's a smarter way.)
Oh wow, can you live stream from that camera itself? If you can, you could assign that camera a position in-game to stream the graphics in. You'd still need a player camera on the install.
If I use Get Tracked Device with the first-version Vive, it doesn't work; I have changed the channels but nothing. In other ways, like using the pawns, it works well. I only use motion controllers; I don't have a tracker.
Hi there! Would you be at all willing to do a more involved tutorial setting up this system from scratch? Especially for other controller variants like the Index controllers. Thanks.
Quick question: is there a way to record the tracking information and use it later? Say a character is shot against a green screen, and all we want to do is record the tracking info and use it later while compositing.
Just curious, but how does the Vive headset fit into this equation? In theory if you get the trackers and the base stations separately is the headset even necessary?
Just to make sure: with the virtual camera, you can record the motion into a track inside UE4, right? I'm thinking about using this to record camera movement for cinematics, not just recording the video that the camera is seeing.
You can use either Sequence Recorder or Take Recorder for that. Sequence Recorder is super easy to use for this: you just add your camera actor to the recorder, start your simulation, and hit record. It'll give you a countdown and then record the camera data. The one catch is that if you try to render out your recorded sequence in-engine, you'll need to disconnect the tracking code from the Event Tick; otherwise, when the engine reruns the sequence to export your video, the recorded camera will be overridden by your virtual camera.
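The "disconnect the tracking from Event Tick" part can be sketched in plain C++ as a bool gate on the per-tick update (a hypothetical illustration, not the actual UE4 API; `VirtualCamera`, `bUseLiveTracking`, and `TrackerX` are made-up names standing in for the Blueprint nodes):

```cpp
#include <cassert>

// Sketch of gating live tracking so Sequencer playback isn't overridden.
struct VirtualCamera {
    float X = 0.0f;               // camera position (one axis for brevity)
    bool bUseLiveTracking = true; // set to false before rendering the sequence

    // Called every tick; TrackerX stands in for the Vive controller pose.
    void Tick(float TrackerX) {
        if (bUseLiveTracking) {
            X = TrackerX; // live tracking stomps whatever Sequencer set
        }
        // else: leave X alone so the recorded keyframes win
    }
};
```

With the flag on, each tick copies the tracker pose into the camera; flipping it off before export leaves the camera entirely under the recorded sequence's control, which is the effect of unplugging the tracking nodes from Event Tick.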
The Vive has better-quality tracking, and you can track multiple objects: both controllers plus multiple Vive Trackers in the same system. Starting with the iPad app is great, but for doing actual vcam or mixed reality with it, you will find its limits quickly.
Matt, this is FANTASTIC!! THANK YOU for this video series! I had a question: could you tell us the specs of your computer, and how many machines we might need? I've heard that the big-time setups require upwards of seven for different VP responsibilities, but I'm eager to learn all of this goodness! Thanks again!
The original Vive should work fine. The newer Base Station 2.0 units can create bigger tracking volumes and are preferred, but for getting started the 1.0 version will work.
Hey Matt, thanks for making this video. Have watched almost all of your VP videos and am setting up my own little studio at home. I'm working on the virtual camera now and was wondering if you could post the parts you used for your handheld rig online?
It makes so much sense to use Cine Tracer like that (and it's the direction I wished for). But in the best of worlds, we could hit a button on the virtual camera to start/stop the recordings. I would love to see some live keying with people integrated into a 3D set.
Joel Arvidsoon, I have it set up so you can record footage onto the Atomos, so the cam operator controls the recording. I'll do some simulcam with green screen soon.
I've been following your channel for a long time; the content is cool. Can you explain how to set up a large screen, such as an LED wall, as the background for shooting? In the short term, I think virtual production will be based on real foreground actors and props with an LED background; that's the development direction for the near future. Hope to get your reply.
This is really cool!!! Just one question: why does the image still look a little shaky even with the camera rig? Shouldn't it look more stable because of the rig? Thanks!
I think it literally translates controller movement into virtual units without any smoothing. To add that sort of smooth movement, you could skip virtual camera movement when it's below some threshold, or instead add a small delay (0.1 s) to smooth the movement, skipping the intermediate positions and moving the virtual camera toward the final position. In general, there are a lot of different approaches to smoothing movement.
Question from a total newb, primarily about live compositing: is the entire Vive set necessary, or could I get results with just the controllers and base stations? Thanks for sharing!