The vision of this channel is to bring volumetric viewing to the masses. Why look at the world flat when in reality it is not? We specialise in 3D visualisation in its various forms, including holographic art, light field displays, polarised projection, lenticular prints and other stereoscopic technologies.
We believe it is a shame that most of the spectacular creation around us is flattened onto a 2D sheet for viewing, losing much of the rich information contained in perspective and depth. Join us in enriching our captures by opening both of our eyes, and see the world the way it was meant to be seen.
Very grateful for your excellent, informative video. I have had the Fuji 3D for 10 years, and thanks to you, I just got the QooCam EGO. They complement each other, so you need both 🙂
You don't need any SBS or other conversion for video production; MAGIX Video Pro X knows everything and can set the appropriate format and aspect ratio automatically. In addition, it can also adjust the parallax of the partial images in manual or automatic mode. Saving can be in anaglyph 3D or SBS 3D, or in normal mode as MP4, AVI, etc. Instead of Adobe Premiere's slow Warp Stabilizer, it uses the fast and much better proDAD Mercalli, which is part of the video editor. Sony Vegas Pro can do almost the same. Corel VideoStudio or CyberLink PowerDirector can also be used at a basic level (with a stabilizer). Last but not least, in terms of price and hardware requirements, they are all a fraction of Premiere Pro. By the way, thanks for the videos; I've owned a flawless, brilliant Real 3D W3 for many, many years. All the best!
What I like so far is that you mentioned the old Fuji 3D camera. I bought it around 2010 or 2011 and still have it, but the poor camera is no longer usable... this looks like a nice follow-up. I also like the View-Master comparison. It's maybe because I used both the Fuji and the View-Master.
When I paste For %f in (*.AVI) do ffmpeg -i "%f" -filter_complex "[0:0][0:2] hstack=inputs=2 [out]" -aspect 1.7777 -vcodec libx264 -x264opts frame-packing=3 -map [out] -map 0:1 "%~nf_2x1.mp4" into the command prompt I get the message "Windows cannot find 'For'. Make sure you've typed the name correctly and try again." What am I doing wrong? Can I put this into a batch file and run it?
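A possible fix, assuming the command is meant to run from a Windows batch file: inside a .bat script, cmd.exe requires the loop variable's percent signs to be doubled (%%f instead of %f); the single-percent form only works when typed directly at the cmd.exe prompt. The "Windows cannot find 'For'" error also suggests the command was pasted somewhere other than a Command Prompt window (for example the Run dialog or Explorer address bar). A hedged sketch, with the filter and codec options copied from the command quoted above:

```bat
@echo off
REM Convert every AVI in the current folder to a full side-by-side MP4.
REM Note the doubled %%f / %%~nf required inside a batch file.
for %%f in (*.AVI) do ffmpeg -i "%%f" -filter_complex "[0:0][0:2] hstack=inputs=2 [out]" -aspect 1.7777 -vcodec libx264 -x264opts frame-packing=3 -map [out] -map 0:1 "%%~nf_2x1.mp4"
```

Save this as, say, convert_sbs.bat in the folder with the AVI files and double-click it, or run it from a Command Prompt opened in that folder.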
As someone who has been actively involved in stereoscopic photography for about 60 years, and has experience in just about every format since 5-perf and 7-perf, I found this review refreshing. Unlike other reviews, you didn't just go bananas about "Wow, it's in 3D and like nothing you've ever seen". You went into the details that we want to hear, without getting so technical that you needed to be an electronics genius to understand. In particular, I wanted to know why I would want to buy a Lume Pad 2, already having used a Lume Pad 1. You explained that very well. I like the forward and reverse lenses, but mostly the fact that the inter-lens separation is increased. I've never been happy with recent stereo cameras that have an ILS less than the average human interocular separation. Yes, for close-ups a reduced separation is sometimes useful for hypostereos. If I could improve upon this tablet, I would use three lenses, with the ability to switch between two or three inter-lens separations as needed.
This guide will be helpful with Sonic Generations on my Xbox 360. There is one question you haven't answered that I'm concerned about: combining clips into one video... Wait, THAT'S IT!!! YouTube has its own editor. Simply upload the 3D video as private, trim it down using YouTube Studio, and release the finished video to the public. It's sort of unconventional, since I like to have fully finished videos before uploading, but this would definitely work. VirtualDub and Avidemux won't help me here.
Great vid, but noob question here... how do you tell YouTube to play a 3D SBS video in a way that can be viewed with either passive or active 3D glasses? Or will it only play on a genuine 3D TV? Thanks for any help.
Great video! I wonder if you could support two viewers as long as they didn't occupy the same viewpoints. I guess to do it reliably you would need some way to dynamically change the angular divergence of the views, so you could ensure both viewers had a unique viewpoint by expanding and contracting the angular divergence as necessary. This tech is already cool enough; we'll save two viewers for another generation ;)
I want to watch my QooCam EGO videos on a 3D projector via a Blu-ray player, with 3D glasses. How can I convert the files to a format that works for that setup? I would be eternally grateful for some help, and I'm sure a lot of other QooCam users are in the same predicament. Thanks!!!
May 2024. They've fixed some things and broken others. It is no longer necessary to edit the video with YouTube Studio for it to correctly read the frame-packing or stereo-mode metadata, but now it does not process 3D videos at 2160p 60 fps, and it does not interpret DAR the same way for MP4 and MKV. So if you upload a 3840x2160@60fps half side-by-side MP4 video with frame-packing=3 metadata, it marks it as 3D (at last!) but does not process it at 4K. After waiting 48 hours for it to process the video at 4K, it simply interprets it as 1080p; the 4K processing disappears. UPDATE 1: It may have been a bug for a few days, but now it is fixed and everything works fine; it finishes processing at 4K without any problem. UPDATE 2: Today, two days after UPDATE 1, I have the same problem again with 3D 4K processing. 4K processing without 3D metadata works without any problems. What a mess!
Cool tech, but it has to be a display, not a tablet, to have some reach for the likes of me; I can't do development with limited 3D processing power. I would expect to have a 3D scene in Unity, Unreal or TouchDesigner, pipe it in over DisplayPort, and have the 3D just work. It might sound a bit cocky, but Looking Glass got the execution really right: highly compatible, highly accessible, especially their $300 display. I hope these guys move toward a realistic, scalable approach and not an "everyone will have it in their pocket" mentality.
How do you export the photos and video clips to watch on a 3D television? Can you put the files on a USB stick and watch them in side-by-side mode on your 3D TV? Can you burn them to BD-R?
Does it really have 960x533 pixels in 3D mode 😢? That is worse than my PS Vita. I thought it was 1280x800. Why is everyone saying the resolution looks good?
Doesn't it overheat? I bought mine from Amazon in 2022 for $500 and I returned it for a refund within the return period because after 15 min it was practically boiling. I have the Sony Bloggie 3D MHS-FS3 and the 3D effect is pretty amazing ...
Hello, a very good and understandable tutorial. However, not everything works like in your video. The command "arp -a | findstr dynamic" doesn't work: syntax error! "arp -a" alone works, though. Also: if I switch the camera off, the camera's IP remains visible even after 5 minutes. The camera's IP is only removed after disconnecting the router from the Internet and then reconnecting it. And unfortunately there's more bad news: the camera doesn't connect via VLC or OBS. And if I enter the IP into the browser, the following message appears: SyntaxError: 'JSON.parse: unexpected character at line 1 column 1 of the JSON data'
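One possible explanation, assuming a non-English Windows installation: findstr matches literal text, and arp's output is localized, so on a German system the type column reads "dynamisch" rather than "dynamic" and the filter matches nothing. A hedged workaround is to match the shared prefix case-insensitively:

```bat
REM Match "dynamic"/"dynamisch" etc. regardless of case and language
arp -a | findstr /i dynam
```

If that still finds nothing, run "arp -a" alone and check what word your Windows actually prints in the type column, then filter on that.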
Thanks to an update by YouTube, all methods should work again without workarounds. See: twitter.com/nimazeighami/status/1778546947110191411?s=46. I've updated the description with this as well. 🎉
It's grandiose, your beautiful video creation. Canon cameras offer a Dual Pixel RAW file, but only for stills... maybe next time cover an SBS video process technology 🤔 one camera, one lens, one sensor, why not. Well, for now... Peace, 3D
Hi. Thanks for the video. I have the camera, and I can watch the videos side by side in 3D in the Quest 3 file browser and on YouTube VR. But I was wondering if you can also watch them in 180 VR format. I saw a 180 VR video on YouTube shot with this camera but could not figure out what the workflow is. Can anybody help me with that?
Awesome video! Eye tracking is great! I had a question about the "Views Demo" and whether you used one of the Unity shaders for that. I had no luck when I tried getting the different views a few months ago. Tried emailing you, but the email bounced. Thanks!
Hello, I apologize, this is a machine translation 🙂 FFmpeg works great, but I have a question. FFmpeg doesn't use HW acceleration; HandBrake does. Is there a way for FFmpeg to also use HW acceleration? I don't need to say how slow it is without it.
I've not tried it myself, but I believe that hardware acceleration is supported but depends on your drivers and hardware of course. (example I found: "ffmpeg -c:v h264_cuvid -i input output")
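To expand on that with a hedged example (availability depends on how your ffmpeg build was compiled and on your GPU; running "ffmpeg -hwaccels" lists what your binary supports): on NVIDIA hardware you can both decode and encode on the GPU, e.g.:

```shell
# Decode via CUDA/NVDEC and encode with NVENC (NVIDIA GPUs; filenames are placeholders)
ffmpeg -hwaccel cuda -i input.mp4 -c:v h264_nvenc -b:v 20M output.mp4
```

On Intel hardware the equivalent encoder is h264_qsv (Quick Sync), and on macOS it is h264_videotoolbox. Note that hardware encoders trade some quality per bitrate for speed compared to libx264.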
Hello, an extremely good tutorial about this camera and a very good comparison to the Fuji W3. I have the W3 myself, and also a Sony HDR-TD10E. It makes very good videos, unfortunately with limited depth due to the small stereo base. It is also too big and too heavy. The Fuji W3, on the other hand, takes good pictures in daylight but miserable ones at dusk, and even worse video. This is probably due to the video codec (Motion JPEG). To always be able to set the right stereo base, I have two GoPro action cams on an aluminium rod. The rod sits on a gimbal. Fantastic for landscape shots, but too bulky for snapshots. I have now ordered a QooCam EGO and hope it is a good compromise between the W3 and the gimbal rig. Does my video editing program (Vegas Pro 18) have to be set to progressive or deinterlaced? Thank you!
Hi Andreas, very good video. Even for me, who has not used the command line for more than three decades, it was possible to convert some ten-year-old Fuji 3D videos. The loop in your script did not work for me on Sonoma, but I could handle it. Unfortunately, it did not work to view the result as a spatial video on Apple Vision Pro. Do you know the correct parameters for that?
Hey... I am not able to stream using the instructions in this video. I have found out the IP address of my camera, but VLC player still gives the error "cannot connect to IP address". Seeking a solution.
Great video. I love this camera; there should be more like it. The video desyncs with the viewer: if you look closely, one side of the frame is upside down. Likely a consequence of the mirrors; they might be doing some form of manipulation at playback to flip the frame back, and that could be adding the fraction-of-a-second difference between the eyes. They might be able to fix it in software by delaying the non-flipped eye? I'm working on some ffmpeg scripts to output anaglyph for both photos and video (though VLC can do the playback manipulation for anaglyph). If the subject is too close to the lens, the anaglyph output needs shifting to reduce eye strain / sickness. Don't go too close to the subject and it should be fine. Shooting straight on seems to work best.
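For anaglyph scripts like those mentioned above, ffmpeg's built-in stereo3d filter can do the conversion in one step. A sketch, assuming a full-width side-by-side source with the left eye on the left (filenames are placeholders):

```shell
# Full side-by-side (left eye first) to red/cyan Dubois anaglyph; audio copied unchanged
ffmpeg -i input_sbs.mp4 -vf "stereo3d=sbsl:arcd" -c:a copy anaglyph.mp4
```

Here "sbsl" names the input layout (side by side, left first) and "arcd" the output (anaglyph red/cyan, Dubois matrix, which tends to reduce retinal rivalry); for half-width SBS sources, use "sbs2l" as the input format instead.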
I bought it 10 years ago when I was a university student. I have taken lots of 3D photos and videos over the past years. In September 2021 I bought a Meta Quest 2, and I really came back to the past by watching my 3D pictures and videos through it. My tears ran down; that's the meaning of 3D! Our bodies cannot go back to the past, but memories in 3D can ❤ I love my Fuji W3! Only with VR (Quest 2/3, Apple VP) can the W3 show its potential! I am Chinese, and few people here own 3D cameras or VR; I am proud of my choice to buy them, and I am grateful for your technical support in 3D, which helps me a lot in editing 3D content! THANK YOU ❤
@brettharrison8280, haha... I hope that is a good thing, I want to stay in the good books :) how are you using the 3? Are you using them together somehow?