Light Fields - Videos From The Future! 📸

Two Minute Papers
1.6M subscribers
135K views

❤️ Check out Lambda here and sign up for their GPU Cloud: lambdalabs.com...
📝 The paper "Immersive Light Field Video with a Layered Mesh Representation" is available here:
augmentedperce...
❤️ Watch these videos in early access on our Patreon page or join us here on YouTube:
- / twominutepapers
- / @twominutepapers
🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
Aleksandr Mashrabov, Alex Haro, Alex Serban, Alex Paden, Andrew Melnychuk, Angelos Evripiotis, Benji Rabhan, Bruno Mikuš, Bryan Learn, Christian Ahlin, Eric Haddad, Eric Lau, Eric Martel, Gordon Child, Haris Husic, Jace O'Brien, Javier Bustamante, Joshua Goller, Lorin Atzberger, Lukas Biewald, Matthew Allen Fisher, Michael Albrecht, Nikhil Velpanur, Owen Campbell-Moore, Owen Skarpness, Ramsey Elbasheer, Robin Graham, Steef, Taras Bobrovytsky, Thomas Krcmar, Torsten Reil, Tybie Fitzhugh.
If you wish to support the series, click here: / twominutepapers
Károly Zsolnai-Fehér's links:
Instagram: / twominutepapers
Twitter: / twominutepapers
Web: cg.tuwien.ac.a...
#lightfields

Published: Sep 7, 2024

Comments: 530
@brickdesign6438 3 years ago
This is absolutely incredible. Imagine a movie where you can actually look around the scene! What a time to be alive!
@marcozolo3536 3 years ago
Upskirts finally take on a whole new definition
@Manks08 3 years ago
It would make for an incredibly immersive crime movie where the viewer could look around a scene for evidence at the same time as the actors, e.g. looking behind a picture frame or ornament.
@billbauer9795 3 years ago
>What a time to be alive!
You are right. 10-20 years from now most of us won't have indoor plumbing any longer.
@sgbench 3 years ago
It has its limits. As soon as you leave the volume that was originally occupied by the camera sphere, there are gaps in the data that become increasingly obvious.
@ScorgeRudess 3 years ago
Not just look around, but walk inside it!
@ManMadeOfGold 3 years ago
This is one of the few papers that really made me think "this is the future". Feels like wizardry to rotate and pan around a video, rather than a still image. Not a super practical thing, but it's dang cool!
@LimitedWard 3 years ago
Two immediate applications I can think of: 1. making better video instructions for the assembly of complex 3D parts in manufacturing. 2. VR porn. Because, why not?
@kamathln 3 years ago
Not a super practical thing? Ask any movie director.
@javierflores09 3 years ago
@@kamathln I mean, not for normal videos. Sure, this has its own utility, but you normally want to center on whatever the theme of the video is, not something off-scene.
@Kram1032 3 years ago
@@javierflores09 I think regular movie directors (i.e. not ones who want to focus specifically on providing light-field movies) will actually want this as well: it means they can much more seamlessly adjust the camera position in post, allowing for insane levels of fine control. So they'll end up with a static view (or perhaps a binocular view à la current 3D movies), but one that's extremely finely adjusted to their every whim.

It's also gonna improve the workflow of what I *think* is called Z-compositing, wherein all your data is in RGBD: you can composite CG elements into the scene at exactly chosen depths, and because your footage has depth info too, clipping and occlusion automatically work correctly instead of requiring expensive, error-prone, semi-manual frame-by-frame masking.

Essentially this gives even more information, including about stuff that's completely occluded by other stuff. Like, you could digitally add in a reflective surface and it would potentially reveal what's behind a thing *in the digital reflection*, etc. So yeah, I'm sure filmmakers will love this.
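The Z-compositing idea in the comment above can be sketched in a few lines: with depth maps for both the footage and the CG layer, occlusion reduces to a per-pixel depth comparison. A toy NumPy sketch follows; all names and numbers are illustrative, not taken from any production pipeline or from the paper.

```python
import numpy as np

def depth_composite(footage_rgb, footage_depth, cg_rgb, cg_depth):
    """Per-pixel composite: keep whichever layer is closer to the camera.

    footage_rgb, cg_rgb: (H, W, 3) float arrays
    footage_depth, cg_depth: (H, W) float arrays (smaller = closer)
    """
    cg_in_front = cg_depth < footage_depth            # (H, W) boolean mask
    return np.where(cg_in_front[..., None], cg_rgb, footage_rgb)

# Toy 2x2 example: one CG pixel at depth 1.0 occludes footage at depth 3.0
footage = np.full((2, 2, 3), 0.2)
cg = np.full((2, 2, 3), 0.9)
f_depth = np.full((2, 2), 3.0)
c_depth = np.array([[1.0, 5.0], [5.0, 5.0]])         # only top-left CG pixel is closer
out = depth_composite(footage, f_depth, cg, c_depth)
print(out[0, 0])   # CG color wins: [0.9 0.9 0.9]
print(out[0, 1])   # footage wins:  [0.2 0.2 0.2]
```

The point of the sketch is exactly the comment's claim: once every layer carries depth, occlusion falls out of a comparison instead of hand-drawn masks.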
@kamathlaxminarayana301 3 years ago
@@javierflores09 Try centering on hyper-active children/pets :D
@lunafag 3 years ago
Moving the camera while seeing paused motion blur is such a cool effect.
@MrDontworrybehappy10 3 years ago
Yeah, the shot with the flame gave me a Star Trek "holo situation" vibe or something like that. Or even an advanced futuristic police crime-scene analysis feel.
@ikagura 3 years ago
Like glitchy?
@moayadbassam 3 years ago
Yeah, almost like the bullets in The Matrix.
@CosmiaNebula 3 years ago
also moon?
@ebog4841 3 years ago
@@CosmiaNebula Yes, my sun! Hi, noon!
@TheLastCodebender 3 years ago
"Hold on to your papers" that joke always makes me smile 😂
@dustinwaree 3 years ago
Imagine people walking around with that monster taking selfies 🤭
@jearlblah5169 3 years ago
*karen has entered the chat* What do you mean “it’s not for sale”???? I DEMAND TO SPEAK TO YOUR MANAGER!!!!
@loleq2137 3 years ago
@@jearlblah5169 😐 why
@MarioManTV 3 years ago
This was exactly the sort of thing I was looking for after the last video with the video selfies. This looks like an incredibly promising way to make VR video more immersive than ever before.
@McDonaldsCalifornia 3 years ago
VR porn is gonna be a saviour for the early funding of this technology, I'd wager.
@skylerlehmkuhl135 3 years ago
I could see this being used for movie production; imagine being able to film once and then tweak the camera movement in post-production.
@NoogahOogah 3 years ago
It would quintuple the data storage requirements, though.
@tomnewsom9124 3 years ago
That's what Lytro were aiming for with their Immerge camera. I don't think anybody was interested though, cos they folded and got gobbled up by Google.
@Nobody-Nowhere 3 years ago
Imagine shooting with a rig that has like 20 movie cameras attached to it... maybe it's just easier to know what you are doing in the first place.
@yaelm631 3 years ago
It is possible: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-9qd276AJg-o.html (look at Intel's volumetric video studio).
@DodaGarcia 3 years ago
@@NoogahOogah which almost all technology advancement has done lol
@Chain83 3 years ago
Download the VR version from the website! I realised this was the first time ever that I could experience *true* 3D video (not just stereo). It is something else! I kept wanting to pet the damn dog! :D
@Andreadel96 3 years ago
I just tried their demo in VR and it was very impressive. Things felt so real, like I was there (I know, of course it's VR, but still!). Now I am sad that I can't watch every video like this. Also, the ones sitting at a lake or at the ocean were very relaxing. This definitely has the potential to be used for virtual tourism. AMAZING!
@microwavememes 3 years ago
It feels so weird for humanity to be so perfectly preserved for the first time ever.
@TheZenytram 3 years ago
With each video, we get closer and closer to magic.
@overloader7900 3 years ago
Any technology advanced enough is indistinguishable from magic. Any magic arcane enough is indistinguishable from science.
@user-iv1qq3vb9j 3 years ago
@@overloader7900 I really like how you said that :) I'll be quoting you.
@martiddy 3 years ago
@@user-iv1qq3vb9j It's one of Arthur C. Clarke's three laws.
@overloader7900 3 years ago
@@martiddy And a variation of it, thanks for pointing that out.
@koz_iorus1954 3 years ago
1:27 That animation was so cool!
@yaelm631 3 years ago
I first saw this tech used in VR with Google Light Fields (on Steam); if this becomes affordable I would be really happy.
@yaelm631 3 years ago
We can download an 8 GB demo! augmentedperception.github.io/deepviewvideo/
@CIinbox 3 years ago
That was already a few years ago. Although those were still pictures, it still looked amazing, as if you were there.
@dru4670 3 years ago
Imagine watching your memories played back in that.
@kotokrabs 3 years ago
Wow, add a Tobii tracking device and this will revolutionise the media!
@jambalaya201 3 years ago
What is a Tobii tracking device?
@kanishkachakraborty 3 years ago
@@jambalaya201 It tracks exactly which part of the screen your eyes are looking at.
@rtyzxc 3 years ago
What about just using a VR headset? This is basically designed for VR.
@kanishkachakraborty 3 years ago
@@rtyzxc It will forever be less convenient than looking at a screen. With Tobii eye tracking you can reach a middle ground between immersion and convenience.
@geekswithfeet9137 3 years ago
Just use camera-based eye tracking... It's almost like we know a guy with a bit of expertise in camera-based detection...
@pinustaeda 3 years ago
My jaw dropped when I saw the dog behind the fence! It was almost perfect, such an amazing implementation.
@TessaBury 3 years ago
This feels way more enjoyable to me than 360 VR. I like the idea of a 3D video space where there's parallax if I happen to shift in my seat or tilt my head. And more importantly, the viewer has an actual sense of scale and depth: they can lean in to get a closer look at something in a maker video, for example.
@finnaginfrost6297 3 years ago
This could work to accelerate VR-as-a-service platforms, where the remote game engine renders an entire light field for the approximate location of the VR headset, which is delivered once every second or two with extremely high compression. The client's PC decompresses the light field and maps the actual head movements and rotations onto the now-old data, simulating a highly responsive remote rendering service of a semi-static environment.
@spartv1537 3 years ago
Watching these videos only for the "What a time to be alive!" words.
@0mnishade 3 years ago
I really hope this leads to better predictive image inpainting soon. Having something of this quality on a phone with minimal hardware changes would be absolutely incredible.
@c0dexus 3 years ago
For VR, it's not just about rotating your head; it's 6 degrees of freedom. The image is still correct when you move your head, and this is why light-field images feel so real in VR: it's the realism of a photo/video combined with the freedom of real-time 3D.
@stevenru4516 3 years ago
There's so much development in this field; please cover more of it! *waiting room for 5 papers down the line*
@EVILBUNNY28 3 years ago
I watched this entire video with my mouth agape in awe. I cannot wait until 2, maybe 3 papers down the line, when the artefacts are less noticeable and the content can be optimised to work over a limited connection such as mobile data. Hopefully another 5-10 years down the line we will be able to achieve the same camera angles, all available on your smartphone. Who knows! Truly, what a time to be alive.
@michahermann7869 3 years ago
Having both played Cyberpunk and just seen these videos on my VR headset, it is almost frightening how close this is to a supposed braindance experience from the game. The indefinitely repeating glitches on the side of the view field make it even more realistic. Awesome how close it is to actually being there in the scene. Just want to up the resolution a notch. Now interpolate space from only two cameras and their movements in space, get them small and fit them into eye implants. Voilà, Kiroshi eye implants from the game.
@Fighter_Builder 3 years ago
I really can't wait for this to catch on. True VR videos? YES PLEASE!
@Poney01234 3 years ago
I've been having this idea for years: take a huge concert in the daylight. The lead musician asks everyone to take a short video at the same time, for example of a jump he does or something cool. Then these 30,000 people upload their video to a given platform (RIP 4G antennas). => You get a messy, inconsistent, but free and unimaginably huge dataset you can perform 3D reconstruction on! After a lot of cleaning and reconstruction, you could walk through the crowd, turn around the stage, etc. I know some stadiums are equipped with 4D Replay or other technologies that allow for similar experiences, but only for a very limited portion of the venue.
@sharkbeats1397 3 years ago
Wow that was amazing.
@kendokaaa 3 years ago
For anyone wanting to view Light Field photos (not video) in VR, Google released a free app for SteamVR in 2018 that's pretty awesome: store.steampowered.com/app/771310/Welcome_to_Light_Fields/
@natevplas 3 years ago
I think lots of people on here are confused about the difference between 360 video, photogrammetry, and light fields. It's not just about stitching multiple videos together and slapping it on a 3D model. A true light field is the 5D value of every point of light within a volume. They are capturing a few 2D slices (where the sensor collects the photons) and the algorithm is filling in all the rest for the (hemi)spherical volume. Then it's compressing that down into something that can be streamed, which is probably a 1,000:1 ratio. It's like compressing an IMAX feature film into the size of a JPG file!
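The scale of the compression natevplas describes is easy to sanity-check with back-of-envelope arithmetic. Every number below is an assumption chosen to match the rough scale of a multi-camera rig, not a figure from the paper, and the streaming budget is likewise hypothetical.

```python
# Back-of-envelope: raw capture rate of a light-field camera rig vs a
# streamable target. Assumed numbers (hypothetical): 46 cameras,
# 2048x1536 px, 3 bytes/px, 30 fps.
cams, w, h, bytes_px, fps = 46, 2048, 1536, 3, 30
raw_per_second = cams * w * h * bytes_px * fps          # bytes/s of raw capture
print(raw_per_second / 1e9, "GB/s raw")                 # ~13 GB/s

target_mbps = 300                                        # a plausible streaming budget
compressed_per_second = target_mbps * 1e6 / 8            # bytes/s at that budget
print(round(raw_per_second / compressed_per_second), ": 1 compression needed")
```

Even with these modest assumptions the required ratio lands in the hundreds-to-one range, which is why the comment's "compressing an IMAX feature into a JPG" comparison is not far off.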
@johnnyswatts 3 years ago
Light fields are amazing - that's the field in which I'm doing my PhD. There is so much promise in this technology, and we've come a long way since the brilliant Lytro camera.
@powergannon 3 years ago
This technique with 360 degree viewing looks like it would allow for really immersive VR movies
@LKDesign 3 years ago
This needs to get promoted big time. It needs a web platform providing content from spectacular places around the world. It needs a dedicated app for widespread use with mobile devices, via browser and Cardboard experiences. Go to rallies, fairs, TV studios, landmarks and great vistas to create a bulk of content. How odd that the relatively young concept of wide-angle stereoscopy already seems to have found its match. I'm curious to see what the future will hold for this technology.
@nelsondiaz5415 3 years ago
Imagine now how walking simulator games would be! WHAT A TIME TO BE ALIVE!
@ilazerxxx4894 3 years ago
What's weird is that the camera of the future looks a lot like the human eye. What a time to be alive!
@fr3zer677 3 years ago
The Google Light Fields demo on my Vive left me wanting more amazing light-field content. I'm happy to see the amount of progress being made in this fascinating field.
@MariusLuding 3 years ago
There is a demo for this in VR, which makes it even more impressive, as the content can be rendered in stereoscopic mode: all the information necessary for that is included in the dataset.
3 years ago
This is actually the future. Can't wait to watch these kinds of videos on my VR headset.
@adriaanb7371 3 years ago
I just realized that this makes a big difference when watching VR. It is the small head movements that do not show in the image that cause nausea and break the illusion of reality in VR. So yeah, much more relevant than a silly 3D effect. Remember, the head pivots around the neck joint, not the eyes, so there is always a parallax effect that needs to be handled, even when the viewer is sitting still in the VR scene and just looking around.
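The neck-pivot point above is easy to quantify. Assuming the eyes sit roughly 10 cm in front of the pivot axis (an illustrative figure, not a measured one), even a small head turn translates the eyes by centimeters, which a rotation-only (3DoF) video cannot reproduce.

```python
import math

def eye_displacement(pivot_to_eye_m, yaw_deg):
    """Chord length of the arc the eye travels when the head yaws
    about the neck joint by yaw_deg."""
    theta = math.radians(yaw_deg)
    return 2 * pivot_to_eye_m * math.sin(theta / 2)

# A casual 30-degree glance with the eyes ~10 cm forward of the pivot
d = eye_displacement(0.10, 30)
print(f"{d * 100:.1f} cm of eye translation")   # ~5.2 cm
```

A few centimeters of unmodeled translation is well within the range that light-field playback handles and fixed-viewpoint 360 video does not, which supports the comment's point about nausea.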
@yade5979 3 years ago
Imagine they implement this technology in VR movie experiences where you become a character inside a movie, like a horror one for example, and then you experience the entire spectacle first-hand.
@AlexandreBizri 3 years ago
This domain has always been fascinating to me. I once reimplemented the method of the paper "Soft 3D Reconstruction for View Synthesis" on the GPU, and it taught me so much about light and depth.
@Denyernator 3 years ago
Virtual theatre plays! Live-streamed in VR! Potentially a great way to keep theatres open during a pandemic, if only this were commercially available.
@fanxia3234 3 years ago
This is so magic… I was thinking about this yesterday and didn't know what keyword I should search on Google, and then you created this video today! WoW!!
@Game_Sometimes 3 years ago
360-degree cameras can accomplish some of this, which is OK, but we can definitely improve.
@EVRLYNMedia 3 years ago
The thing with 360-degree cameras is that they still only capture from one position: where the camera is. With this technology, you can kind of move around as the camera and experience real depth effects, unlike a 360 video where you basically move around inside a gigantic frame whose depth may not be accurate.
@vincent4652 3 years ago
I can already imagine how much more immersive this will make VR videos. No more of that jarring/nauseating feeling from having the world shift with head movement.
@Dismythed 3 years ago
Here is how to do it with a single market-ready phone camera: create a camera app where, when you click the button to take an image, it starts filming a video, letting you make a vertical circular motion with the camera about 2 feet in diameter; then you click the button a second time to take the photo. The app then constructs the 3D image using the algorithm and the frames from the video. You can fight artifacts and blurring by letting the program draw detail for a 10x10-pixel area from the single frame that most closely matches the surrounding pixels.
@id104335409 3 years ago
To those who like this idea: take a look at the Lytro camera. They managed to produce the first light-field camera available for purchase, and then they went under. Not too many people were interested in the technology. With this special camera you can capture more information and store it in a file that is reconstructed in their software, where you can choose the angle and focus of your picture and then export it as a normal photo. This is mainly why nobody was interested. Other than that, this technology looks like magic compared to what we are used to.

If 3D TVs hadn't died out, this would have complemented them very well. But nobody wanted to invest more money in 3D. Companies that worked on glasses-free light-field 3D TVs also slid into oblivion. If that hadn't happened, today, instead of fighting over HDR, we would be looking at TVs with a popping picture you can look at from all angles.

In VR, this light-field technology allows for removing the bulky, heavy lenses that sit on your nose, making VR glasses as thin as regular glasses. It also allows your eyes to move around and focus inside the image, making the experience much more realistic and less vomit-inducing. However, Nvidia, the only people behind it, also backed out and stopped developing their 3D and VR glasses. How about that?
@dryued6874 3 years ago
I thought light-field photography was an entirely different technology: one where you capture, well, the light field, and can do things like adjust the focus after the fact.
@IceMetalPunk 3 years ago
Yeah, so did I... I guess they're using the same terms for different things, which is a little confusing.
@Ny-kelCameron 3 years ago
Me too. Unless they are actually using a specialized light-field camera, this just looks like regular cameras being used to capture the image/video, which is then stitched together. Still impressive, but not as much as what true light-field cameras seem able to do, where I think everything can be done with one camera.
@IanGrams 3 years ago
Light fields can be captured either with an array of microlenses in front of a single sensor or with an array of sensors. This paper took the latter approach, while the now-defunct Lytro camera used the former. en.wikipedia.org/wiki/Light-field_camera
@odw32 3 years ago
I do think we'll see the bitrate for these formats decrease in the near future. All current codecs, including H.265, are made to compress a series of nearly similar 2D images through time, not to compress multiple parallel streams. Just as there is similar information from one frame to the next, there will also be very similar information from one stream to the next in this case.
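The stream-to-stream redundancy described above is easy to demonstrate. The toy sketch below uses zlib as a stand-in for a real video codec and a synthetic image pair; it only illustrates the principle that a base-plus-residual layout compresses better than coding each nearly identical stream independently.

```python
import zlib
import numpy as np

# Two synthetic "camera streams" that differ only in a small patch
rng = np.random.default_rng(0)
base = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
neighbor = base.copy()
neighbor[100:120, 100:120] += 5        # uint8 wraps mod 256; fine for this demo

# Code each stream independently
independent = (len(zlib.compress(base.tobytes()))
               + len(zlib.compress(neighbor.tobytes())))

# Code the base plus a residual (difference) for the neighbor stream
residual = neighbor.astype(np.int16) - base.astype(np.int16)
joint = (len(zlib.compress(base.tobytes()))
         + len(zlib.compress(residual.tobytes())))

print(joint < independent)             # the mostly-zero residual compresses far better
```

This is the same idea temporal codecs already exploit between frames, applied across the camera axis instead.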
@TwoMinutePapers 3 years ago
Great piece of feedback. Thank you so much for posting this!
@willguggn2 3 years ago
The moment you mentioned reflective surfaces I caught myself pretty much only looking at the mirror in the shop. :D
@DasIllu 3 years ago
Reminds me a bit of the Lytro light-field camera: a tiny box capturing all incoming light that lets you change the focal plane, as well as your point of view by a few degrees. Imagine what you could do with an array of Lytros and VR + eye tracking (incl. pupil dilation to calculate the eye's aperture and adjust depth of field) for replay. Total immersion.
@jayxi5021 3 years ago
Can't wait for the DALL-E paper from OpenAI (and its video).
@MrJaggy123 3 years ago
While we wait for this channel to cover it, I found Yannic Kilcher's video on it to be pretty good: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-j4xgkjWlfL4.html
@jayxi5021 3 years ago
@@MrJaggy123 watched that already :3
@lucaslucas191202 3 years ago
Damn you got lucky then
@Veptis 3 years ago
There is a studio in France that uses a light stage and shoots 3D video. So they do a photogrammetry reconstruction for every single frame.
@Demnus 3 years ago
This is old stuff, with roots going back to Lytro cameras from 2006. But it still doesn't stop impressing, and has been "the future" for 15 years straight.
@IceMetalPunk 3 years ago
No, it's not. They're using the term "light field" to mean something very different from light-field cameras like Lytro's.
@Demnus 3 years ago
@@IceMetalPunk Actually it is. The method of taking many images from many different angles at the same time was described in 1908, was called a light field, and is exactly what was done in this case. Lytro's innovation was an optical raster of micro lenses that writes all viewing angles in one single frame at one single exposure, limiting resolution and angle differences, and as a result the ability to move the camera. Still, the technology is in general the same.
@LKRaider 3 years ago
It means you can probably get results like these from a camera embedded in your phone.
@Demnus 3 years ago
@@LKRaider Technically yes, but the quality of execution might vary, and it will take a lot of fiddling. The bigger the "virtual aperture" (the distance between the farthest points from which you got pictures), the more you'll be able to move the camera. Technically, if you get pictures from every possible angle, you get the ability to freely move the camera inside the scene. If your "aperture" is small, you'll be able to move the camera only in that vicinity, and maybe change the focus and some other properties, like the optical aperture, after the fact. Of course, you can also capture a still scene by taking many photos from many different points.

Either way, at its core it's all just interpolation between those frames, based on the coordinates of the actual camera positions and the position of the virtual camera. The main point of this research, as I understand it, is the way they do this interpolation, which likely has better quality and may be a bit faster than previous attempts. Here they build a kind of mesh representation of the scene, which has some similarity with 3D scanning. But 3D scanning by itself is very similar to the light-field concept.
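A minimal sketch of the interpolation idea in that comment: blend the captured views with weights that fall off with the virtual camera's distance from each real camera. Real systems, including the layered-mesh method the video covers, are far more involved; this only illustrates the position-based weighting, with made-up camera coordinates.

```python
import numpy as np

def blend_weights(camera_positions, virtual_position, eps=1e-6):
    """Normalized inverse-distance weights for blending captured views."""
    d = np.linalg.norm(camera_positions - virtual_position, axis=1)
    w = 1.0 / (d + eps)                 # nearer cameras contribute more
    return w / w.sum()                  # weights sum to 1

# Hypothetical 2D positions of three cameras on a rig
cams = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
w = blend_weights(cams, np.array([0.1, 0.0]))
print(w.round(3))                       # the nearest camera dominates
```

A novel view would then be a weighted combination of the (reprojected) camera images; the reprojection step is where the actual geometry work happens.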
@markmywords5342 3 years ago
AI is going to teach us something about light that we didn't already know existed in real life.
@Baleur 3 years ago
100%. It's only a matter of time before AI "scientists" become a reality.
@moahammad1mohammad 3 years ago
@@Baleur And then AI takes over and kills/enslaves us all
@sgbench 3 years ago
@@moahammad1mohammad What makes you think an AI would have any reason or motivation to kill or enslave us?
@markmywords5342 3 years ago
@@sgbench *Programs robot to walk* *Person stands in front of robot* *Robot doesn't see person* *Person codes robot to see person as obstacle* *Robot "overcomes obstacle"* lolol
@sgbench 3 years ago
@@markmywords5342 In your example, a person programmed the robot to be evil. The robot didn't become evil on its own.
@LanceThumping 3 years ago
Can't wait to see a paper that defines a new compression method specifically for light fields. H.265 is good, but I seriously doubt it was designed with light fields in mind.
@andarted 3 years ago
Combining image inpainting with the data from two or three extra cams seems like a natural next step, imho.
@daYps3 3 years ago
I know Cyberpunk has been slated, but this reminds me very much of the braindances from that game: you can explore previous memories and events in a similar way to find additional clues around the scene!
@superfluidity 3 years ago
This would be amazing on a flat screen with a head tracker. Or even better, one day, if they can make a light-field display, i.e. a two-dimensional array of tiny projectors that project directly into the eyes of any number of viewers.
@lilhuzky28 3 years ago
Your channel is amazing, man. You really make it enjoyable to learn, with interesting and useful concepts!!!
@TwoMinutePapers 3 years ago
You are very kind. Thank you so much! 🙏
@quitegonejim1125 3 years ago
Fecking amazing! Tutorial videos (and all videos/movies) of the future will have so much more!!
@Knecken 3 years ago
I can see this being useful for video/movie editors to perfect that camera angle / pan after the fact!
@DanielRodriguez-gm1ih 3 years ago
Can’t wait to watch VR movies with this technique in the future.
@danielmastia87 3 years ago
This, in VR? Imagine watching your "old photos" this way: popping on a VR headset and just being there.
@raptokvortex 3 years ago
I can see this becoming a new method of viewing content... like video over books.
@caedmonswanson2378 3 years ago
Something I'd be really interested in is AI video and image compression. Imagine being able to compress videos much more than normally possible and still retain most of the detail. Something like that is kind of happening with NVIDIA DLSS (which upscales gameplay resolution from, say, 1080p to 4K), but I think a system dedicated to compression could be very cool. I'm not sure if this already exists, but I think something like it will definitely be used in the future. Imagine being able to watch videos and movies while using 25% of the data. That could save a lot of money in ISP bills.
@user-ej4md7tm3y 3 years ago
This is looking incredibly promising! I still remember seeing a demo at a conference of what Lytro was doing with light fields (2016-ish); I was blown away at the time. Too bad they went 📉, but I'm glad to see light-field technology still going strong.
@BomageMinimart 3 years ago
This totally fucking rocks! Just awesome to see where this will go for entertainment and teaching/instruction.
@play005517 3 years ago
We may need a custom entropy-coding format that looks across all viewports for these highly coupled videos. Most of the image across different physical or logical cameras is essentially the same. Only some highlighted areas, shown at 1:44, actually differ, so we can skip the other parts and only keep a keyframe for them.
@AlexTuduran 3 years ago
This is essentially what Lytro did years ago, but they fucked up the business when they tried to sell it to photographers as after-shot refocusing and changing DOF, when instead they should have pitched their product to 3D-related communities and shown that you can do more than refocus or re-bokeh. It's like inventing time travel and your strongest selling point is that you can go back 2 days in time to buy a t-shirt at a lower price. Not to mention, photographers would never trade their smooth-bokeh, fast, sharp lenses and full-frame high-res, high-ISO sensors for a Lytro optical system. My lenses focus so fast that I never need to refocus.

Instead, the ability to capture light fields (essentially 3D scenes) would have been praised in the game-dev world or on the VFX side of cinematography. They even had a video camera capturing 3D videos, with which they showed how they could easily insert geometry that fits the scene, relight the captured scenes and more. And this could have extended to even more amazing use cases.

Now Facebook and OTOY are doing it, Intel is doing it, and it seems more and more giants are digging into this tech. It's a pity what a massive opportunity Lytro lost; they could have been market leaders by now, perhaps even creating glasses-free 3D screens to preview their captures and push this tech even further.
@michaelmartinez8578 3 years ago
The next step has to be doing this with fewer cameras. Being able to do this with 5 cameras would be a HUGE step.
@MrSaemichlaus 3 years ago
Soooo, photogrammetry applied to each frame of a video scene captured from various angles to reconstruct the 3D scene?
@paulbunyangonewild7596 4 months ago
My thing is: yeah, we HAVE 3D videos, but the perspective is fixed, and often completely wrong (at least for VR). With this, the perspective is never wrong, and everything is correctly sized.
@ThunderDraws 3 years ago
yeah I checked this out in VR a few months ago - absolutely the future!
@DerSolinski 3 years ago
Light fields aren't new, but I have to admit this is some outstanding work. Even so, a camera sphere of half a meter is somewhat impractical. Usually, small high-speed light-field cameras are used in industry for quality control in fully automated systems.
@brettcameratraveler 3 years ago
The cameras and the original raw footage are hugely impractical for any form of wide consumer adoption, except perhaps for the largest of film crews. A different approach that would solve these two issues is to crowdsource cell-phone photos and videos of popular sites to create a cloud-based 6DoF 3D model of the background scene in every direction, and then insert any live action into that model from the cameras already in the form factor of our phones. When 5G networks become widespread, you might see something like this created to relive your memories in fully walkable 6DoF VR, etc.
@brettcameratraveler 3 years ago
Alternatively, you could record a normal video of the live action with your camera, and then, if you wanted to preserve the scene in 6DoF, take 15 seconds to walk around in a 6-foot circle and film it in 360. With the right software, the resulting video could be pieced together to create not only a 360 background but a 360 6DoF background you would be free to move a few feet in. The live-action moment you first recorded would be inserted within the virtual space of that second 360 6DoF scene. Anyway, this is another solution that works within the form-factor limits of a phone, so wide adoption is potentially possible.
@AlexSeewald
@AlexSeewald 3 years ago
Now THIS is something I am genuinely very impressed with - I was already impressed with the Tom Grennan video on the PSVR, but that was a lot of work. This does it completely automatically. I'm definitely going to keep an eye on those papers.
@tetlamed
@tetlamed 3 years ago
Still love hearing the word "doctor" at the beginning of his new videos!
@sdjhgfkshfswdfhskljh3360
@sdjhgfkshfswdfhskljh3360 3 years ago
Our world contains a lot more information than a regular person realizes. Imagine capturing all of it :)
@nbohr1more917
@nbohr1more917 3 years ago
This is sort of what you would need for glasses-free 3D TV. You cannot transmit 20+ parallax views in HD (or better) given the limitations of network transmission. You would need to transmit a few images plus some 3D geometry and then reconstruct the remaining parallax frames.
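The reconstruction step this comment describes can be shown in miniature: given one view plus per-pixel depth, a nearby parallax view can be synthesized by shifting each pixel by its disparity. A toy 1-D Python sketch (this is an illustration of depth-based reprojection in general, not the paper's layered-mesh method; all names and the naive hole-filling are my own assumptions):

```python
def reproject(colors, depths, baseline, focal=1.0):
    """Synthesize a horizontally shifted view from one view plus depth.

    colors: list of pixel values; depths: matching per-pixel depths.
    baseline: horizontal camera offset. Nearer pixels shift more.
    """
    width = len(colors)
    out = [None] * width
    for x, (c, z) in enumerate(zip(colors, depths)):
        disparity = round(focal * baseline / z)  # shift is inversely proportional to depth
        nx = x + disparity
        if 0 <= nx < width:
            out[nx] = c  # later writes naively overwrite earlier ones (crude occlusion)
    for x in range(width):  # fill disocclusion holes from the left neighbor
        if out[x] is None:
            out[x] = out[x - 1] if x > 0 else colors[x]
    return out

# A flat scene at depth 5 viewed from a camera shifted by 5 units:
print(reproject(list(range(10)), [5.0] * 10, baseline=5.0))
# → [0, 0, 1, 2, 3, 4, 5, 6, 7, 8]
```

A real system does this in 2-D with sub-pixel accuracy and inpaints the disoccluded regions, which is exactly where most of the difficulty lies.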
@anjaninator
@anjaninator 3 years ago
Being able to capture (or capture some and fill in the rest w/ AI) such footage is important for when VR tech is ubiquitous
@junoexpress6
@junoexpress6 3 years ago
This is like photogrammetry, but for video. Very impressive.
@Puffycheeks
@Puffycheeks 3 years ago
In VR this is mind-blowing.
@chrismofer
@chrismofer 3 years ago
Just amazing. I loved Google's light field demo; I wonder if AI volumetrics is less computationally intensive to play back live than Google's implementation.
@Zhaxxy
@Zhaxxy 3 years ago
yes
@MarcusHast
@MarcusHast 3 years ago
I'm convinced that I have seen a paper which combined multiple video streams into a light field video. And it was made to use standard video decoder hardware in creative ways to accelerate this process. I do believe it was presented by Disney, but I have not been able to find it again so I might have dreamed it. :-P
@rolfathan
@rolfathan 3 years ago
Absolutely stunning work.
@numero7mojeangering
@numero7mojeangering 3 years ago
It's like having the real world on your computer.
@ganjanaut
@ganjanaut 3 years ago
Would be cool in an interactive detective movie: pause time, rewind to inspect, etc.
@Theminecraftian772
@Theminecraftian772 3 years ago
That's awesome stuff. I look forward to the hardware for it becoming cheap, commonplace, and integrated. I'm also looking for some more solutions to data compression. Have the AI scientists designed something for general compression of noisy data yet?
@antman7673
@antman7673 3 years ago
This tech seems like something that would consume lots of energy if whole movies on Netflix were streamed like this.
@IceMetalPunk
@IceMetalPunk 3 years ago
I don't think energy is the problem here; bandwidth is. But hopefully a few papers down the line, we'll have better compression tech and be able to reduce the bandwidth usage.
@--Arthur
@--Arthur 3 years ago
I bet within 15 years someone will make a movie like this, so that you can watch 180° with the 3D feeling of being there. Perhaps with haptic feedback? Imagine that.
@jamesc5801
@jamesc5801 3 years ago
Omg the reflections updating blew my mind the most
@li_tsz_fung
@li_tsz_fung 3 years ago
If the video encoder could treat the different perspectives as one video, the compression ratio could be much better; there must be so much shared information between views.
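That intuition is easy to demonstrate: neighboring perspectives share most of their content, so compressing one base view plus a mostly-zero residual beats compressing each view independently. A toy sketch with synthetic byte "views" (zlib stands in for a real video codec here; the data and sizes are purely illustrative):

```python
import zlib

# Two neighboring perspectives: nearly identical byte streams (toy 8-bit data).
view_a = bytes((x * 7) % 251 for x in range(10_000))
view_b = bytes((b + (1 if i % 100 == 0 else 0)) % 256 for i, b in enumerate(view_a))

# Naive: compress each view independently.
independent = len(zlib.compress(view_a)) + len(zlib.compress(view_b))

# Inter-view: compress the base view plus the (mostly zero) residual.
residual = bytes((b - a) % 256 for a, b in zip(view_a, view_b))
inter_view = len(zlib.compress(view_a)) + len(zlib.compress(residual))

print(independent, inter_view)  # the residual stream compresses far better
```

Multiview video codecs apply the same idea with motion/disparity compensation between views instead of a raw byte difference.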
@cosmotect
@cosmotect 3 years ago
This channel is basically a window into the future
@frankiesomeone
@frankiesomeone 3 years ago
Check out "Welcome to Light Fields" on Steam for VR
@cybisz2883
@cybisz2883 3 years ago
Two Minute Papers is a bit late. You've been able to download and watch these exact same light field videos in VR since SIGGRAPH 2020 in August, via the GitHub link in the description. That said, they are extremely impressive! They're the only 6DOF VR videos you can see today.
@deaultusername
@deaultusername 3 years ago
This reminds me of a Kickstarter camera where you could change the focus of an image after taking it. No idea what happened to that tech, but it's very similar.
@ewerybody
@ewerybody 3 years ago
*braindances* seem much closer now than 2077 🤩
@TheGoodContent37
@TheGoodContent37 3 years ago
Of course this will eventually be done with a bunch of tiny drones with wide-angle lenses that position themselves around the subject and follow it. I give it 5 years tops until it is commercially available.
@ikannunaplays
@ikannunaplays 3 years ago
So if I had eyes like a fly, this is what it would be like to see through them. Amazing.