
Light-field Camera - Computerphile 

Computerphile
2.4M subscribers
202K views

Published: 4 Aug 2024

Comments: 280
@batfan1939 8 years ago
"…I looked through dozens of articles on how it works, and none of them told me." This made me lol.
@IceMetalPunk 8 years ago
Interesting. When I first heard of Lytro cameras, I suggested this tech be used with eye-tracking so that a video would always focus on wherever the viewer is looking, thereby mimicking real vision and making the scene more immersive. I still think that would be awesome.
@carrotSnack 8 years ago
+IceMetalPunk Even better, use a light field display. It'd be like looking through a window.
@carrotSnack 8 years ago
+IceMetalPunk Head mounted ones do. But normal viewing LF displays have been made that are around a metre tall using hundreds of projectors. You can even buy small displays like LEIA 3D, though they call it a holographic display.
@IceMetalPunk 8 years ago
carrotSnack Leia is not a light-field tech, it's a 3D tech. It doesn't adjust its focus at all, it just sends a left/right image to each eye. Light-field displays need many images from many angles, not just two. As for the meter-tall light field displays, do you have a link to one? I've only been able to find info about head-mounted ones.
@LinkEX 8 years ago
+IceMetalPunk Dang, that sounds like a groundbreaking change for movie media in the future.
@IceMetalPunk 8 years ago
Mitchell Gillespie No no, you're misunderstanding. I'm not talking about focusing the *camera* based on where the *shooter* is looking. I'm talking about refocusing the existing video based on where the *viewer* is looking. As in, refocusing after the film has been taken, such as with light-field tech like this video describes, but choosing the focal point based on where the viewer is currently looking with eye-tracking software. Both technologies have been around for some years now, but I've never seen them combined, and I think it would be amazing if they were.
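A minimal sketch of how that gaze-contingent refocus idea could be wired together, in Python. The eye tracker, depth lookup, and light-field refocuser are all passed in as functions because their APIs here are hypothetical; this is only the glue loop, not any vendor's implementation:

```python
# Gaze-contingent refocus: every frame, read the viewer's gaze point, look up
# the scene depth under it, and re-render the frame focused at that depth.
# get_gaze, depth_map_for and refocus_frame are placeholder callables.
def gaze_refocus_loop(get_gaze, depth_map_for, refocus_frame, frames):
    """get_gaze() -> (x, y) pixel; depth_map_for(frame) -> 2D depth array;
    refocus_frame(frame, depth) -> image refocused at that depth."""
    for frame in frames:
        x, y = get_gaze()
        depth = depth_map_for(frame)[y, x]      # depth under the gaze point
        yield refocus_frame(frame, depth)
```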
@ANTIMONcom 8 years ago
So CSI can finally enhance photos?
@EugeneKhutoryansky 8 years ago
Very interesting way to refocus an image. Thanks for the video.
@JMPDev 8 years ago
At about 1:00 he mentions that he'll talk about the more industrial variations of this technology "for a little bit". I'd be interested in seeing that footage if it exists :)
@Computerphile 8 years ago
Coming soon! (needs a bit more editing!) >Sean
@JMPDev 8 years ago
+Computerphile fantastic! looking forward to it :)
@xponen 8 years ago
found "light-field microscopy": graphics.stanford.edu/papers/lfmicroscope/
@tbg10101 8 years ago
+JohannesMP See the most recent video! It just came out.
@FlesHBoX 8 years ago
I wasn't ever really very interested in the light field cameras, seeing them as more of a gimmick, but seeing the tech behind them makes them incredibly interesting! Awesome video!
@CatnamedMittens 8 years ago
Dude's hair is on point.
@JoshKemmerer 8 years ago
+CatnamedMittens “Michael Bialas” Yeah, I'm jealous
@CatnamedMittens 8 years ago
***** 2fab4me
@deus_ex_machina_ 7 years ago
CatnamedMittens How are you everywhere?
@logitech4873 8 years ago
I want to see this technology being used in very strong macro lenses, where you have to deal with extremely shallow DoF by doing focus stacking. This tech could reduce such a photo down to only one image, and would make things a lot easier.
@Gamehighlight2023 2 years ago
It has macro from 0.0 mm; you cannot get better macro than this on a regular camera, so it is something.
@peterlinddk 8 years ago
Very interesting - I hadn't heard of light field cameras before, and now I really want one! Also, probably the best (shortest, most concise, most accurate, most memorable) explanation of why some things are in focus and others are not - thanks for that!
@Holobrine 8 years ago
Can you put everything in focus at once?
@gfx2006 8 years ago
+Holobrine Yes, you can. In fact, you can define a range of depth and objects within that range can be set to be in focus.
@JakeWitmer 3 years ago
@@gfx2006 I imagine this would have to shift hard-edged boundaries, so you'd have to arbitrarily choose which ones to 'prioritize' would it not? (This could be done "automatically arbitrarily" I suppose.)
@hughiemac 7 years ago
Great video -- stable image, not shaky, yet cutting made it interesting without losing information. More like this!
@shadfurman 8 years ago
I remember hearing about this years ago, I think it was at an MIT research lab. They were actually using a perforated piece of paper like a bunch of pinhole cameras instead of a micro lens, but essentially the same thing. I'm kinda surprised it took this long to come out with a commercial product.
@coooooooooool1000 8 years ago
This came out a few years ago
@terryallen9546 6 months ago
Wow! After weeks of watching and reading, I finally found a helpful Lytro video. I haven't snapped a single picture yet because I wanted to understand the camera first. Thank you. I have both cameras.
@TheBluMeeny 8 years ago
I love these hardware videos, please do more!
@BrianIrwin 8 years ago
Great video, explained something that has hurt my head since I first heard of it. Thank you.
@mrhappy192 8 years ago
You guys should do a video on Li-Fi :)
@Navhkrin 8 years ago
+MrHappy I totally agree. Although most of us have already wiki'ed it, I would like to hear from an actual professor.
@amusik7 8 years ago
Fantastic video! Finally understand how this technology works. Simpler than I thought - super clever however!
@approachableactive 7 years ago
Beautifully explained!
@jonathanpeck 8 years ago
How fascinating, I'd not even thought this kind of optical trickery would exist! :)
@power-max 8 years ago
Ahh, that kind of reminds me of the grid mask used in front of color CRTs that allows each electron gun to only affect the subpixels for that specific channel! Basically the individual holes in that mask act like pinhole cameras, so that with the way the thing is angled, the electron gun for the green channel can only reach the green phosphors on the screen!
@ulilulable 8 years ago
Finally! Thanks for explaining this!
@AJyoutubes 8 years ago
thank you for the video. well explained.
@sullivan3503 8 years ago
Could you focus the entire image rather than just a region? I think that would be the coolest feature.
@vgfxworks 4 years ago
If it's focus stacking you mean, I can't see why not, and with this same system.
@alexwang007 3 years ago
YES you can. In real life you would need a tiny aperture (like a phone camera) to focus nearly everything, but you lose light. In the Lytro software you adjust the effective aperture to a tiny value and achieve total focus without losing light. BUT in conventional photography people think that a big aperture looks better (more blurry foregrounds and backgrounds).
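Once the raw sensor data has been decoded into sub-aperture views, the "effective aperture" described above corresponds to how many of those views you sum. A minimal shift-and-add refocusing sketch in Python/NumPy, assuming an already-decoded 4D light field L[u, v, y, x]; the array layout, function name and parameters are illustrative, not Lytro's actual pipeline:

```python
# Minimal sketch of shift-and-add refocusing from a decoded 4D light field
# L[u, v, y, x], where (u, v) indexes the sub-aperture view and (y, x) the
# pixel. Shrinking the synthetic aperture keeps more of the scene acceptably
# sharp, at the cost of averaging fewer samples.
import numpy as np


def refocus(lightfield, alpha=1.0, aperture_radius=None):
    """lightfield: float array of shape (U, V, Y, X), one colour channel.
    alpha: relative focal depth; 1.0 keeps the original focal plane.
    aperture_radius: how many sub-aperture views (from the centre) to sum;
        None uses the full aperture, 0 gives a pinhole-like all-in-focus image."""
    U, V, Y, X = lightfield.shape
    cu, cv = (U - 1) / 2.0, (V - 1) / 2.0
    if aperture_radius is None:
        aperture_radius = max(cu, cv)

    out = np.zeros((Y, X))
    count = 0
    shift_scale = 1.0 - 1.0 / alpha  # how far each view slides before summing
    for u in range(U):
        for v in range(V):
            if (u - cu) ** 2 + (v - cv) ** 2 > aperture_radius ** 2:
                continue  # outside the synthetic aperture
            dy = int(round(shift_scale * (u - cu)))
            dx = int(round(shift_scale * (v - cv)))
            # integer-pixel shifts keep the sketch dependency-free;
            # a real implementation would interpolate sub-pixel shifts
            out += np.roll(lightfield[u, v], (dy, dx), axis=(0, 1))
            count += 1
    return out / max(count, 1)
```

Setting aperture_radius to 0 keeps only the central view, which is the pinhole-like, nearly-all-in-focus case described in the comment above.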
@Carrosive 8 years ago
Lytro have something coming out that apparently captures every light ray in a room and using a VR headset you can look around like you were actually there in photographic quality and can move to look around corners
@AlfredoRius 8 years ago
Finally a nice explanation! Not that I was deeply researching the subject but normally I can get my head around of how stuff works, but this wasn't that intuitive...
@pedrocollado935 6 years ago
Then this image can be shown on a light field display, and you can focus on the background or the foreground with your own eyes, just like a real live scene.
@KISHORENEDUMARAN 3 years ago
Awesome video. I could only comprehend it without the math part, but this is a very well explained video and interesting tech.
@Inertia888 7 years ago
wow they're using fractals for photography tech! This is awesome!
@DanDart 8 years ago
ugh, proprietary software? I'm interested in getting one when someone makes an open source version.
@_Norran 8 years ago
+Dan Dart this could be something for you :) watch?v=p2w1DNkITI8
@DanDart 8 years ago
+Maik Kluwe thanks, I didn't find anything quite there yet that I could download and complete though
@michaelsorrow7490 8 years ago
+Dan Dart I'm glad I'm not the only one who thinks so.
@TheFakeVIP 6 years ago
Definitely!
@kevind814 8 years ago
This is fascinating. Would be a great device to be sold at a Sharper Image store.
@PixelCortex 8 years ago
10:19 Fin the Human x2
@Ensivion 7 years ago
Idea: Attach one of these cameras to a drone that takes pictures of 3D objects and then try to reconstruct them in software, maybe even doing machine learning to teach a program to detect a certain object. You no longer have to worry about the machine itself focusing on the objects since the camera captures all the data itself.
@twistedyogert 10 months ago
And you don't need two separate cameras to capture a 3d image.
@hongt1930 7 years ago
The best explanation of light field cameras.
@manis404 8 years ago
@10:20 He was like "You know...optics :D"
@osenseijedi 8 years ago
Could you use this to make 3D pictures? Generally speaking you need two viewpoints to construct a 3D picture, but this technique seems to show you only need to vary the focus point to get the distance.
@awdasko 8 years ago
How about using this tech together with 360 high-res video for VR experience application? You would basically look at different things inside your vr set and they would come into focus, making the whole thing much more believable. You would need to be able to precisely follow the eye movement within the vr set, but still, seems doable to me.
@MustafaOzanAlpay 8 years ago
brilliant!
@TomHamRomero 8 years ago
Could you link a large number of these cameras together and create an image with a wider parallax? Kinda like creating another micro lens, or macro lens.
@IARRCSim 1 year ago
What file format is used for the live photos at 11:40 to 11:45? Is it a conventional format like jpeg but segmented in multiple rectangles somehow?
@scheimong 8 years ago
Sounds like they took the idea behind the compound eye and developed this camera. This is some amazing technology. At first I thought it would just be some clever software gimmick that makes the camera take many photos in a short time, but I'm delighted to see I'm wrong. If this technology gets integrated into VR applications, combined with eyeball tracking, depth of field would be much more realistic. A simple example would be Google Streetview. The future looks bright.
@SparkysBarelyMusic 8 years ago
We're getting one step closer to Deckard's camera from Blade Runner. :)
@antiHUMANDesigns 8 years ago
Wait a second, does this mean that the information in such a picture could be used to create a 3D model of the "depth"? I mean, it seems that the image contains depth information, effectively.
@BogdanManciu 8 years ago
Yes, that is possible, but the effective depth is a proportion of the baseline, which is very narrow. For comparison, TV gesture recognition uses a baseline of about 15 cm to recover depth at 1-5 metres in the living room.
@Hunnter2k3 8 years ago
+antiHUMANDesigns As Bogdan said, it is fairly limited in how much depth you can get from the image. The resolution isn't high enough to do something like that on such a small camera, you'd need a much larger setup for that sort of stuff. One thing I would love to see is a setup where it uses 2 light-field cameras to create a scene. It would allow for far better resolution of the full light-field and can be used to create 3D models of a scene (of what can be seen, as usual. damn 3Dness of the universe) Combined with some software, you could use it as a freeform 3D scanner that doesn't lose any focus as you scan the object in question. No more expensive rigs with motors, just hold it in your hand and get spinning.
@antiHUMANDesigns 8 years ago
Bogdan Manciu Yes, the information would be quite... low quality, I suspect. You'd probably get a rather jagged model, but it could probably be smoothed.
@antiHUMANDesigns 8 years ago
Kris Johnstone Yes, I agree, I was just speculating about the possibility of doing it with just one camera.
@ProGamer1515 8 years ago
+Kris Johnstone You could possibly tape multiple cameras together into a 5x5 square. It would be an interesting experiment to see what result you'd get.
@jameslamb4573 8 years ago
If you have parallax information surely a 3D image can be produced?
@justmauldie 8 years ago
the HTC One M8 does something similar (effect, not implementation), no? but with two cameras?
@timandersson8289 8 years ago
Amazing
@tomsparklabs6125 8 years ago
Second!
@AJyoutubes 8 years ago
Could a comparison be made between a human eye and a bug eye? (classic vs Lytro)
@youtubehandlesareridiculous 8 years ago
Can you put the whole image in focus at the same time?
@trolledyou7032 6 years ago
Nikon, Canon, Fuji, Sony.. watch and LEARN, because this is BRILLIANT!
@johnnyswatts 4 years ago
The problem is that everyone wants higher resolution pictures and this technique actually lowers the resolution. That's the trade-off for 3D images.
@noxiouspro 6 years ago
9:03 is the best explanation.
@Bormeir 8 years ago
2:17 subliminal messaging? :P
@Lightningblade67 8 years ago
+Bormeir Looked like some graph
@Computerphile 8 years ago
+Bormeir Oh yeah - that's a proper glitch! - sorry about that - hope it's relevant to the video!! >Sean
@WickedMuis 8 years ago
+Bormeir Illuminati confirmed =)
@jonathanmarino7968 8 years ago
+Bormeir I managed to pause it at the right frame. It's the figure from 4:41
@kahrkunne3960 8 years ago
+Bormeir It's the graph that shows how an ordinary camera works, the one with the
@kimberleyhayes326 3 years ago
I have purchased a Lytro, but I cannot find the software to view the images. The link sends me to Raytrix, but they said they do not support it... any suggestions where I can get a download?
@disc_priest 8 years ago
What about CMOS sensors? Are CCDs the only sensors?
@sbalogh53 8 years ago
3:41 . OMG. I have not seen line printer paper like that for almost 20 years.
@Computerphile 8 years ago
Clearly not watching our channel often enough 😉 >Sean
@DFX2KX 8 years ago
+Dexxter I've seen the paper in reams, it's a working line printer itself that I'd like to see.
@sbalogh53 8 years ago
+DFX2KX Same for me in a way. I have a carton of 2000 brand new 80 column punch cards in a sealed box, but I have not seen a working (or non-working) card reader since around 1990. I have been using the cards as notepaper for years after our university dumped all computer cards. When I got to the last box, I decided to keep it in case it became interesting for later generations. My kids will probably throw it in the recycle when I die. Oh well.
@biohazara 8 years ago
11:24 explains why in this current form, this is not the future of photography... yet. For now it's completely closed and proprietary. We need an open standard for this and device agnostic tools (ideally open source ones).
@BogdanManciu 8 years ago
+zaco21 Remind me again who owns the Nokia patents now?
@Hunnter2k3 8 years ago
+biohazara The problem, besides the patent issues, is the fact that it is still fairly low-resolution. That is more an issue with physics than anything else. A higher-res system will require either considerably more dense CCDs or a larger CCD. (so, in turn, larger camera) Larger camera will be a lot cheaper, which will then turn people off from it. (besides the really hardcore photographers, who would almost certainly welcome it) The CCD density issue will mean a sharp increase in price since it isn't just that which is increasing in density, you are also going to need to increase the density of the microlens grid to improve its accuracy too. Both of them work together to define the actual effective resolution that can be used for images.
@willisdagrillis 8 years ago
+zaco21 also it's not cheap; reminds me of FLIR cameras
@biohazara 8 years ago
+Kris Johnstone +zaco21 I completely agree this will take time to be something usable in the "real world". However, patents and the lack of open standards make progression even slower, much like 3D printing wasn't a thing until some of the patents expired.
@oBCHANo 8 years ago
+biohazara It will never be the future of photography. No matter what advances are made they are always going to be splitting a normal sensor up into a lot of tiny ones. That is always going to have terrible low light performance and it's always going to be very low resolution in comparison to a normal camera. I would assume it's going to be incapable of achieving the same FPS as a normal camera too considering how much more information it needs to process. It's impossible for it to ever catch up.
@JustinKoenigSilica 8 years ago
Can you talk about RAW processing? Also, I need a program that can make HDR... I have Photoshop, but I don't have Lightroom...
@basilehenry 8 years ago
+Adrian Freund Darktable is great! But I'm not sure it's available for Windows...
@basilehenry 8 years ago
+Justin Koenig To make an HDR I would recommend Luminance HDR and then photoshop/gimp/darktable to tweak it (Luminance HDR can take several types of RAW input!)
@basilehenry 8 years ago
I'm sorry to hear that, it works pretty well for me, no exposure problem (you can set them manually though if you can find the values in another software). It even has some great batch processing capabilities if you want to create hundreds of HDRs at once. It is very cpu demanding so I do understand that it may not be for everyone. I would advise finding the right profile/settings at a low resolution and only when you're sure let your computer run overnight to produce the high res of the batch of photos. Also make sure to enable multithreading if your cpu has multiple cores.
@ichhassdievoll 8 years ago
+Justin Koenig You could try Lightzone; I played a bit with it some time ago. It seems to do what you want.
@PeJayCee 8 years ago
+Justin Koenig you can use Photoshop to make an HDR...
@wktodd 6 years ago
Could the micro lenses be replaced with simple apertures, i.e. cheap pinholes?
@johnnyswatts 4 years ago
Yes, and the mathematical models often assume that they are!
@Morkvonork 8 years ago
So it is the effect of lenticular lens holograms turned backwards. oO I thought it takes multiple pictures in a short time while moving the CCD back, thus getting a picture with different focus layers that you later compile into a single picture.
@notanimposter 8 years ago
+Morkvonork You could do it that way but I get the sense that it'd be mathematically very difficult to figure out what was going on and reconstruct the rays. It'd also be quite complicated to move the CCD that quickly without changing its orientation at all. You'd be better off using a lens with a variable geometry to bend the light at a slightly different angle.
@ventzcn 8 years ago
Would it be possible to focus every area of the image so the entire image is in focus, if that makes sense? Would just be weird to see xd
@zolika1351 6 years ago
Could you reconstruct the image in 3d this way?
@johnnyswatts 4 years ago
zolika1351 Absolutely.
@DynoosHD 8 years ago
How much more space is needed to store this information compared to a normal image of the same resolution?
@michaelpound9891 8 years ago
+DynoosHD Theoretically the same, but in practice the files are larger. I don't know the exact details of their storage format, but while you'd imagine it's just the same size as a normal image (because it's still a regular CCD behind the microlens array), they can't do as much compression because that would risk affecting the image reconstruction. My Lytro images were about 10-15 MB, and the newer Illum camera is closer to 100 MB per image.
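A back-of-envelope estimate of why raw light-field captures land in that size range (illustrative numbers only; the actual Lytro container format adds metadata and uses its own packing):

```python
# File-size estimate for a raw light-field capture, assuming a 40-megapixel
# sensor and a 10-bit Bayer readout (assumed figures, not a spec).
sensor_pixels = 40e6          # Lytro Illum class sensor
bits_per_sample = 10          # typical raw bit depth
raw_bytes = sensor_pixels * bits_per_sample / 8
print(f"uncompressed raw: {raw_bytes / 1e6:.0f} MB")   # ~50 MB
# Only light, near-lossless compression is safe before the ray reconstruction
# step, so shipped files stay in the tens of megabytes.
```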
@johnsixfiveohtwo1354 8 years ago
If you ditch the main lens and build a very large sensor panel, with some tweaking it could function as a 3D camera. You'd need a special monitor to display the result, but you wouldn't need 3D glasses.
@bouzou96 6 years ago
can you guys do a video on bokeh filters?
@Guitarm4n99 8 years ago
Please help.. I don't get why you can't just reconstruct the image in the same way without the micro lenses by just keeping in mind that a certain amount of focus on the aperture will have light coming from specific sub-apertures go to specific places... He said something about how since it's not already focused, you have a lot of background noise or something... Let me know if you do.
@multilevelintelligence 8 years ago
Is it possible to extrapolate the shape of things out of this?
@legotechnic27 8 years ago
so is it possible to focus on every part of the image at the same time?
@coooooooooool1000 8 years ago
I played around with images that other people shot; I think you can.
@AdrianReef 8 years ago
I know *everything* on the subject in the video but I wanted to give it a go... Then, all of a sudden, 11:15 !! And I was like "You wish!" X) This whole explanation is purposely confusing and makes a burden of simple concepts which could have been described differently. I recognize that this isn't an "in depth" explanation but in +-12 minutes the whole algorithm can be shown easily and it's not "fairly complicated" but relatively simple if you have used it at least once in your life. :P And if you happen to end up in a "black box" it's because you need to search for "integral imaging" to see some relevant results on the matter and you'll discover that "the black box" is actually public knowledge by more than a century. :)
@YingwuUsagiri 8 years ago
So the HTC One M8 has one of these? But it has 2 cameras so how does that work then?
@YingwuUsagiri 8 years ago
Graham Smith Any picture you take can be parallax-shifted like in the video by tilting your phone (the "3D" option), and any picture can be refocused afterwards, pretty much identical to how it looks in this video. You can also upload it to an HTC website page and send that to a friend so they can refocus online. It's pretty amazing, but I don't know if it's the same.
@wobblycogsyt 8 years ago
+Niels Schellekens I've just had a quick look to see if I could find how the M8 refocusing works, I didn't find a detailed description but I think my first reply was basically right. It seems the two cameras have quite different numbers of pixels. There's a main camera to take the photo and the second camera just captures depth information at a lower resolution. I wonder if they are having the second camera capture multiple shots at different focal points in order to gather the information necessary to correct the focus in the main image.
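If the second camera really does only supply a coarse depth map, the refocus effect can be faked after the fact with depth-guided blur rather than true light-field refocusing. A minimal sketch of that idea, assuming a precomputed depth map (e.g. from stereo disparity); this is a generic illustration, not HTC's implementation:

```python
# Depth-guided "fake" refocus: blur each pixel more the further its estimated
# depth is from the chosen focal plane.
import numpy as np
from scipy.ndimage import gaussian_filter


def depth_refocus(image, depth, focus_depth, max_sigma=8.0, layers=8):
    """image: (H, W) float array; depth: (H, W) in the same units as focus_depth."""
    spread = np.abs(depth - focus_depth)
    spread = spread / (spread.max() + 1e-9)          # 0 = in focus, 1 = max blur
    # Pre-blur the whole frame at a few strengths, then pick one per pixel.
    sigmas = np.linspace(0.0, max_sigma, layers)
    stack = np.stack([gaussian_filter(image, s) for s in sigmas])
    idx = np.clip((spread * (layers - 1)).round().astype(int), 0, layers - 1)
    return np.take_along_axis(stack, idx[None, ...], axis=0)[0]
```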
@majorgnu 8 years ago
+Niels Schellekens The video shows parallax along two dimensions. You can only do it along one, along the axis defined by the two lenses, right?
@YingwuUsagiri 8 years ago
Major Gnuisance It's tilting upwards and sideways if that's what you mean.
@YingwuUsagiri 8 years ago
Graham Smith Would be nice if someone knew how it worked. I'm far from a camera techie, but I'm very much enjoying this series of Computerphile.
@fishingtrippy 1 year ago
What if you made a large main lens and multiple layers of micro-lenses?
@DrRChandra 8 years ago
so why change the focus? Bring the whole picture into focus. Yes, it's a bit unnatural, because an eye could never do that (unless the iris was stopped down to a pinhole), but so what? Everything in view would be clear. Or would that be mathematically impossible?
@ChristopherMeadors 8 years ago
+rchandraonline It should be possible, but not with the microlens array itself. It would take additional post-processing, along with edge detection, to know when a part of the image was at peak sharpness while working through the various focal depths. So basically a program would start at one end of the refocus range and rank each part of the image as to how sharp it is. Then step to the next depth, and rank it again. Do that for the full range, and then blend the maximally sharp areas together. There will be even more work required to deal with occlusion of a part of the scene by the out-of-focus blur of a nearer object, though that may fall out automatically from the sharpness evaluation.
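A minimal sketch of that sharpness-ranking blend (the same idea as classic focus stacking), assuming you already have a stack of images refocused at different depths; the Laplacian sharpness measure and window size are arbitrary choices for illustration:

```python
# All-in-focus blend: score local sharpness with a Laplacian, then keep each
# pixel from whichever refocused depth scored highest there.
import numpy as np
from scipy.ndimage import laplace, uniform_filter


def all_in_focus(stack):
    """stack: (D, H, W) array of the same scene refocused at D depths."""
    # Smooth the per-pixel Laplacian magnitude a little so the decision
    # isn't made pixel-by-pixel on noise.
    sharpness = np.stack([uniform_filter(np.abs(laplace(img)), size=9) for img in stack])
    best = np.argmax(sharpness, axis=0)              # (H, W) index of sharpest depth
    return np.take_along_axis(stack, best[None, ...], axis=0)[0]
```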
@yawor 8 years ago
+Christopher Meadors I think the Lytro software can do that easily. You can see it at 11:45 on the macro picture of the fly: when he clicks on the fly's rear leg, the whole picture is focused for a really brief moment (it's easier to see if you slow down the video). Also, the technique you're talking about is called focus bracketing or focus stacking and can be done even on classic cameras, but to do that you need to take multiple pictures with the focus slightly changed on each of them. Then you can use focus stacking software to process the pictures. You can achieve effects like a macro picture with a perfectly sharp background (the effect is stunning, especially if the background is really at some distance from the object being photographed). I played with this technique myself a few years ago with my Canon S5IS with the CHDK firmware. I ran a Lua script on it that took a pre-set number of photos with focus bracketing :).
@funkysagancat3295 6 years ago
Wasn't this done in Citizen Kane?
@bethanywatrous1774 1 year ago
Omg that old printer paper (4:19)! Who still has that? That's amazing! Oh yeah, and this tech is pretty cool too :D
@krishnansrinivasan830 8 years ago
When we talk about micro lenses, does that mean there will be a micro lens for each pixel?
@okaro6595 8 years ago
+krishnan srinivasan No, consider the Lytro Illum (not the one shown): it has 200,000 micro lenses and a 40 Mpix sensor. The exported image is, however, just 4 megapixels. So the very concept of a pixel is not that simple in a light field camera.
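Rough arithmetic behind those numbers (illustrative only; the real decode interpolates between neighbouring lens views rather than dividing cleanly):

```python
# Each microlens covers a patch of sensor pixels; the patch holds the angular
# samples, while spatial detail is roughly set by the microlens count.
sensor_pixels = 40e6          # Lytro Illum sensor
microlenses = 200_000         # one microlens per spatial sample
pixels_per_lens = sensor_pixels / microlenses        # ~200
angular_samples = pixels_per_lens ** 0.5             # ~14x14 directions per lens
print(pixels_per_lens, round(angular_samples, 1))
# The exported ~4 MP image comes from interpolating across neighbouring lens
# views, which is why it sits far below the sensor's 40 MP.
```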
@krishnansrinivasan830 8 years ago
Thank you 😃
@thumper5555 1 year ago
I wonder what file sizes are like.
@aryanarora7046 8 years ago
What did he say @1:12 "i am holding up a lightro mainly because.."??
@kliko 8 years ago
+Aryan Arora "I own one. Or let's say the school owns one." So he's holding this brand of camera up because he/the school has one.
@JohnnyDoeDoeDoe 8 years ago
+Aryan Arora "I am holding a Lytro mainly because I own a Lytro, or I should say, the school owns a Lytro. Uhm..."
@trip3commy 6 years ago
every bigfoot hunter should have one of these
@Fibonochos 4 years ago
I wonder how this would combine with photogrammetry.
@hermannyung7689 2 years ago
So after 6 years we have computational photography on iPhone! How are they different from each other?
@stevecummins324 8 years ago
Could someone use one of these very expensive cameras to take an image of the Jastrow duck-rabbit "illusion", and see if shifting the point of focus to in front of or behind the real image makes the image flip? I suspect such may be the case because if one places a thick glass sheet over a printed copy of the duck-rabbit, and then alternates the focus point between the top and bottom surfaces of the glass, the image can be flipped. There have been a few psychology PhDs which used binocular eye trackers, and which say that can't be an explanation; however, basic trigonometry shows the eye trackers used just wouldn't have had the angular accuracy to determine depth of focus any closer than +/-5cm over about 30 cm viewing distance.
@stevecummins324 8 years ago
+steve cummins The retina appears to be non-optimal, with light having to pass through nerve fibres etc. before hitting the light detectors. Could those layers be functioning as a micro lens array?
@Imrooot 8 years ago
I think it's the future of photography. Nearest future.
@brooped 8 years ago
+Imrooot At least for now the image quality is ridiculously bad compared to the traditional imaging. It's going to take a long while before professionals can start switching to these things.
@BogdanManciu 8 years ago
It's not yet commercially viable. This is the future of mobile phone cameras that are 1 mm thick and always in focus. Alas, the microlenses are hard to mass-produce, but give it some time and the materials engineers will figure it out.
@brooped 8 years ago
Bogdan Manciu The thing is really that I don't see out-of-focus pictures being such a big problem anyway nowadays. People are fully capable of using cameras with autofocus, so this technology is not really solving any pain points. It looks more like a "nice to have" kinda thing, probably nice if the image quality ever comes on par with the "normal" stuff, but not worth paying significantly more for.
@bennylofgren3208 8 years ago
+Imrooot Nearest and farthest future all in one image!
@qzorn4440 6 years ago
can the object depth be measured? ;~)
@Nulono 8 years ago
0:03 Which are what?
@andrewkiminhwan 8 years ago
Snot's Ham was the origin of the name Nottingham apparently, thanks Jeopardy! haha
@soufmaro502 2 years ago
It does not necessarily make sense doing it that way. It would also be possible just to increase the sensor size and lens size so they make a bigger focus area. The lens in front of the 3D lens would also have its limitations; it certainly has a point where it can't make a focus, depending on how big the lens really is.
@tonyrulez69 8 years ago
okay
@CatnamedMittens 8 years ago
I love his accent. Where's he from?
@CatnamedMittens 8 years ago
***** Thanks.
@zubirhusein 8 years ago
Would this be useful for astronomical purposes?
@zubirhusein 8 years ago
Quantum Custodian I was thinking more for the parallax
@unvergebeneid 8 years ago
+2chws For photographical purposes, astronomical objects are infinitely far away. This means you don't need to refocus and you also don't have any parallax effects, at least not by moving the camera a few centimeters. However, in radio astronomy a similar technique called beamforming is used to get a virtual directional antenna that you can redirect at different objects in post.
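For the curious, delay-and-sum is the simplest form of that beamforming idea: delay each element's recording so signals from a chosen direction line up, then sum. A minimal sketch with made-up array geometry and signals, not any radio-astronomy package's API:

```python
# Delay-and-sum beamforming: re-point an array of antennas (or microphones)
# in post by aligning each element's samples for a chosen look direction.
import numpy as np


def delay_and_sum(signals, positions, direction, sample_rate, c=3e8):
    """signals: (N, T) samples from N elements; positions: (N, 3) in metres;
    direction: unit vector to steer towards; c: propagation speed."""
    delays = positions @ np.asarray(direction) / c         # seconds per element
    delays -= delays.min()                                  # make all delays non-negative
    shifts = np.round(delays * sample_rate).astype(int)     # whole-sample delays
    out = np.zeros(signals.shape[1])
    for sig, s in zip(signals, shifts):
        out[: signals.shape[1] - s] += sig[s:]              # align, then sum
    return out / len(signals)
```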
@coooooooooool1000 8 years ago
+koseq7 what is that "&quot;" thing?
@coooooooooool1000 8 years ago
Penny Lane wow
@unvergebeneid 8 years ago
***** Yeah, the YouTube comments have accumulated quite a few bugs in recent times. But this one should be so easy to fix that it's the ultimate proof that no one in this company has ever come across it themselves, never mind some kind of QA department one could expect. As if a lot of the people here weren't bad enough, now we also have to fight the technology :/
@rkpetry 8 years ago
So in other words there are more pixels in this CCD than in a common CCD camera...? It'd be more interesting to use a 50% random mask (instead of a front lens), and do a 'hologram-like' inverse function on the CCD output...
@karlkastor 8 years ago
2:11 Lego Mindstorms!
@keylupveintisiete7552 8 years ago
I subscribed because he likes photography
@EE12CSVT 1 year ago
All very good, but really what's the point of it other than the gimmick of refocusing after the event?
@MaverickJeyKidding 7 years ago
This is like HDRI, only for DoF! P.S. What if you wore such glasses? What would you see? A complete mess, or totally DoF-independent sight?
@devjock 8 years ago
Ah so it's a camera taking 1000's of tiny pictures of the microlens array. Cameraception!
@ShinoSarna 8 years ago
Whoa, imagine what this technology could do e.g. with a fisheye lens. Or making 3d movies with a single lens.
@kyrianrahimatulla1561 3 years ago
Shoot a video of a still frame. Do a focus pull. Watch the video and pause it whenever the desired distance is in focus. Bim Bam Boom
8 years ago
This camera looks a bit like an oversized spy camera. I want it just because of that - and of course because of this convenient function.
@OneWorldLikeItOrNot 8 years ago
Seems modeled on the vision of a fly.
@vladimirivanov1562 7 years ago
I'm curious what would it look like if it was a real film rather than CCD.
@factsverse9957 6 years ago
I should bring this to school...
@Bladavia 8 years ago
"Optics, you know" Yup, this sums it up rather nicely.
@colonelbastian6036 8 years ago
why?
@sasuke2910 8 years ago
Interesting, but cameras can take images very fast nowadays; why wouldn't you just continuously take images at multiple focus levels? And if you wanted the parallax effect, then move the lens as well?
@Yggdrasil42 8 years ago
Because even very fast isn't fast enough when objects or the camera are in motion. You get motion blur. And combining the parallax effect with refocusing would be impossible as well.