The explanation of the microscope experiment contains a misleading simplification for the sake of understandability. Can you find which simplification that is?
Great video and amazing explanation. You have addressed a very important aspect of the angular resolution of Webb and clarified that Hubble actually gives sharper images than Webb, which previously was not widely understood: many gave the wrong impression that Webb is sharper than Hubble. However, Webb can collect more light and will therefore have a longer range. Nevertheless, you did not address that Webb's infrared spectrum is not absorbed as much as Hubble's visible spectrum by space dust, gases and nebulas. With Webb we will therefore see through these obstacles, which is a very important feature of infrared telescopy.
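For what it's worth, the resolution claim in this comment can be checked quickly with the Rayleigh criterion θ ≈ 1.22·λ/D. The mirror diameters and wavelengths below are nominal values assumed for illustration (Hubble at 550 nm visible, Webb at 2 μm near-infrared):

```python
import math

ARCSEC_PER_RAD = 180 / math.pi * 3600  # ~206265 arcseconds per radian

def rayleigh_limit_arcsec(wavelength_m, aperture_m):
    """Diffraction-limited angular resolution (Rayleigh criterion), in arcseconds."""
    return 1.22 * wavelength_m / aperture_m * ARCSEC_PER_RAD

# Nominal values: Hubble 2.4 m mirror at 550 nm, Webb 6.5 m mirror at 2 um.
hubble = rayleigh_limit_arcsec(550e-9, 2.4)
webb = rayleigh_limit_arcsec(2e-6, 6.5)
print(f"Hubble @ 550 nm: {hubble:.3f} arcsec")  # ~0.058
print(f"Webb   @ 2 um:   {webb:.3f} arcsec")    # ~0.077
```

Despite the larger mirror, Webb's longer operating wavelength gives it a slightly coarser diffraction limit than Hubble in the visible, which is exactly the point made above.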
What about the spacing between the mirror segments of Webb and the diffraction and sharpness problems this can cause? I believe the black substrate material behind the mirrors will absorb all these artifacts.
The way you showed us the wave propagating on screen doesn't make a lot of sense to me. I'm not entirely sure how you did that, to be honest; was it by changing the focal length?
Explaining that the light pattern outside the aperture looks like a wave because that's the only way light can travel is misleading -- shining a laser across a flat surface wouldn't look wavy, even though it's also traveling as an electromagnetic wave. The wave pattern is an interference effect, with the wavelets originating at the edge of the aperture.
@@SodiumNitrateBot Yes, Lucas, spot on. I am suggesting in that part that what you are seeing is the actual EM wave moving outwards and inwards, which is of course not true. You are still just seeing diffraction. However, because you can only see diffraction to the point that the field extends, there will still be some relationship between the extent of the energy redistribution and the extent of the diffraction pattern. I left this in to let people discover this themselves. @Alex Domatas the patterns are recorded by changing the focal position of a microscope with a very shallow depth of focus. In this way you can observe what happens at a specific "slice" of space. I used this same method in previous videos, such as the one on photon sieves.
Your videos are always great! Yes, I'd be interested to know about segmented mirror telescopes. It does seem like diffraction would be a problem. Maybe they use similar tricks as semiconductor masks?
Thanks Ben. Yes, as long as the diffractive edges are regularly spaced, they represent a very specific frequency component in the Fourier transform. I'm not aware of how exactly they do it; you could imagine that it is some kind of low-frequency spatial filtering. That might also explain why the outer perimeter of the mirror has a hexagonal shape and has not been rounded: only then can the same filter be used for both the segment spacings and the edge.
@@LarsBerntzon In chip masks it works differently. The precise mask pattern is actually irrelevant, it's only about the light distribution at the photoresist layer. So you use computational methods to optimize this intensity profile by using specific diffraction effects and write the patterns to achieve this onto the litho mask.
@@HuygensOptics Ah, I see. So if we could go to the end of the universe and place a really big specially crafted mask there we could get really good images in our telescopes :)
After several weeks of James Webb news, I needed a fresh dose of Huygens Optics. Your videos are so well made that an interested novice, like myself, can follow along.
That video was actually how I discovered your great channel exists; after it I found your great content, which has helped me a lot with many similar projects in my lab. Thanks again.
This video was better than I expected based on the title. I know that formula for aperture and wavelength. How can you make that into a 20-minute video? It's just one formula. Maybe think about a better title: "James Webb vs a 20mm telescope".
It's useful for an expensive telescope to be able to produce pretty pictures that appeal to the public and politicians... I also like looking at them. The hubble deep field images weren't considered as high scientific priorities when they were made but they've been important in keeping the public onside with spending on space telescopes. I agree that the main job of Webb will be spectroscopy - unfortunately spectroscopy always seems to look like graphs and spreadsheets.
I thought the Hubble deep fields weren't considered scientific before they were taken, because they were pointed at what was thought to be nothing, so nothing was expected. Once they took it, I think a lot was learned. The public hype was a bonus.
20:13 Hubble has actually recorded quite a few of these deep field images. Some with a longer exposure time than 140 hrs. As far as I know, the ‘Hubble eXtreme Deep Field’ has an exposure time in the order of 20 days. The so named ‘Hubble Legacy Field’ even amounts to 250 days (!) of exposure, but I believe that’s a mosaic.
Hey Jules! Long time no see! It could be that longer exposures have been made by now, but I can only find the Webb Deep Field, which has a total exposure time of 12.5 hours. Still quite spectacular. I have to say I only catch the highlights of Webb and follow it only from the sidelines; there are simply too many interesting things. Cheers!
@@HuygensOptics Hi Jeroen :) Long time indeed! I do quite a lot of astrophotography these days (not yet on RU-vid, by the way), so I'm used to those long exposures; 'integration time' is what it's usually called in astrophotography. Integrations of more than ten hours are definitely not unusual for amateurs. Of course, we collect that light in individual exposures of a few minutes each and combine everything with software. Brilliant track, by the way, those last few minutes! I couldn't stop laughing 😂
@@JulesStoop I've been a member of the Almere observatory (SSA) for several years now. It also has some fanatical astrophotographers who travel to the south of France every year to take beautiful pictures. Very nice images indeed come out of that, but often with night-long exposures. I still have a 50cm F:3 lying around that I want to convert for astrophotography. But yes, time is the limiting factor for me. I hope to make a video about it next year.
@@HuygensOptics Ah yes, to the Pic du Midi :) Wonderful. Even a childhood dream of mine. 50 cm at f/3 is bonkers! What optical design do you want to use? A corrector plate seems impractical at 50 cm, so you would have to make hyperbolic elements(?). In other words: I'm very curious about that conversion :)
I’m not aware of any other RU-vid channel that has your combination of approachability for an optics beginner like me, combined with depth of technical detail if I make the effort to study it. Thanks so much, and please keep the videos coming.
Living in Tucson Arizona and surrounded by a large astronomical and optical research community I have had the opportunity to talk with many optical researchers and astronomers. You sir have an exceptional ability to communicate the science of optics. I appreciate your efforts and hope you continue as I have learned a great deal from your videos. cheers
I'm a hobbyist in Tucson. What approachable communities do you all speak of? I've got an 8" Dobsonian and some great books, but I need more stellar friends! :)
Most schools of engineering are like that these days: simply focused on cramming as much information into a student's head as possible, without regard for actually understanding it.
@@Obsidian0Knight I can imagine that when they were deciding the budget for JWST somebody told them "ok, we can give you enough money to make a telescope as good as hubble but in infrared, you are saying that it is a hubble replacement anyway, just in infrared, alright? We won't fund anything bigger than that".
@@TheoEvian Because the first object from JWST to be posted on Reddit will be compared side by side with the image from Hubble, and every crackpot will claim JWST is a waste of money as it produces worse images. So I guess NASA cares about Reddit karma?
@@Tore_Lund They do care about PR quite a lot but I don't think that would be the main reason. As I said, I can imagine the image quality being a part of the budget negotiations.
I had most of this material at university in a course on industrial imaging. At the time it was quite fascinating, and I am surprised how much of it I have put to use at work. It is so much fun to watch these videos and gain more info and insights on these topics. The quality of the content you make and put out there for free is really astonishing. I can't thank you enough!
Great explanation of these concepts. If you read any of the blogs on photography or cameras, you will be amazed at the confusion amongst photographers on the topic of "resolution" of lenses.
Ooooopssss, there goes resolution in the deeper infrared. That was a remarkably clear and visually superbly illustrated explanation, understandable for a layman like me. LOVED IT 💯!
Indeed. Not an oops though: it is a highly specialized instrument, and likely hundreds of hours were poured into deciding exactly where to focus the instrumentation. You cannot have it all, and in this case they preferred a good balance between visible and infrared, rather than deep infrared only without any visible light.
Just gotta say, I love this channel so much. Plenty of other channels offer shallow takes on similar topics, but this is the only channel I know that really dives deep into the subject of optics, and presents it in a manner that most people with an interest in science can learn from. Thanks so much for taking the time to make these videos.
Or you can look at it in terms of a spatial Fourier transform, translating between position and k-space. The spatial profile between the aperture/lens/mirror and the focus can be described by a fractional Fourier transform. This then tells you that the k-vector distribution created by the round aperture in front of a lens makes a focus which is its Fourier transform: an Airy disk.
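As a numerical sanity check on that Fourier picture: the first dark ring of the Airy pattern sits at the first zero of 2J₁(x)/x, at x ≈ 3.83, which is where the familiar factor 1.22 = 3.83/π in the resolution formula comes from. A stdlib-only sketch, computing J₁ from its integral representation:

```python
import math

def bessel_j1(x, steps=2000):
    """J1(x) via the integral representation (1/pi) * integral_0^pi cos(tau - x*sin(tau)) dtau,
    evaluated with the trapezoid rule."""
    h = math.pi / steps
    total = 0.5 * (math.cos(0.0) + math.cos(math.pi - x * math.sin(math.pi)))
    for i in range(1, steps):
        tau = i * h
        total += math.cos(tau - x * math.sin(tau))
    return total * h / math.pi

def first_airy_zero():
    """Bisect for the first positive zero of J1 (hence of the Airy profile 2*J1(x)/x)."""
    lo, hi = 3.0, 4.0  # J1 changes sign in this interval
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if bessel_j1(lo) * bessel_j1(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

x0 = first_airy_zero()
print(x0, x0 / math.pi)  # ~3.8317 and ~1.2197, i.e. the "1.22" in 1.22*lambda/D
```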
@@HuygensOptics Speaking of Fourier, would it make sense to make apertures with Gaussian or raised cosine edges? E.g. via gradient in optical density? Those would result in less ripple. Another question - just like we use equalisation in digital communication, would it be possible to compensate for some of the distortion caused by the finite aperture using digital filters?
I would have loved to have this person as my teacher at the University. It does not matter how hard or complex the topic is, he always finds the way to make it extremely attractive to me... and I am just totally ignorant when it comes to physics or optics. Chapeau!
I got a master's degree specializing in electromagnetic wave propagation and your explanation of the physical cause of edge diffraction is the best ever! It really brings together the mathematics and real world observations. And yes, what you show as "waves" in the microscope image are actually interference patterns, not the actual light waves. This is a good way to visualize what is going on with the energy transfer.
I am not expecting to see pin-sharp images of far-off structures as we do with Hubble; what we will get is plenty of spectroscopy data to throw our finest minds and our compute power at, and make mathematical models of what's out there... cheers.
I hate to say it but this video has deflated a bit of my excitement for jwst. Up until now I had the idea in my head of jwst being able to literally zoom further out, and to be able to resolve those more distant objects just as well as Hubble. I have been imagining resolved images tracing the first stars ever born, and then being able to watch step by step as they coalesce into galaxies. This video has made it clear that we won’t be getting those images. I get that there is still an immeasurable amount of new science sure to come from the observatory, I’m just a bit bummed that the super-mega-ultra deep field of my mind's eye is likely still one or two generations of space telescopes out…
That is something that has always frustrated me about the science journalism surrounding the James Webb. It has always been sold as a bigger/better Hubble, but it's not. It's an instrument with a totally different purpose. I have been worried for years that the general public might be disappointed when it "fails" to generate "better" results. Nothing against JWST. Just 100% a failure to explain what the goals and capabilities are. Which is a shame.
@@martinmckee5333 Most journalists aren't paid enough to care about anything. Most articles are written on short deadlines by people not qualified to write them, while they are working on several other similar articles besides. Under these circumstances, it's not surprising such incredibly poor coverage is given to every subject.
@@lobsterbark Not surprising, no. But still disappointing. I'm convinced that the quality of science journalism (or lack thereof), is a big reason for the current lack of excitement in and outright distrust of science by the general population.
Something with 377 ohms of impedance (so it generates no reflection) but that also has resistance to absorb the energy … the basic principle of stealthy radar absorption across the intended frequency range. EM waves are EM waves, but the materials involved behave differently at different frequencies.
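The 377 Ω figure mentioned here is the impedance of free space, Z₀ = √(μ₀/ε₀), which a two-line check reproduces (using the classical textbook constants):

```python
import math

mu0 = 4 * math.pi * 1e-7    # vacuum permeability, H/m (classical value)
eps0 = 8.8541878128e-12     # vacuum permittivity, F/m
Z0 = math.sqrt(mu0 / eps0)  # impedance of free space
print(f"{Z0:.2f} ohm")      # ~376.73 ohm, commonly rounded to 377
```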
@@elderbob100 Some mirrors have the outer edge curve slightly away, so that the bulk of the fringe rings falls outside the imaging reflection. Black paint is also used, which is no better than a sharp edge, but the edge of an astronomical mirror usually has worse imperfections anyway, so it helps.
Very nice video! I work in the field of optics and I got some amazing new perspectives from your videos. I think another advantage of the longer wavelength of the James Webb telescope is that you can observe through dust, which is not possible with visible light.
When you describe the effect of relaxing boundary conditions, you're correctly describing the physics of diffraction. But I would argue that Huygens' principle denotes something more specific, even though I agree that this isn't always understood when people present it as if it were an exact description of nature. Huygens gave an approximate description of (linear) wave propagation that has one fundamental physical ingredient: the superposition principle. This ability to form superpositions makes it possible (a) to consider every point as the source of a "spherical wave," and (b) to reconstruct the wave at any other point from the spherical-wave contributions that originated at other points. To do this right, you have to do certain integrals (using Green's functions). The construction in Huygens' principle is an approximation to this method that captures the main phenomena reasonably well (in three dimensions, but ironically not so well in two).
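A sketch of the integral construction alluded to above; the exact normalization and obliquity factor depend on which formulation (Kirchhoff or Rayleigh–Sommerfeld) you pick, but a common Huygens–Fresnel form is:

```latex
U(P) \;=\; \frac{1}{i\lambda} \iint_{\Sigma} U(Q)\,\frac{e^{ikr}}{r}\,K(\chi)\,\mathrm{d}S
```

Here U(Q) is the field on the aperture surface Σ, r is the distance from Q to P, and K(χ) is the obliquity factor, which among other things suppresses the backward-propagating wavelets.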
It seems like it should be possible to set limits on the possible granularity of spacetime by looking for gaps in the stretched wavelengths. Thank you for the thorough going over on aperture.
With NASA’s announcement today of the completion of the JWST’s alignment, I would absolutely love a very (very!) detailed video on exactly what alignment means - their video mentions both spatial and phase alignment, using FFTs, and I can’t think of anyone better on the internet to explain it than Huygens Optics!
It's the same principle as why you don't see the obstruction of the secondary mirror and the arms that hold it: they are not in focus. In the worst-case scenario, if the gaps were very big, they would form a slight gradient in the image (but I'm sure Mr. Huygens can explain it better). Edit: it could be that they cause some weird diffraction spikes on very bright stars, not sure about that.
I like your videos. But I don't like the overly broad 'the energy wants to transfer evenly' as an explanation for diffraction. I think wave interference is key to understanding the phenomenon. I believe it's not easily understood because the number of waves interfering isn't fully appreciated. It's important to note that 'wave cancellation', or destructive interference, does not actually destroy the energy or the waves. It simply prevents the transfer of energy (in a very practical manner, not due to a balance of energy or desire), by exerting an equal and opposite force onto a body; in this case, the electrons in your CCD. Imagine for instance that the source is emitting in all directions, but the emitter's aperture only creates constructive interference along a given path (directional gain). Destructive/constructive interference makes the beam of light appear to be directional. But energy exists outside of the main beam and side lobes. We don't see it due to cancellation, but it's still there. If you think about it from this perspective, when light propagates beyond a corner, the destructive interference that made it directional is no longer there. Thus the omnidirectional wave (that was always there) is revealed. In short: expanding waves of light only appear directional to us due to interference.
You saw that right and your comment is actually spot on (you are the first to acknowledge this). My description of the observed wave pattern as being the direct wave is rather misleading and was intended to visualize / simplify what you actually observe, which is still the interference of waves rather than the wave expansion itself.
I absolutely love your videos; every time I learn something new and exciting. Can't wait for the next one 😃😃 You are doing a great job of explaining things in a simple-to-understand and interesting way. 10/10 🥰
Good video. I am an amateur photographer. The facts laid out here are the reason that people claiming "My phone takes pictures just as good as your big camera" drive me mad.
Thank you for this extremely clear explanation. I find that experiments like Webb and LIGO are truly physical science marvels. There was, I believe, and still is an infrared telescope aboard a large aircraft flying at roughly 12 km. The older one was the Kuiper observatory. In this respect I am curious how much better Webb will be! It is awesome! I was not clear about the statement "in our part of the universe". I thought Holland was at the center. ;)
A remarkable presentation filled with teaching and discussion re: JWST. As long as you identify fundamental limitations to optical performance, JWST will be 'seen' as remarkable, with potentially new scientific data for cosmologists. Without belaboring points, a larger mirror could not have been launched today, and high redshift demands longer wavelengths. JWST will (hopefully) collect the only available photons from high-redshift objects - full stop. Your video is one of the most well prepared and rigorous I have ever seen; I particularly like the energy discussion. I will review Feynman's lectures again to compare with your presentation.
Thank you for making this video - it's extremely informative. I am working on my PhD in Astrophysics, and you're still educating me. The surface-tension analogy and explanation for the Huygens-Fresnel principle was particularly eye-opening.
That was a crazy good video! I would add that in quantum theory light is a particle of a given energy, where "wavelength" corresponds to the probability of a thing (mirror or camera sensor) interacting with such a photon (reflecting it if a mirror, or absorbing it if, say, a CCD), very generally speaking. Thus resolving power is based on how close the optics are to the "overlap" of probability areas. These probability areas could be imagined as a fuzzy sphere equal in size to the wavelength of the light, where the closer we are to the sphere's center, the higher the probability that the photon will be absorbed/reflected etc. For example: if a detector/mirror is so small that two separate photons landing on it have large overlapping areas, then "which photon" it was cannot be well distinguished (the uncertainty principle applied to electrons absorbing photons), so the resolution is low. However, all this does not contradict anything you have in the video; it is simply an alternative way to look at it. Keep up the great work.
I am once again stunned by having a phenomenon which I knew about and have experienced many times be explained to me at a higher level. And for once I don't feel left with more questions, because it presented a lot of answers. I join the other comments in praising this RU-vid channel and its content as outstanding, and wish you nothing but the best success. Do you have any recommendations for further reading or watching? Waiting for another video is my current approach. Perhaps I should do some application myself and figure out what the numbers are for the optical systems I am interested in. The 150mm f/1 lens I own is meant for 8-12μm wavelengths, and the detector I can place below it is a focal plane array at 640x480 pixels, which themselves are at a 25μm pitch, but not fully covering the area due to the topology of those pixels. From experimental evidence, I can see the moon a few pixels across and haven't noticed a single deep space object or even a planet that isn't the Earth. But my integration time is very much limited to about 1/60s. I do have a bunch of lenses and some cameras, but no microscope. However, I found myself observing those ripples by having a tiny bit of sunlight fall into my eye through my eyelashes. It does look similar to what your microscope observes, which showed me how wavelike everything I see actually is.
Took me back to my electromagnetism module in my physics degree. I agree with you that the James Webb telescope will be sensitive rather than high resolution vs Hubble.
Physics is funny.... It works exactly the same for radio and radar. We have something called synthetic aperture radar, or SAR for short. Not having a big enough antenna, we just have the antenna move along a line and collect data. It's used on aircraft and spacecraft. Amazing what kind of resolutions you can get... It's kind of the same thing as ganging together a bunch of radio telescopes like the VLA.
ty! People expect sharp, colorful and dramatic pictures, and NASA is willing to deliver for support and low criticism. But the point is, this is for scientific measurement and findings, not IMAGES! We need to catch low-energy, faint and distant emissions of light, and analyse the spectrum. ty
Webb is overly hyped and people expect to see great images; unfortunately this part will be partly disappointing. It will see through dust in the IR spectrum, but its angular resolution doesn't provide any prettier pictures than Hubble of extremely distant objects.
Can you do a quick video explaining the fringe patterns seen in the recent Webb image now that it is aligned? 1… rays coming out of the star 2… each ray has multiple interference lines 3… the blooming around the star 4… dim stars look like weird twisted daisy wheels.
A downside of aperture: due to irregularities in the Earth's atmosphere, even in the best areas for observing, that 40-inch telescope will probably reach its theoretical highest resolution only about 1 night out of 30, while the 8-inch scope will do so about 1 night in every 7.
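The point about seeing can be made quantitative with the same Rayleigh formula; the ~1 arcsecond figure for good-site seeing and the 550 nm wavelength below are typical assumed values, not measurements:

```python
import math

ARCSEC_PER_RAD = 180 / math.pi * 3600

def diffraction_limit_arcsec(aperture_mm, wavelength_nm=550):
    """Rayleigh diffraction limit of a circular aperture, in arcseconds."""
    return 1.22 * (wavelength_nm * 1e-9) / (aperture_mm * 1e-3) * ARCSEC_PER_RAD

# 8-inch (203 mm) and 40-inch (1016 mm) apertures at 550 nm:
for d in (203, 1016):
    print(f"{d} mm: {diffraction_limit_arcsec(d):.2f} arcsec")
# The 8" limit (~0.68") is close to typical ~1" seeing, so it often performs
# near its limit; the 40" limit (~0.14") is far below the seeing, so the big
# scope is atmosphere-limited on most nights.
```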
This is wrong. NASA said that one of the first things they want to do is re-create the Hubble Ultra Deep Field. You really think the first shots they put out are going to be some shitty blurry photos of distant galaxies? Hell no!
Probably just a coincidence, but the ripple on the laser passing through the (shrinking? something else changing?) aperture reminds me of Gibbs-effect rippling on band-limited square waves, the output being a square wave in the radius of a polar coordinate space.
That sounds disappointing. However, for objects less than a billion light years away, hopefully resolution of structural details will be good enough to reveal details of nebulae and stars that were previously invisible. At greater distance, it sounds like we will get a coloured mosaic without distinctive shapes.
Regarding the multisegmented JWST mirror... Is "the trick" they use the same as in astronomical interferometers, where several telescopes in an array can achieve the resolution of an aperture the size of the entire array? The fact that telescope arrays achieve this despite 99.9% of the "mirror" being missing seems even more like magic to me.
With interferometry you can do some crazy stuff... Like that time they pointed radio telescopes all around the world at a black hole, which resulted in an image as sharp as if the telescope were the size of the Earth itself... IIRC they dumped the raw data from those telescopes, and they had to transfer it so it could all be interpreted in one place.
Excellent and very well explained video. I cringe when I see those ads for a tiny telescope where the pictures in the advertisement show a nice high-resolution image of Saturn!! Very false advertising; it should be a criminal offense!! I always knew that "size matters"! Now I just want to sell my 8" (203mm) SCT and buy a 1,000mm one..... Oh, that's wayyyy too expensive :) There are also many people on various YT live streams commenting that they are expecting fantastically detailed images, far superior to the Hubble telescope. I expect they may be initially disappointed!
I think you are right. The images from Webb won't be as spectacular as Hubble's, which is how it is sold to the public. If what NASA gets out of Webb is spectral plots, there won't be much public interest in it.
Your videos are just a treasure to watch and learn from, as they are so well explained and with a lot of detail, yet so much "clarity" (yes, no pun here). Even at 72, with basic knowledge of physics from my college days, I am now able to recall and even understand so much more of the physics and the practical engineering of optics and optical devices. I wish I had teachers like you back when we were just told the formula or given some "hand waving" explanation of various physics topics, where even the profs did not have the insight nor the ability to communicate rather deep science principles. Oh well, the world is a lot better off now, with such wonderful people connected through the web that the IQ of this entire planet has gone up by several points.🤩🤩👌👌👌👌
The interferometric method that combines the submirrors mentioned at ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-gOpbXBppUEU.html deserves an explanation.
The presenter pans across a lot of physics, not just optics. The presentation is so well thought through that a lot of difficult concepts get explained so ordinary mortals can understand them. Max kudos! But should we be surprised at the Dutch, who currently (Q3/2023) have Robbert Dijkgraaf as Minister of Education? Dhr Dijkgraaf is an internationally known theoretical physicist and former director of the Institute for Advanced Study at Princeton (a role held by Robert Oppenheimer in the 1950s). So, brain the size of a planet; close enough. Education in good hands.
@@HuygensOptics Finally. If you mention it in future video descriptions and at the end of the video, this is going to get some traction, and will at least give you some budget to get specific equipment. Patreon community is also a great place to get feedback on videos before releasing them.
@@graealex Thanks for the advice. Given my intention to make more videos this year, I think it is definitely a good idea to generate some extra income from sources directly related to the videos rather than regular work.
I wonder what you could get by taking the incredible details you can see at the shorter wavelengths and analyzing the relative "fuzziness" at the longer wavelengths, combining them if you assume some model of how the sources of the light correlate their emissions at different wavelengths.
Great great great video! I didn't even know how curious I was about this stuff. Your voice is super relaxing and you're so informative, too. We received a little 80mm aperture hobby telescope for Christmas, and are already looking at others after seeing the moons of Jupiter.
Now teach everyone how the lenses in smartphone cameras cannot resolve more than 1 or 2 megapixels of image content, and how the tens of megapixels Apple (etc.) advertises are just a cartooning of the coarse image, called "sharpening" and "feature synthesis" in image processing.
The theoretical limit of resolution for a typically sized smartphone lens is about 20 megapixels. Practical considerations mean that is more like 10 in reality. Your numbers are very outdated. It's not 2012 anymore.
@@lobsterbark Nonsense. Diffraction limits are the weakest link and cannot be avoided. It has nothing to do with when things were made. Nobody's invented a way to defeat optical principles. This is such a huge lie on the part of the industry, and apparently you're completely fooled by it.
@@lobsterbark The image plane MTF integration of a typical smartphone camera lens is diffraction limited to about 1 MP of content. The spot size in microns for any lens is about equal to the f-number. Spot area divided into the camera sensor area gives the theoretical image content limit. Given the small sensors in smartphones, there's no way to get more than about 1 MP in the optics; the sensor MTF is irrelevant. What they're doing is sharpening and cartooning blurry images to make them look more resolved, but that is false, not real, resolution. For example, a smartphone can't resolve 4K video, but the image processing is tricked up to fake it.
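The numbers in this thread disagree, but the arithmetic itself is easy to reproduce. Every value below is an illustrative assumption (f/1.8 lens, λ = 550 nm, a roughly 1/2.5" sensor), not a measurement, and the result depends strongly on these choices and on the resolution criterion used:

```python
# Hedged sketch: how many diffraction-limited spots fit on a small sensor.
# All numbers are illustrative assumptions, not measured values.
f_number = 1.8
wavelength_um = 0.55
sensor_w_mm, sensor_h_mm = 5.6, 4.2   # roughly a 1/2.5-inch sensor

airy_diameter_um = 2.44 * f_number * wavelength_um  # Airy disk diameter, ~2.4 um
spot_area_um2 = 3.141592653589793 * (airy_diameter_um / 2) ** 2
sensor_area_um2 = sensor_w_mm * sensor_h_mm * 1e6

print(f"Airy diameter: {airy_diameter_um:.2f} um")
print(f"Diffraction-limited spots: {sensor_area_um2 / spot_area_um2 / 1e6:.1f} M")
```

With these assumptions the count lands in the single-digit-megapixel range, between the 1 MP and 20 MP figures argued above; the answer swings by an order of magnitude with the assumed f-number, sensor size, and whether you count Airy diameters or Nyquist-sampled radii.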
At 13:41 you write *rot B* = με ∂ *E* /∂t and call this Faraday's law of induction. You have it backwards. Faraday's law of induction is *rot E* = - ∂ *B* /∂ _t._ What you have is Maxwell's correction to Ampère's circuital law, but without Ampère's original term. Equivalently, it is the curl of the B field when the electric current is zero, as it is for the case of light.
You are right about this; others have pointed it out to me previously too. It is one of Maxwell's equations, but the reference to Faraday is incorrect. Sorry about that, I should have checked better.
Jeroen: Thank you for this video. It has let me add your astute observations on diffraction to my own patchwork of understanding. And I appreciate your sharing your extraordinary optical creations through your other videos. You are making an outstanding contribution to the world wide web!
Thank you for this video! Just yesterday I was watching a video on photolithography, and what do you know, the feature size they can "project" onto silicon is calculated by the formula "k * (lambda / Aperture)". Makes much more sense to me now.
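That lithography resolution formula (CD = k₁·λ/NA, with NA the numerical aperture) can be tried with illustrative numbers; the ArF-immersion values below are typical textbook figures, and k₁ = 0.4 is an assumed process-dependent factor:

```python
def critical_dimension_nm(k1, wavelength_nm, numerical_aperture):
    """Minimum printable feature size (critical dimension) in lithography."""
    return k1 * wavelength_nm / numerical_aperture

# Illustrative: ArF immersion litho, lambda = 193 nm, NA = 1.35, assumed k1 = 0.4.
print(f"{critical_dimension_nm(0.4, 193, 1.35):.1f} nm")  # ~57 nm
```

It is the same λ/aperture trade-off as in telescopes, just written with the process factor k₁ in place of the 1.22 of the Rayleigh criterion.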
4:02 The notion of using this for astrophotography is a miss; people just want a bigger aperture, for example a 10" SCT made of glass. Just make it bigger and we'll buy.
Eon Space Labs is doing great. Really impressed with their work in a very short time and with limited resources. Keep up the good work, Team Eon Space Labs (ESL).👏👏👏
I just finished binge watching all your videos, what fascinating content. I've long had a desire to grind my own telescope mirror, I wonder if you'd be interested in doing a series on how to get started. I'm sure you'd find many people interested in trying to follow along.
Actually, Gordon Waite has an excellent series of videos on the subject (older videos on his channel) . This is the first one: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-uM1scQXpJzE.html
Scott Manley did a video on Synthetic Aperture Radar (SAR), and I've been longing for an explanation on why the same method cannot be applied to optical telescopes to make a planet-wide aperture. Any experts out there willing to enlighten me?
I was wondering that too, and I hope that he will address this question in one of the next episodes. Currently there is only classic optical interferometry which unfortunately requires expensive high-precision optical elements. In radio astronomy, however, aperture synthesis is already relatively easy to implement. Now that even consumer products are equipped with phase-detecting optical sensors, will we see optical aperture synthesis at an affordable price in the near future?
It is actually about pixel size. For example, the Canon R5 has 45 Mpix at a 4.4x4.4 micron pixel size. Comparable or better resolution is theoretically possible for an f=800mm lens with a 95mm aperture.
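The comparison can be made concrete by setting the Airy disk diameter against the pixel pitch; the lens numbers are from the comment, while λ = 550 nm is an assumed visible-light wavelength:

```python
focal_mm, aperture_mm = 800, 95
pixel_um = 4.4                      # Canon R5 pixel pitch
wavelength_um = 0.55                # assumed visible-light wavelength

f_number = focal_mm / aperture_mm   # ~f/8.4
airy_um = 2.44 * f_number * wavelength_um  # Airy disk diameter at the sensor
print(f"f/{f_number:.1f}: Airy diameter ~{airy_um:.1f} um "
      f"vs {2 * pixel_um:.1f} um for two pixels (Nyquist)")
```

At f/8.4 the Airy disk (~11 μm) is somewhat larger than two pixels (8.8 μm), so with these assumptions the sensor slightly oversamples the diffraction limit; the lens aperture, not the pixel count, is the binding constraint.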