One time I was with a friend of mine who is a physicist, and we were watching a presentation about birds of prey, and the presenter said something about how much detail a hawk can see at a certain distance, and my friend did a little quick back of the envelope calculation, and he was like, nah, that's way beyond the diffraction limit. This is why I love being friends with a scientist.
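For anyone curious, here's roughly what that back-of-the-envelope check looks like in Python, assuming a hawk pupil of about 6 mm and green light at 550 nm (both numbers are my guesses, not from the story):

```python
import math

# Rayleigh diffraction limit: theta ~ 1.22 * lambda / D
wavelength = 550e-9       # green light, metres (assumed)
pupil_diameter = 6e-3     # hawk pupil, metres (assumed)

theta = 1.22 * wavelength / pupil_diameter            # radians
print(f"angular resolution: {math.degrees(theta) * 3600:.0f} arcsec")

# smallest feature a diffraction-limited eye could resolve at 1 km
distance = 1000.0
print(f"smallest detail at {distance:.0f} m: {theta * distance * 100:.1f} cm")
```

Anything claimed much finer than that at that distance is, as the physicist said, beyond the diffraction limit.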
Hey Scott,
I just wanted to point out that the minimum angular resolution has also been "cheated" when it comes to microscopy.
They use special fluorescent dies on the samples. For the actual image, only small amounts of the die are activated each time, so that statistically all the glowing points are far enough away from each other. The final image is then reconstructed by layering all the individual images and computer magic.
I just found the procedure on the other end of the scale to be quite similar.
Have a nice day and keep up the inspiring space videos.
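If anyone wants to see that statistical trick in miniature, here's a toy localization sketch in Python (all numbers invented; real PALM/STORM processing is far more sophisticated):

```python
import numpy as np

# One isolated fluorophore images as a diffraction-blurred spot, but if spots
# are sparse you can fit each spot's centre much more precisely than its width.
rng = np.random.default_rng(0)
true_x, true_y = 13.37, 21.42       # emitter position in pixel units (made up)
yy, xx = np.mgrid[0:40, 0:40]
sigma = 2.0                         # blur width ~ diffraction limit, in pixels
image = rng.poisson(1000 * np.exp(-((xx - true_x)**2 + (yy - true_y)**2)
                                  / (2 * sigma**2)))   # photon shot noise

total = image.sum()                 # centroid fit of the blurred spot
est_x = (image * xx).sum() / total
est_y = (image * yy).sum() / total
print(f"true ({true_x:.2f}, {true_y:.2f}), estimated ({est_x:.2f}, {est_y:.2f})")
```

Repeat that over thousands of frames, each with a different sparse subset of fluorophores switched on, and you can assemble an image far below the classical resolution limit.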
Martin Henríquez Wehr just to inform you, a chemical which imparts colour to an object is called a 'dye', not a die. Dyes and stains are used in microscopy to colour the subjects. A die (noun) can refer to the cubical or otherwise shaped toy that is marked with numbers/values on each face, but it can also refer to the silicon substrate, usually square shaped, on which computer chips are manufactured.
That's not *angular* resolution.
Thanks for the clarification, English is not my mother tongue.
Electron microscopes use electrons that are way smaller than photons. That's how you cheat resolution issues with light... don't use light!
I was talking about the wavelength-based resolution limit of light microscopy, which is fundamentally different from electron microscopy.
As far as I know, you can't give photons a size, a result of their particle/wave duality.
The event horizon project? This better not involve a spaceship that disappears under mysterious circumstances and reappears years later near Neptune
JediNg135 DO YOU SEE!!?
YES I SEE [beeep] [BOOOOOOOM]
That movie really scared me when I was young. Haven't watched it since. Should go back for some nostalgia...
Yes you should :)
adjust the audio volumes please ... the outro is too loud and violent
Also, Scott's voice was too low. I had to boost the audio to hear him. The clip from 0:30 is normal, by comparison.
Personally I found he was just super quiet for the entire video. It isn't that the outro was obnoxiously loud, it's that you needed to crank the volume to hear him.
Yeah it was very quiet even at max volume.
Check yo (sound)stagin'!
It amused me at some point to realise that the absurdly huge eyes of anime characters could actually have a practical benefit thanks to the optical limits of lenses.
Long story short, if the fovea and retina were suited to it, a being with eyes that size would have visual acuity many orders of magnitude better than a human.
... I guess that's something to consider if extensive cybernetics become a thing. XD
KuraIthys Alita Battle Angel?
Akshay Anand That did occur to me, though her eyes are still quite small in the scheme of things.
The downside of course is larger eyes are easier to injure.
Also if they have to be roughly spherical they can quickly get to a size where they wouldn't fit inside a person's skull anymore.
Eyes are not spherical. Long story short, the part containing the lenses that you are interested in enlarging protrudes from the main sphere, and you could in theory enlarge that without affecting the back of the eye much. Also, if the eye is cybernetic, it makes sense that it has some sort of increased protection too.
That'd be good for a dim planet.
Great stuff! (Keep an eye on your Audio levels! ) 🙂
The music at the end is very loud compared to the rest of the video.
Can we get an update on Planet 9? I've been reading a lot about it recently and it's been two and a half years since you've done a video on it.
In your explanation of Dawes' limit I think you forgot to point out that the main reason interference patterns are created is primarily diffraction by the optical system's aperture, which allows light rays from the object to slightly change direction and interfere around the object's primary image.
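That's easy to see numerically: the far-field image of a point source is the Fourier transform of the aperture, so a finite aperture necessarily smears a point into a diffraction pattern. A minimal 1-D sketch (arbitrary units, not tied to any real telescope):

```python
import numpy as np

def core_width(aperture_half_width, n=8192):
    """FWHM (in frequency bins) of the diffraction core of a 1-D slit."""
    x = np.linspace(-100, 100, n)
    aperture = (np.abs(x) < aperture_half_width).astype(float)
    intensity = np.abs(np.fft.fftshift(np.fft.fft(aperture)))**2
    intensity /= intensity.max()
    centre = n // 2
    return 2 * np.argmax(intensity[centre:] < 0.5)   # first drop below half max

print(core_width(5), core_width(10))   # doubling the aperture roughly halves the blur
```

Which is exactly why resolution scales with aperture diameter in the Dawes formula.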
Project Idea:
Now that everyone uses the internet for TV, reclaim those sat dishes, set up a webpage and some open source hardware (maybe some SDR platform could be enough), and create an open source, worldwide massive interferometer.
I'm not sure if people can build and maintain something as accurate as a usable interferometer.
There is a movie with Charlie Sheen as a radio astronomer you should see :-)
I'm not rich enough to own a satellite dish. But I can code the software.
I guess that could work.
There are already some plans out there to convert to I think 10GHz (from 11.something) for a simple radio telescope. The problem is that the receiver head already has a down-converter in it which kills the phase info. Handling phase info at even 10GHz is not for home gamers unless they are either expert radio-hams or similar. For example 1pF (a tiny capacitance) at 10GHz is an impedance of just 15 Ohms. The testing equipment for that sort of kit is for example a 20cm long ceramic stick with a 1mmx1mmx0.1mm copper plate on one end. You move it near to the circuit node where you think you have a problem and check the result (if it gets better you are under capacitance if worse then over or correct). It's an absolute git working at more than about 1GHz. You would also need a 40GHz network analyser (as you want kit capable of at least 3 x higher frequency than what you are testing) and you can of course hire one of those - for about £1500 a day. An E8363C (Agilent) is going second hand on fleabay for $45,000 and that is dirt cheap. To be really useful the kit would need to work at about 80GHz and that sort of frequency makes the above look like pennies.
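For anyone wanting to check that impedance figure, it's just Z = 1/(2πfC):

```python
import math

f = 10e9    # 10 GHz
C = 1e-12   # 1 pF
print(f"|Z| of 1 pF at 10 GHz: {1 / (2 * math.pi * f * C):.1f} ohms")  # ~15.9
```

Which is why every stray picofarad of board parasitics matters at those frequencies.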
There you go young nerds of the world, figure out how to digitize phase information at optical frequencies. I bet there's a Nobel Prize in it for ya.
Nice video overall. However, after all that time talking about interferometry, your final bit ignores the possibility of a constellation of small satellites being used to resolve higher detail pointed at the earth.
Didn't he say that the multiple-viewpoint technique doesn't work with optical telescopes, only radio telescopes?
Yeah, I thought of that too, and posted before I saw your post. Of course, that would force those satellites to measure radio waves instead of using optics, to preserve the phase of the wave (if I understood Scott's explanation correctly), and I am not sure what information would actually be visible in such pics.
There could be optical interferometers in space, in fact they have been proposed, but no evidence optical versions have been built... yet.
What would be the smallest wavelength that could still allow for interferometry? (Is that a word?) Could microwave spy satellites use that?
+Kai Andrea He specifically mentions the Very Large Telescope (VLT) in Chile which can gang the four optical+infrared telescopes in interferometry mode (VLTI) for higher angular resolution. Though, upon re-watching, he doesn't make it very clear that they are optical telescopes.
I wish I was as manly as Manley
And as Scottish as Scott?
And as Scottish as Scott?
Scott the Manly Scottish Manley Man
*manly
My dad works on the correlator at the MPIfR in Bonn, Germany, and VLBI was a great part of my childhood. You gotta love Effelsberg.
As someone who has worked on the largest radio interferometers in the world, I can say you're giving a pretty good explanation.
Mr Manley, you are my primary illicit sciencey picture dealer. Bravo!
That actually corresponds very well to monitors and TVs: the eye has a certain angular resolution, so a bigger screen that is further away can run the same resolution without looking pixelated.
I once read that if you placed a telescope out at something like Pluto's orbit and used the sun itself as a lens that bends light around it, you could actually get pictures of exoplanets, and by that I mean really good quality, 'can see continent outlines' levels of detail.
Was I lied to? Or could such a big 'lens' actually allow for that?
Telescope as large as the Earth is good and all but I want a telescope that uses the sun as a gravitational lens... go big or go home! :-)
Will be a long time before that ever happens.
Yeah, just getting that far out will take a looong time. Unless someone has a brainwave on new propulsion systems.
The sun's gravitational focal point is roughly 550 AU. The Voyager craft have been going for over 40 years and are still only about 120 AU out. So 550 AU is nearly 200 years travel time. But that would get you a fly-by of the focal point, the craft would need retro-propulsion to station itself and/or need thousands of years via a Hohmann transfer orbit.
And all that would buy you an objective lens just 1.4 million km across. It should be technically much easier to cobble together a space-based interferometer with a baseline 100 times greater.
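Using the comment's own figures, the travel-time estimate works out like this:

```python
# Voyager: ~120 AU covered in ~40 years, i.e. about 3 AU per year
focal_distance_au = 550
speed_au_per_year = 120 / 40
print(f"{focal_distance_au / speed_au_per_year:.0f} years")   # ~183 years
```

So "nearly 200 years" for a Voyager-class fly-by, before you even think about stopping there.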
correction: go even bigger or stay on earth (home).
The focal point isn't really a point, the focus of the region close to the sun is at about 550 AU but the focus from the area a little further from the surface of the sun is further out so a craft travelling out will see a different part of the "lens" going from 550 AU and outwards. A bigger problem is that you are stuck observing the same spot that you had to decide upon before launching. There is no way to re-point the sun :-)
Why go to the moon/mars? Cause we need to put a telescope there :P
I'd rather stay the heck away from dusty, mirror-bending gravity wells with my telescope. Space-based interferometer with millions of km base-line FTW.
Put em at the Earth/Sun L4/L5 points!
One day we'll have galaxy-spanning telescopes. :D
personzorz, the Moon is better, because it has gravity. You can build much cheaper cooling systems with the help of gravity, and shielding made of enormous amounts of dust and rocks is already in place (for instance, it's super hard to build a nuclear reactor in zero-g for this reason). The lack of seismic activity and the small orbital precession also help a lot. You don't have to cancel the noise of stabilization systems out of your sensitive experiment readings there. You can also build several-km-long vacuum chambers for linear accelerators or laser experiments, like gravitational wave detection. With no mantle, a lot more of the rare heavy elements are still close to the surface on the Moon than on Earth, plus you don't have any concerns about radioactive ore utilization. Moon colonization in these regards shouldn't be underestimated.
Nobody Quite You can build zenith telescopes on the moon
At 8:28, what is the installation at the South Pole?
I recall watching a SETI Talks YouTube video (but I'm having trouble finding it now) where the speaker said NASA was given a decommissioned spy-sat primary mirror, intended as a backup and never launched, on the condition that it would never be pointed at the Earth. It was 16 m in diameter if I recall correctly. The speaker's talk was about how to do nano-scale adjustments to the mirror's surface after it unfolded.
Nice episode. :) Waveguides are beginning to make a splash. Are they being used in telescopes instead of lenses?
Hi Scott, loved the video. However, everything you mentioned can actually be summed up as phased-array radar; there are plenty of those about, and that's how they work. One principle you haven't mentioned is synthetic-aperture radar (SAR), which is also used in astronomy; it takes that principle one step further by using one moving 'telescope' (say, a satellite) with a known path and velocity.
5:12 - So why not make smartphone camera lenses bigger?
You want a 20 mm thick phone in your pocket?
Aren't radio waves also particles? They are no different from light in any fundamental way.
But the detectors that detect radio waves use their wave-like nature, and so are able to measure their phase, while detectors that detect visible light use its particle-like nature, so they are not able to measure its phase.
Where is the video that you referenced at the beginning of the video?
Hey, Scott... How does the size of the telescopes affect the resolution when considering the distance between them? That is, how much better would two 14.14 meter telescopes (each having exactly twice the collecting area of a 10 meter telescope) 1000 miles apart be than two 10 meter telescopes 1000 miles apart? Or for that matter, how would two 14.14 meter telescopes 500 miles apart compare to two 10 meter telescopes 1000 miles apart?
In principle, the size of the telescopes affects how many photons/how much signal you collect, but it has no effect on the resolution. So the pair of 14.14 metre telescopes could get the same data as the two 10m telescopes, but could collect it in half the time.
In practice, I don't know if it works quite like this. I'd expect problems (beyond just needing long observation time) if you made the telescopes too small, but this is beyond my knowledge.
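The split between sensitivity and resolution is easy to put in numbers; a quick sketch, assuming green light and taking the baseline as what sets the fringe resolution:

```python
import math

def area(d):                        # collecting area of one dish of diameter d (m)
    return math.pi * (d / 2)**2

print(f"area ratio: {area(14.14) / area(10.0):.2f}")   # ~2.0 -> ~half the exposure time

wavelength = 550e-9                 # assumed optical wavelength
baseline = 1000 * 1609.34           # 1000 miles in metres
print(f"fringe resolution ~ {wavelength / baseline:.1e} rad, same for either pair")
```

And halving the baseline to 500 miles would halve the resolution, regardless of dish size.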
Soooo ... if you have about a dozen satellites, each a few hundred kilometers away from any other...
Does that not help with the resolution? That is, if they all look at the same newspaper headline.
No. As he said in the video, we don't currently have the technology to directly measure the phase of visible light, which would be necessary for that kind of interferometry. We can only do that with radio waves for now. That's why the VLTI facility in Chile had to use those very precise light pipes to manually combine the light from each of the four telescopes.
Now, if we develop that technology in the future, then, yes, it's theoretically possible to do that. But, you're still going to run into problems with atmospheric distortion and other such issues, so I don't think you'll ever be able to read a newspaper from orbit.
Atmospheric distortions can be cancelled out in the same way. I mean, if you want just a picture, you can take a bit of video and get rid of the atmospheric distortion using software, right?
Anyway, that technology would be useful for a lot of reasons, but I guess the wavelength is just way too short for any sort of "antenna array" for visible light ... for now.
To everyone asking about fleets of satellites, I think the other problem you would run into would be motion blur. Satellites move blisteringly fast relative to the surface. (But not geostationary satellites. (I don't think there are any geostationary spy satellites.))
The images at 5:50-6:03, if real, are the most amazing astro pics I've seen in a long, long time, exoplanets orbiting their star? This graphically illustrates the advantage/need of high resolution using VLBI.
Scott, I don't think the limit is because of interference. The phases of photons are random. The limit is due to the physical "width" of a photon. Radio waves have "widths" that are larger than buildings.
Sigh, I always regret that we didn't get to learn as much about light in my engineering physics class, because we were in the end-of-semester rush.
Why isn't there a term in that formula for the time of observation? Surely, after enough time, you will gather enough statistics to improve the resolution of your image?
OK, just to spice it up: couldn't spy satellites use the same technique discussed in the video? Yes, they would have to use radio waves instead of optics, and I am not totally sure what information could or couldn't be retrieved from such shots, but still?
You can't see people with radio waves, because the waves are bigger than the people you want to observe.
Hmm, Super High Frequency waves have wavelengths down to one centimeter; microwaves go even smaller.
Microwaves begin around λ 1m.
They could find a mountain. Maybe.
TheAllegiantTraitor .....and go down to 1 mm at 300 GHz. (I have no fancy greek letters on my keyboard, though)
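The conversion in this thread is just λ = c/f:

```python
c = 299_792_458.0   # speed of light, m/s
for f_hz in (300e6, 30e9, 300e9):
    print(f"{f_hz / 1e9:g} GHz -> {c / f_hz * 100:.2f} cm")
# 0.3 GHz -> ~100 cm (where microwaves begin), 30 GHz -> ~1 cm, 300 GHz -> ~0.1 cm (1 mm)
```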
So could you build a series of satellites and put them in orbit around the sun to get an aperture as big as their orbits? Provided you work out the engineering challenges of collating all the data?
7:17 are you saying that radio waves are NOT also photons?
Re the newspaper thingy, what about SAR? A LEO sat would certainly be moving. It would have to be a newspaper either flat on the ground with no wind or in the hands of a very static reader, though.
The microwave frequencies that we use for imaging the Earth from satellites range from about 1 GHz (25 cm wavelength) to 10 GHz (3 cm wavelength). It has been possible for many years to build detectors to digitize the full waveform of this data including the phase, to enable interferometry, by down-converting the signal to remove the carrier frequency. This is possible because it is an active system and we know what signal we have sent out of the radar antenna. Recording the full phase allows us to use radar pulses received at many different locations as the satellite moves along the orbit and form a synthetic aperture that is many tens of km long, to get the spatial resolution down to a few meters or even a fraction of a meter. The data sampling is up to around 300 MHz these days for radar satellites. As Scott explained, for optical wavelengths it is not possible to digitize the signals and maintain the phase.
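As a rough sanity check of those numbers (assumed geometry, not any particular satellite), the azimuth resolution of a synthetic aperture of length L is about λ/(2L) in angle:

```python
wavelength = 0.03            # 3 cm, i.e. 10 GHz, per the comment
synthetic_aperture = 10_000  # 10 km along-track, metres
slant_range = 700_000        # assumed ~700 km slant range for a LEO radar sat

theta = wavelength / (2 * synthetic_aperture)
print(f"ground resolution ~ {theta * slant_range:.2f} m")   # ~1 m
```

Consistent with the metre-class resolutions quoted above.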
Not sure if I understood this properly. Can you get around this with a curved sensor instead of a flat sensor?
Can metamaterials get around this at all?
What about gravitational telescopes ?
What is the best resolution possible with the size of space telescopes we have up now, 5 cm? What if you used two or more with interferometry, with one acting as a combiner with a variable light path?
Another complicating factor is that all the telescopes are moving relative to each other with the Earth's rotation, orbital motion, and galactic orbital motion so the optical paths will also have to be compensated for doppler shift to keep the waves in sync - it just doesn't seem possible to adjust for relative motions of hundreds or thousands of metres per second, and keep the waves at less than a tenth of the wavelength!
Just mindbogglingly awesome if they can overcome this!
Love this video, man. I haven't seen you talk about superconducting CCDs before, but I'd love to hear your thoughts on them. It's super sexy tech.
Ben Mazin, from what I can tell, has pretty much been the guy developing them for several years now. The DARKNESS instrument is apparently up and running at Palomar now.
Any idea whether using plasmonics could allow a lens system to beat the Dawes limit, through reducing the wavelength?
Scott, do you have any good info on the problems with the ABI on GOES-17 and the progress on trying to fix it (or if that's even going to be possible)? I know you've done a vid on GOES before and you may well have mentioned this, but if you know any more than I do (that the temperature regulation for the ABI isn't working, so the majority of the IR bands don't function correctly for a good portion of the day when the sun is shining on the instrument from around the planet) then I'd love to see a vid on that.
We already have one; it's called Earth. We just need to work on the lens a bit. Not necessarily in _all_ locations, just enough that we can figure out what's going on.
I choose to interpret this as a polish Mexico joke
10:06 - What if they had a space telescope with a segmented mirror that it kept folded up most of the time and only unfolded to its tens-of-meters size when it actually came time to spy on someone's newspaper? (Not claiming that that's actually the case, just playing devil's advocate.)
You forget that visual processing allows more detail beyond the theoretical limits of a purely physical telescope; you have averaging and advanced algorithms on your side.
Hey @Scott Manley, could optical pipes between satellites achieve the needed resolution, or would they be horribly misaligned? Or, for example, between different points on the ISS? For either ground or stellar observation?
Hey Scott! Related question: would the naked eye be able to pick up a Resurgent-class Star Destroyer from orbit as depicted in the opening scene of Star Wars: The Last Jedi?
It'd be a little smaller than our moon.
Scott Manley Thanks! :)
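A quick check of that reply, assuming a Resurgent-class length of about 2.9 km (fan-reference figure) and an ISS-like 400 km orbit:

```python
import math

length = 2900.0        # metres (assumed)
altitude = 400_000.0   # metres (assumed)
arcmin = math.degrees(math.atan2(length, altitude)) * 60
print(f"angular size: {arcmin:.0f} arcmin")   # ~25 arcmin
```

The Moon spans ~31 arcmin and the naked eye resolves ~1 arcmin, so: clearly visible, and indeed a little smaller than the Moon.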
Satellites tend to be long and narrow to fit in fairings. Any chance a satellite could be designed to fly on its side with multiple small cameras and do the light-pipe trick to tie them all together to get an effective lens size of the broadside of the satellite?
This was a great video; I really learned something in a field where I thought I was relatively mature. Thank you very much for this great explanation. :) Subscribed
But what if my phone has two 40 MP sensors spaced 2 cm apart, do I get better detail?
lolskigaming, if it's accurate enough, you could have a larger effective aperture and gain detail, but only in one dimension. You would need to add another one off the line that the two original cameras make.
But since phones are mass produced and don't cost as much as a telescope, I highly doubt that they will get the precision right ^^
No; the angular resolution comes from the combination of light waves at a common focal point between the lenses. Having separate 40 MP sensors will only ever give you two pictures, each with the same limit of angular resolution (with some stereoscopic effects). I am interested, though, in the 'one dimension' aspect Otto Knabe puts forward; this makes intuitive sense for an interferometry system, but I'd love to see someone work through some examples.
I have a science question I would like to see a video on. With regard to parallax, I understand that trigonometry mathematically limits the distance we can measure, because the tiny angles involved limit our ability to triangulate; everything beyond that is an estimate. Is this scientifically sound, and what is the max distance we could accurately measure before it becomes an estimation of distance?
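For what it's worth, the standard relation is distance (parsecs) = 1/parallax (arcseconds), and the practical cutoff is where the measurement error rivals the parallax itself. A sketch with an assumed, Gaia-like precision of ~20 microarcseconds:

```python
precision = 20e-6   # assumed astrometric precision, arcseconds

for snr in (10, 2):     # require parallax >= snr * precision
    max_pc = 1.0 / (snr * precision)
    print(f"SNR {snr}: trustworthy parallax distances out to ~{max_pc:,.0f} pc")
```

Beyond that, distances come from standard candles and models, which is exactly the "estimates" regime the question describes.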
To get a truly huge aperture for a spy satellite: how about making a gigantic deployable mirror in space from inflated mylar (the pressure doesn't need to be high in space), or even electrostatically "inflated"? It could be hundreds of meters in diameter.
To get the concave shape you'd have it like a balloon, with one hemisphere transparent and the other hemisphere metallised on the inside. So the light comes in through the transparent side, reflects off the reflective inside, then you'd have the detector suspended inside or something. Though I'm not sure the unprotected mylar would survive long in the harsh environment of space.
It may be better to do the *electrostatic* thing rather than pressure, because then you can have the primary metallised mirror surface facing directly outwards towards your target without needing a transparent plastic window in the way, and ALL of the exposed mylar can be metallised and protected from the harshness of space. You'd make it by having a ring frame (perhaps inflated) around the outside of a flat circular mylar "canvas", with something suspended behind the canvas electrostatically attracting the mylar towards it, producing a parabola.
I highly doubt that the inflated "balloon" approach would give good enough accuracy, especially at tens of meters diameter. If we look at the James Webb telescope, the max displacement of the individual mirrors is only 75 nm; you would have a hard time trying to form a foil to that precision. Also, they make the mirrors out of beryllium because of its low thermal expansion; mylar (boPET) has, by comparison, more than 100x the thermal expansion...
Ah the EHT... can't wait for that perfect Xeelee selfie.
Hey Scott you finally turned down the volume.
4:20 "The wavelength of radio waves is longer than that of visible light"?? I don't think so.
upload.wikimedia.org/wikipedia/commons/thumb/c/cf/EM_Spectrum_Properties_edit.svg/330px-EM_Spectrum_Properties_edit.svg.png
Gerard O'Neill had a concept for a series of giant orbiting lenses that, when aligned precisely, could form a "telescope" thousands of kilometers long for extremely high resolution images.
Is there any way this kind of technology could be used in a phone with current-size sensors and lenses, but two of them at opposite corners of the phone, to get extremely high quality images?
Why can't we combine data from optical telescopes around the world? Then the resolution would be super sharp, right? (Direct imaging of exoplanets?)
Huangwei Xie because we don’t have the technology to determine visible light’s phase angle.
Love the content, was trying to explain the event horizon telescope to a friend a few days ago, so right on time... But turn down the outro music. Good Lord, it's coming thru speakers, and the levels between voice and music end up blasting my face off
Problem was, after fixing the resolution issues, my voice ended up too quiet.
I wonder, if we do this with optical telescopes, would we be able to, say, view Pluto with a high resolution, say better than Hubble?
If it's possible to detect the phase and intensity of radio waves, why can't the same be done for the visual spectrum? Is it only possible to detect the phase of EM radiation for certain frequency ranges, and if so is this a hard limit or something that can be improved upon in the future?
Or am I missing something simple...
Great vid, thank you!
I've a sci-fi question, to whoever ends up reading this comment.
In a sci-fi universe, I had the idea of having some group observe a days- or weeks-old event, say a space battle or space traffic, in the real-time present by placing a space telescope some light-distance away from it and recording it as it happens. How feasible is this?
If I understood you correctly, my answer is: you can't transfer information faster than light travels.
After making a point about ‘Math’ being ‘Mathematics’ (quite right, too!) early into the video, you then screwed up slightly later by using the word Math! Which makes no difference at all to my enjoyment of this very interesting video - a hearty ‘well done’ sir, and I hope you can forgive my nitpicking..... lol
VLBI is pretty amazing. I would love it if one of the science oriented UA-cam channels gave an even more detailed explanation than you did of how it works. For example, those interference patterns are combined and undergo some mathematical process to create the resulting image. It would be nice to see an animation that attempted to explain how the math worked visually.
I know I'm asking a lot. And I don't expect _you_ to do this necessarily. But it'd be nice if someone did. :-)
I suppose the fact that phase is important means that even the images showing the interference patterns aren't enough information to give a good explanation of what's going on. :-(
BTW, I think this technology is also related to how the Lytro Light Field camera works.
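For a flavour of the math, here's a toy 1-D aperture synthesis in Python. Each baseline samples one Fourier component of the sky brightness (a "visibility"); imaging means inverting from a sparse set of samples. (Real VLBI adds a second dimension, calibration, and deconvolution such as CLEAN, so treat this as a cartoon.)

```python
import numpy as np

n = 256
sky = np.zeros(n)
sky[100], sky[130] = 1.0, 0.8            # two point sources

visibilities = np.fft.fft(sky)           # what a fully sampled array would measure

mask = np.zeros(n)                       # a sparse array samples only some baselines
sampled = np.random.default_rng(1).choice(n, size=80, replace=False)
mask[sampled] = 1.0

dirty = np.real(np.fft.ifft(visibilities * mask))   # the "dirty image"
print("response at the true sources:", dirty[100].round(3), dirty[130].round(3))
print("worst sidelobe elsewhere:", np.abs(np.delete(dirty, [100, 130])).max().round(3))
```

The sources should still stand out, but surrounded by sidelobes from the missing baselines; the clever part of VLBI imaging is deconvolving those away.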
Now I'm wondering how much it would cost to build two radio telescopes and launch them, one each, to the Earth-Moon L4 and L5 points.
With a telescope the size of the earth we should be able to look really close to a black hole and see the earth millions of years ago
The World (sized telescope) is not enough.
So would it be feasible to build telescopes "larger" than the earth with space based telescopes?
audio is weak on this video
We need to send telescopes to different Lagrange Points around the sun to create super large interferometers!
Noooo... The launch escape system on your Saturn V is missing.
But can we build an optical telescope like this, the size of Earth? It should have way better resolution than a radio telescope, due to the smaller wavelength. I tried to do the math (I hope I did it right), and if we built such a virtual telescope we could obtain an image of Proxima b of ~6018x6018 pixels, or an image of Trappist-1 e of 575x575 pixels. A planet the size of Earth (~12,000 km diameter), located at a distance of 10,000 ly, would be seen as 3x3 pixels, about the limit of the telescope.
Pay attention, he said that in the video. This is not possible, because one can't retrieve the phase information at optical wavelengths.
Arne Schwarz Ah darn it, thanks
Well, I mean... we COULD, if the signal was never converted to an electrical signal before it was combined, using 6,000+ km optical cables, but I guess that's sorta beside the point, since it's not practical. It's also possible that the walls of the cable would absorb all the photons from the signal before it reached the end.
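Setting aside the phase problem, the arithmetic in the original comment checks out in order of magnitude; re-running it with slightly different assumptions (green light, Proxima b at 4.24 ly with a diameter of ~1.1 Earths):

```python
wavelength = 550e-9     # metres, assumed
baseline = 1.2742e7     # Earth's diameter, metres
ly = 9.4607e15          # metres per light year

theta = 1.22 * wavelength / baseline            # ~5.3e-14 rad

def pixels_across(planet_diameter_m, distance_ly):
    return planet_diameter_m / (theta * distance_ly * ly)

print(f"Proxima b: ~{pixels_across(1.1 * 1.2742e7, 4.24):.0f} px")   # same ballpark as ~6000
print(f"Earth-size planet at 10,000 ly: ~{pixels_across(1.2742e7, 1e4):.1f} px")  # a few px
```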
If we sent enough telescopes in orbit around the sun, could we use the same math to derive a telescope with an aperture the size of the solar system?
For the first 2:37 Scott doesn't blink
I watched some of your videos and I must say that you make them like a news story... It's like: "there were some scientists and they discovered something they called the atom; you don't need to know what it is, how it works, how they found it, or what it's made of, you just need to know that it happened, they found it, and it exists"... I would like you to explain more about the stuff you are talking about, at least the basics of the principles behind it... Otherwise, great content and great videos; keep up the good work and best wishes...
WRT spy imaging: Don't forget time as another dimension and reverse convolution. It's not always about pure optics, but it may be about very well characterized optics.
This is when physics come in handy
Isaac Arthur made a great video on the topic of Mega Telescopes; it's here on YT. Everyone subscribed to Scott should also take a big interest in that channel.
awesome video
Have they announced when we'll see the Event Horizon Telescope's picture of Sagittarius A*??? I can't wait to see if the artist impressions we are used to are close to reality.
Today
Can't we use the Voyager probes or something else in space to make a telescope as large as the solar system? Why haven't we already done so?
Finally, it's an interferometer!
3:39 - No, he did the _maths._ Because he's British.
"Legolas, what do your elf eyes see?"
Similar problem, and outside that limit if I understand correctly.
Magic doesn't have to obey the laws of physics. Nor do shield-surfing elves.
Saw the title of the video and for a second thought this was an Isaac Arthur video.
So the real solution is to build a really, really, really big one in space. If/when the BFR comes online and is actually 100% reusable, it could probably get one up there pretty cheaply, in sections.
Not trying to validate any wacky conspiracy theories, just thinking out loud...
Can software be used to compensate for atmospheric interference instead of warping the mirror? Could you then use the same algorithm to compensate for an incorrectly/randomly warped mirror instead of atmospheric interference? If so, what would stop a nation state from deploying a spy satellite with a Mylar mirror which unfolded over the target, took some footage, and then folded back up, relying on the algorithm to filter out/compensate for the irregularities inherent in a foldable Mylar mirror?
Yes, that is already done. It's generally called deconvolution.
How much time did it take for you to read that extra thicc Linux book on your shelf?
So much science, I love it
What about taking images six months apart and get effectively a 186,000,000 mile wide telescope?
Which channel is claiming that satellites can read newspapers? I am genuinely curious.
There were conspiracy theorists long before UA-cam - I remember people telling me this back in the 90’s
I remember these claims too, but I could never find them in written or video form.
Why can't they create a CCD as big as a telescope's primary mirror???
can we build one as big as the solar system?
The follow-up question should be... can you design a set of spy satellites so that they have a combined aperture big enough to read a newspaper from space? XD
Can we use satellites orbiting earth for this?
Prakash Kamath Oh yes we can
Prakash Kamath It's even better in space, because we don't have restrictions like the size of our planet or atmospheric effects. We can put a few telescopes in orbit around the Sun, very far away from it, and obtain a virtual telescope the size of... the entire Solar System... or more. Though it would be pointed at a specific target, and it would be difficult to make the virtual telescope look in another direction, because you'd need to change the orbit of the entire array of satellites to look at a different patch of the sky. This surely can still be done 👌
Thanks
Wonder if a bigger telescope would have made a difference in finding the extra moons around Jupiter?