If Iconic Space Photos Had Been Taken With a Smartphone Camera

  • Published 28 Dec 2024

COMMENTS • 711

  • @horacefairview5349 · 6 years ago +1303

    "Hey! Don't take that, it's not scheduled."
    "hehe"
    *Takes masterpiece*

    • @_Leouch · 6 years ago +40

      It is still a miracle that astronauts are so disciplined :D

    • @e1123581321345589144 · 4 years ago +61

      @@_Leouch that's usually exaggerated. Those guys were notorious for smuggling stuff on board the spacecraft

    • @_Leouch · 4 years ago +15

      @@e1123581321345589144 yeah, a miracle that they ONLY took random stuff on board (and staged one short rebellion)

    • @kennylex · 4 years ago +18

      Omg, they were space rebels, they took an unscheduled photo.

    • @_Leouch · 4 years ago +3

      @@kennylex HOW DARE YOU?! - said NASA (master) chef to rebel army

  • @stevemoore12 · 6 years ago +583

    I love hearing the childlike excitement of the astronauts. The sight alone made them giddy.

    • @rankcolour8780 · 6 years ago +22

      I know that I personally would be absolutely gushing with excitement lol.

    • @JandCanO · 6 years ago +28

      "Calm down, Lovell"

    • @davecrupel2817 · 6 years ago +4

      I wouldn't know *how* to react... I'd be at a loss for words.

    • @gonzaroh · 6 years ago +3

      Yeah, I know I would freeze. I remember the first time I saw a live feed of the Earth from the ISS. It was a live talk with the crew, and when they pointed the camera at the Earth through a window I was motionless and got goosebumps. I can't imagine what I would feel if I were the one watching it directly.

    • @KanishQQuotes · 6 years ago +3

      It is a very humbling experience.
      As a child I saw a video of a space shuttle launch on TV, with a camera attached to the shuttle looking down as it left Earth and reached space.
      I felt puny and helpless,
      and all the struggles of politics, religion, and greed seemed insignificant.

  • @xmlthegreat · 6 years ago +629

    "Calm down, Lovell!" I love that Jim Lovell got so excited about the view.

    • @SynchronizorVideos · 6 years ago +9

      Reminded me very much of me and my brother when we road-tripped to the path of totality for the last eclipse.

    • @Arkasai · 6 years ago +20

      The photographer in me got all tickled at that last line from Lovell, I imagine him peering through the sight of a light meter of some kind saying "250 at f/11."

    • @jaimeduncan6167 · 6 years ago +15

      Arkasai the earth is even brighter than the moon. People don’t realize how much light you have and overexpose the moon all the time. Not this guy :)

    • @TheNervousnation · 6 years ago +1

      how could one not!?

    • @nightrous3026 · 6 years ago +1

      I have a book about Apollo 8. It's good. It's called Genesis: The Story of Apollo 8.

  • @Mythricia1988 · 6 years ago +515

    Somebody asked about the difference between a modern phone camera sensor, and the types of sensors used on space probes, ignoring things like optics. So I wrote a brief reply. Only 8500 characters. Yeah. I write a lot. I figured it was a good little article I wrote so I'll paste it here, for others to read.
    -----
    Question:
    Great video. This brings up another question: Ignoring the focal length and the quality of the optics. How does the quality of the sensor compare to various space probe cameras? And further, many probe cameras use color filters to catch colors in different spectra. The scientific value of a photo from a consumer sensor would probably be lower, but how much of the colors in planetary photos are due to post processing? I know that pictures of stars and nebulas are frequently highly modified, but how about planets?

    Answer:
    Camera sensors are a science to understand, in many respects. There are always tradeoffs. There's no such thing as "the best one". But, in the sense of astrophotography, I think mobile cameras would fare poorly. They have very high resolution, many megapixels (16MPix and upwards, some over 20MP), compared to most astro cameras that are often not even 1MPix, or at most a couple MPix. High res, color CCD cameras are extremely expensive, I paid over $2000 for a camera with "only" an 8MPix color sensor, and it's literally just a cylinder with a camera sensor inside it, so it's useless for normal photography - talk about specialized equipment. So in that sense, a mobile camera would seem to be superior, right? But it's *much* more complicated than that. I'll try to elaborate briefly.
    The first major difference is pixel size. Sometimes called pixel pitch. This is the literal width / height of a single pixel on the sensor - and it's usually measured in µm (thousandths of a mm). In mobile phones, the entire sensor is tiny! Yet it has so many pixels.... Well, number of pixels divided by size, math says, the pixels must be very small! On a modern phone, the pixel pitch is usually somewhere between 1 and 2 µm. And this is the length of one side of the pixel - so the AREA of the pixel is, well, width * height. Almost 100% of sensors use square pixels, so you usually only see one number quoted as the pixel pitch.
    Well, an astronomy oriented CCD, usually very similar if not identical to the types of cameras spacecraft use, has MUCH larger pixels. My "high res" CCD has a pixel pitch of 3.69µm. The area of every pixel is over 13 times bigger than most phone cameras! And that's actually a very low number compared to monochrome lower resolution CCD's - they have pixel pitches in the tens of µm. A popular size is 16µm for some of the really fat Kodak CCD's. That's enormous!
    So what's the point? Well, each pixel is like a bucket. And the bucket gathers light. Small bucket.... Can't hold a lot of light. Big bucket? You can gather a lot of light, and more importantly, it's easier to measure accurately. Trying to measure a thousandth of a tablespoon is hard - trying to measure 1 tablespoon is pretty easy, even if you are wrong, the result is not skewed by much. And that's why astronomy or science / engineering related CCD's are not only lower res than your phone, they're much larger too - the sensors are often comparable in physical size to what's used in professional DSLR cameras (i.e. they're huge). Because the huge size allows them to have a useable resolution, while keeping those massive huge light-gathering pixels. The problem of low resolution in scientific CCD cameras is easily overcome, so it's not really a big problem anyway; people insist on thinking of a picture as 1 single exposure, even though there's no reason it must be. Instead of just taking 1 picture, you can take 2, or 10, or 100, or a thousand - because things in space rarely move very fast, and if they do, you probably have a bigger problem. You can take thousands of pictures of a galaxy, over the span of years - and it doesn't matter, as long as you align them correctly. Because it hasn't moved, or changed, in any way, at all, ever, and never will, within our human lifetime.
    The actual benefit of big pixels at the end of the day, are things like light sensitivity due to less wasted space on the sensor chip. In a mega compact sensor like one used in a phone, a very large percent of the area is actually occupied by stuff *other than* the actual pixel elements. Things like signal pathways and other silicon junk takes up a very large percent of the area. But on a large CCD, where everything is much bigger... The space wasted on signal lines and other "junk" is much smaller.
    This leads to an improvement in something called quantum efficiency. Sounds fancy, but it's a simple concept - it's the relationship of how many photons you detect, versus how many photons actually struck the sensor. If you have a ratio of 1 (100%) then it means you are 100% likely to actually detect every single photon that hit the sensor. A mobile sensor only has a QE of like ~15-30%. That's incredibly poor. If you throw 1000 photons at a phone camera, only 150-300 of them will be detected. Now let's compare that to an astrophotography CCD.... The QE of my color CCD is well over 50%, with even higher peaks. The monochrome version is even better.. It's almost 80%!! And this is a "high res" camera, if you start looking at slightly lower, and more common, resolutions - you start seeing peaks over 90%. It's an insane difference. It means that, if you expose both sensors for the same amount of time - the monochrome, high QE sensor, will capture over twice as much light. That's an extremely significant benefit. It cannot be overstated. Almost nothing else matters when it comes to capturing information and reducing noise (false information).
    I think this also explains why, in digital cameras, the idea of "ISO" is complete bogus nonsense. You can't change the sensitivity of the sensor. It either captures the photon and converts it into an electron, or it doesn't. There's no changing it. "ISO" is just a post-processor that increases the brightness of the image. You can just do it in photoshop instead and get better results.... As you'll see on any scientific CCD camera, there literally isn't an option for "sensitivity" or "ISO" - because it's bullshit and doesn't exist :)
    The second major difference is sensor type. Almost all phones, and consumer cameras in general (including professional DSLR), use CMOS sensors. They are very fast, very cheap, and can be made very small, and are easily made in extremely high resolution - and they perform very well in good light conditions. It's obvious to see why they are used in consumer electronics, because, well, we humans are daylight creatures, we like to see, and thus take pictures of, things in bright daylight. They perform horribly poorly in low light however... As anyone with a phone can personally attest to, right? They are very noisy, which doesn't matter in most pictures, because most pictures are literally pictures of "noise" and colors... But it matters a lot in astronomy, where a lot of what you are trying to capture, are handfuls of photons on a completely black background. CCD's are great in this sense - a high end monochrome, actively cooled CCD can count literal, single, photons, coming from a point in space. That's pretty cool, right?
    Another little detail, and a fun fact;
    There's no such thing as a color sensor. No matter if it's CCD or CMOS. Doesn't exist. It's all a trick. A "color" sensor is simply a monochrome sensor, with millions of little red, green and blue filters glued onto the surface. When you take a picture, the processor in the camera takes the red, green and blue filter pixels and combines them into a color image. And yes, this does mean that the "true" resolution of your color CCD is actually much lower - because any single pixel is only photographing 1 color. They combine nicely and give you a reasonable result - but it's gonna be a "processed" result no matter how you do it. The math and the numbers used to combine the different colors into one single image are arbitrarily decided, by a person, at some point. It's just as false or just as correct, as any other camera.
    This is why most scientific cameras are monochrome, and use filter wheels to automatically capture one shot per filter (or, in reality, hundreds of shots), and then combine these full-frame color sub-frames into one master frame that contains all the colors. As you can imagine, this produces a better result. It's a much more controlled process. Or, you can use no filters at all, and get maximum sensitivity to light. The downside is you won't have any clue what the colors are, but that often doesn't matter. You can also use fancier filters, that select only a single wavelength to capture - as opposed to just "generic red" which is actually not a specific wavelength, rather it's an arbitrarily decided spectrum of wavelengths. But, since CCD's are so sensitive, even to non-visible light such as infrared and ultraviolet - you can use fancy filters to capture "invisible" light, and then simply decide to use that information and present it in a "visible" way in the final photograph. That's why pictures of nebulas and such are so fancy looking - because they take wavelengths that would be invisible to the human eye, and assign them to a visible color.
    ------
    So there you go. A short summary of digital image sensors. It's a science. I understand only a little bit of it, enough to know what I'm doing in my amateur astrophotography.

    • @Veptis · 6 years ago +10

      Mythricia that's some effort. But I wanted to question your comment on the Bayer matrix. Does this mean a 20MP sensor in a stills camera is using 5 million pixels for red and 5 million for blue, while the remaining 10 million are green? I had always thought it was 20 million pixels, each with 3 values for RGB, so basically 80 million captures. Is that why the monochrome Leica has twice the resolution, for example? Because a display pixel has sub-pixels to it.
      I got a stills camera from Sony with a 1" sensor that picks up quite some light with its built-in lens in my suburban area. We are 4 days away from the brightest day so I can't do any widefield anyway, as it's just too bright at night.
      I also got a digital cinema camera with a slightly smaller sensor but low resolution and an actual pixel pitch of 6.5µm, so it should catch more light. But it does not allow long exposures and is therefore unusable without a wide scope for planetary.
      My phone also has one of those tiny sensors where the pixels must be almost as tiny as the wavelengths of light. The secondary sensor is an FPA from FLIR. It's equivalent to a Lepton 2 with 17µm pitch and a tiny resolution of 80x60 (2 dead so far), and it captures light in the far infrared of ~7-14µm. The newest version of their FPA technology uses pixel combinations to capture wavelengths of 13.5µm with a 12µm pixel pitch.
      I am yet to find a good processing option to output raw data that can be used for science and astrophotography. But the spectral response of the sensor is known and could be divided out for a broad result. I am unsure about doing long exposures with this sensor as well, so I might not ever be able to take an astrophoto with it. If you got any ideas, share them.

    • @kodiak2fitty · 6 years ago +8

      It's 5/5/10 for RGB (check out lagemaat.blogspot.com/2007/09/actual-resolution-of-bayer-sensors-you.html) Bayer CCD/CMOS technology is yet another source of "lying" by manufacturers (no different than oversized old CRT monitors and Gigabytes vs Gibibytes on a hard drive). There is a lot of magic in the processor to get your final image. When you collect RAW, you can choose to use a different algorithm than the chip built into the camera. That is partly why apps like DxO PhotoLab can make such an improvement to photos if your camera saves RAW sensor data.

    • @Mythricia1988 · 6 years ago +20

      Hey. So, it's actually kind of a complex issue with Bayer matrices and what you count as "pixels". While it's true that for a DISPLAY, one "pixel" (picture element) is a grouping of 3 subpixels (or more, the ratio is not 1:3 for all displays), and on a display, this means that effectively, your display actually has 3 times as many active elements as the resolution would suggest... Buuuut, for sensors, the convention is to count every element as one pixel, and in fact, each pixel is actually square, not a group of shapes like on a display. And furthermore, in the picture output by a consumer color camera, the number of pixels is actually equal to the number of active picture elements on the sensor. So it's a 1:1 relationship.
      Clearly, there has to be some kind of magic happening, right? How can you use every pixel in the output, when each pixel only captures 1 color?
      Well, once again, it's kind of a trick. The image processor interpolates every pixel's final color, based on all its neighbor pixels. So, every pixel ends up being a combination of itself + neighbors, based on weights. There are several ways to do this, each with its pros and cons, ranging from the very simple and very fast nearest-neighbor interpolation (the same algorithm sometimes used for resizing images) to much better algorithms such as Adaptive Homogeneity-Directed (AHD) interpolation, which is used in very modern DSLRs. But they all have artifacts, no matter what you do, the reconstructed image usually looks good in uniform-colored areas, but loses resolution (detail) and shows artifacts around edges and high contrast areas. And that's where the "magic" shows its colors, so to speak. You actually *do* lose resolution in many areas.
      Interestingly, the algorithms are usually reversible - this is sometimes used by astrophotographers using DSLR's to "extract" the three primary color sub-frames from RAW images, so that they can handle them separately and apply their own processing pipeline.

    • @Alexagrigorieff · 6 years ago +3

      With smaller pixels, things go south fast. You don't get enough well capacity for that. If it's 5000 electrons, the best theoretical S/N you can get from it is sqrt(5000)=71. You cannot get good color separation. With micron-sized pixels, microlenses (to gather the light to the center of the pixel) don't quite work. Electrons are bleeding to the neighboring wells, which reduces the color separation further. Without good color separation in the array, you have to dial up (to the negatives) off-diagonal members of your color conversion matrix, which amplifies noise quite a lot.
      But when you have a ~10 micron sized pixel, such as in high end cinema cameras with 5K medium format array, with a few hundred thousand electrons capacity, it's marvelous.
      By the way, silicon sensors are pretty bad with UV. Not sure how far they work, considering that even visible blue is already tough on them.

    • @lawrencedoliveiro9104 · 6 years ago +3

      Sampling theory comes into it as well: the ideal digital pixel is a point with zero dimensions*. Which is physically impossible. Because real-world sensors have a nonzero size, this has the effect of lowering the contrast of fine details close to the Nyquist limit.
      The cure for this is the “unsharp mask” filter, well-known to all users of photomanipulation software.
      *That’s right, pixels are not “little squares”.
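The pixel-size, quantum-efficiency and well-capacity arguments in the thread above can be sanity-checked with a few lines of arithmetic. A minimal sketch using the figures quoted in the comments (3.69 µm vs ~1 µm pixels, ~15-30% vs ~80% QE, 5,000 vs ~300,000 electron wells); these are illustrative values from the discussion, not sensor datasheet numbers:

```python
import math

# Pixel area: a 3.69 um astro-CCD pixel vs a ~1 um phone pixel.
ccd_pitch_um, phone_pitch_um = 3.69, 1.0
area_ratio = (ccd_pitch_um / phone_pitch_um) ** 2
print(f"pixel area ratio: {area_ratio:.1f}x")          # ~13.6x, as stated above

# Quantum efficiency: photons detected out of 1000 incident.
photons = 1000
print(f"phone (~15-30% QE): {int(photons * 0.15)}-{int(photons * 0.30)} detected")
print(f"mono CCD (~80% QE): {int(photons * 0.80)} detected")

# Shot-noise limit: the best possible S/N is sqrt(full-well capacity).
for well_e in (5_000, 300_000):   # tiny phone pixel vs ~10 um cinema pixel
    print(f"well {well_e:>7} e-: max S/N ~ {math.sqrt(well_e):.0f}")
```

The last loop reproduces the estimate @Alexagrigorieff quotes: sqrt(5000) ≈ 71.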

  • @ahaveland · 6 years ago +127

    The reenactment of the Earthrise pic was just amazing - thanks!

  • @BartJBols · 6 years ago +283

    CALM DOWN LOVELL FFS

  • @ObsidianAnt · 6 years ago +287

    Amazing video Scott! The ability to explore the solar system from home really is awesome. :)
    Would you consider making a video on how those stunning nebula images are made?

    • @kellog9876 · 6 years ago +6

      Hey it's ObsidianAnt! Love your videos too

    • @iaago1520 · 6 years ago +3

      ObsidianAnt normal cameras have an exposure time of like 1/8000th of a second; you can see objects that give off or reflect enough light, but nebulas are far-away objects and not much light reaches us. Nebulas are big, though, and the same goes for galaxies: if a relatively big nebula were giving off enough light it would appear as big as the moon in the night sky. You just need to have a small telescope and a camera with a 10 minute exposure time to capture galaxies or nebulas

    • @RFC3514 · 6 years ago +8

      > you just need to have a small telescope and a camera with a 10 minute exposure time
      Clearly, you have never tried to photograph a nebula (or any other faint celestial object), and didn't even bother to do basic research before posting here.
      "Normal cameras" have variable exposure time. You wouldn't use 1/8000th of a second unless you were taking a photo under strong sunlight with a big aperture and / or high sensitivity. For an indoor photo taken with normal lighting (no projectors, no flash), 1/10th of a second (or even up to a few seconds - though that's no good for moving objects) is pretty common. Anything longer than 30 seconds generally relies on bulb or T mode (i.e., an external timer).
      But shutter speed is the _easy_ part.
      The issue with astral photography isn't just that you need longer exposures, it's that Earth rotates _a lot_ during those 10 minutes (or even a single minute - enough to turn stars into streaks and anything bigger into a smudge).
      So, you need a motorised equatorial mount (or computer-controlled altazimuth mount, etc.) that compensates for the rotation of the Earth in real time, keeping the objects you want to photograph still within the frame. To get high detail, the tracking doesn't just need to be very _accurate,_ it also needs to be very _smooth_ (stepper motors with larger steps move irregularly).
      Also, "those stunning nebula images" typically consist of multiple photos taken at different wavelengths (some of them outside our visible spectrum), converted into different colours, and composited together. Some of those require special filters that cost as much as a camera, and others can't be taken with a "normal" camera at all.
      There is a *lot* more to it than "you just need to have a small telescope and a camera with a 10 minute exposure time".

    • @iaago1520 · 6 years ago +1

      RFC3514 I was just going for a short explanation, and thank you for aggressively telling me some facts about taking a picture of a nebula

    • @RFC3514 · 6 years ago +2

      What you wrote wasn't "an explanation" at all, short or otherwise. It was a patronising dismissal of ObsidianAnt's suggestion.
      You literally wrote that, to get stunning nebula images like the ones published by NASA and ESA, "you just need to have a small telescope and a camera with a 10 minute exposure time".
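RFC3514's point about the Earth's rotation is easy to quantify. A quick sketch: the sky drifts at the sidereal rate of roughly 15 arcseconds per second, so during an untracked 10-minute exposure a star near the celestial equator smears across degrees of sky. The 2 arcsec/pixel plate scale below is an assumed, typical small-telescope value, not something from the thread:

```python
# Drift of a star near the celestial equator during an untracked exposure.
SIDEREAL_RATE = 15.04   # arcsec of sky rotation per second (360 deg / 23h56m)

exposure_s = 10 * 60    # the "10 minute exposure" discussed above
plate_scale = 2.0       # arcsec per pixel (assumed typical small-scope value)

drift_arcsec = SIDEREAL_RATE * exposure_s
print(f"drift: {drift_arcsec:.0f} arcsec ({drift_arcsec / 3600:.1f} degrees)")
print(f"streak length: {drift_arcsec / plate_scale:.0f} pixels")
```

A star trailing thousands of pixels is why a tracking mount (or very many short stacked exposures) is non-negotiable for deep-sky work.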

  • @d2factotum · 6 years ago +164

    Wouldn't taking a picture of Pluto with an iPhone camera also run into problems with the very low light levels that far from the sun? I've never found smartphone cameras to be great in dim conditions!

    • @jan.tichavsky · 6 years ago +9

      The sunlit face still gets enough light and smartphone lenses are fast, so it's not a problem. Interesting engineering has been done to take pictures with long lenses/telescopes from very fast-moving spacecraft at varying levels of illumination; some sensors capture only a single row of pixels (or a couple, with different filters) and the image is scanned by the motion of the craft itself, similar to how a flatbed scanner works.

    • @scottmanley · 6 years ago +97

      It’s equivalent to taking pictures about 20 minutes after sunset

    • @brittneypaul2089 · 6 years ago +10

      Most cell phones use CMOS image sensors instead of a CCD to save on cost. CCD is better in low-light applications because you can essentially capture individual photons.
      Was disappointed trying to capture the Aurora Borealis one night with my cell phone. (just got a black square)

    • @ΑΡΗΣΚΟΡΝΑΡΑΚΗΣ · 6 years ago +3

      Brittney Paul I tried taking a night shot with my Xperia M:
      Dimensions: 2592x1944
      f-stop: f/2.8
      Exposure: 1/7s
      Flash: no
      Focal length: 3.49mm
      ISO: 2520

    • @MrtinVarela · 6 years ago +21

      +Brittney Paul
      "only CCD captures individual photons" You don't know what you're talking about. CCD and CMOS are both based on the exact same photoelectric effect.
      A CCD transmits analog data to an external amplifier and digital converter, while a CMOS sensor does the whole process within each individual cell. This usually introduces rolling shutter and reduces light-gathering efficiency but solves photo-charge bleeding, reduces power consumption, improves data rate AND reduces production costs.
      The optical difference between them is that a CCD can capture more light because there are no dead spots between pixels, unlike in CMOS where the amplifying electronics sit alongside the individual cells.
      Your conclusion is correct but the explanation is not.
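The EXIF numbers @ΑΡΗΣΚΟΡΝΑΡΑΚΗΣ posted above can be folded into a single exposure value, which makes scene brightness easy to compare across cameras. A sketch using the standard EV definition normalized to ISO 100 (the scene interpretation in the comment is my own reading):

```python
import math

def ev100(f_number, shutter_s, iso):
    """Exposure value normalized to ISO 100: log2(N^2 / t) - log2(ISO / 100)."""
    return math.log2(f_number ** 2 / shutter_s) - math.log2(iso / 100)

# The Xperia M night shot from the thread: f/2.8, 1/7 s, ISO 2520.
ev = ev100(2.8, 1 / 7, 2520)
print(f"EV100 ~ {ev:.1f}")   # ~1.1, i.e. a genuinely dark scene
```

EV around 1 is dim night-scene territory, which is exactly the regime where a tiny phone sensor with low QE returns a black square, as described above.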

  • @bencushwa8902 · 4 years ago +1

    I do photography work part-time. I get asked all the time why people should bother hiring photographers anymore given how good cell phone cameras are today.
    You just explained it better than I ever have. Well done sir, and thanks for all of your lovely videos.

  • @jessgarcia4100 · 6 years ago

    Hearing the Apollo 8 crew getting hyped over photographing the Earth has probably made my week. As a photographer this gives me a lot of joy. I've watched so many videos on Apollo and this is the first time I've heard this exchange. Thank you for this.

  • @backlog2389 · 3 years ago +1

    The "everyone else" photo goes down as my favorite photo of all time. I don't know that it will ever happen again. Literally every human who has lived or died is in that photo except for one person. Pretty iconic in my book.

  • @darkstride7081 · 5 years ago +1

    I think bringing standard cameras along would give an interesting comparison between the beautiful close-up shots of the planets and other bodies to the vast distances of space. It really puts into perspective just how far apart everything is.

  • @gordonrichardson2972 · 6 years ago +68

    Action photographers maxim: Get closer to the subject!
    In space that's not so easy...

    • @Cragified · 6 years ago +7

      Not that hard really. Surviving the encounter and coming home though... quite challenging!

    • @nicejungle · 6 years ago

      I agree :D
      "If Your Pictures Aren’t Good Enough, You’re Not Close Enough" - Robert Capa

    • @PixlRainbow · 6 years ago +3

      Cragified space probes are wireless cameras that you throw at subjects. Only the camera needs to get closer to the subject, the cameraman doesn't have to.

    • @gordonrichardson2972 · 6 years ago +1

      If you only have one camera, and you have multiple subjects, that could be awkward and/or expensive...

    • @Martinit0 · 6 years ago

      Although in this particular case they had to step waaaay back to get all the subjects into the frame.

  • @Hans-jc1ju · 6 years ago +25

    Concerning the darkness you mentioned at the end: Pluto is pretty fracking far from the sun. How dark would it have looked in an iPhone camera? Would we really see anything at all at that point?

    • @scottmanley · 6 years ago +26

      The sun is about 2000 times fainter, which means it’s like taking pictures of people just after sunset

    • @Alexagrigorieff · 6 years ago

      If Pluto is at ~30 AU, isn't that 900-1000 times fainter?

    • @karenrobertsdottir4101 · 6 years ago +4

      @@Alexagrigorieff Pluto is more reflective than the moon (albedo of 0.49 to 0.66, vs. around 0.12 for the moon). The moon is basalt, and anyone who's ever seen basalt knows that it's a dark grey (almost black) rock. Pluto is exotic ices (and water ice) covered unevenly in tholins.

    • @hobbified · 4 years ago

      @@Alexagrigorieff yes, although Pluto varies between 30 and 50 AU, so that'd be a range of 900 to 2500 times dimmer. So 2000 is reasonably representative for a long-term average.
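The 900x and 2500x figures in this thread follow directly from the inverse-square law; a one-liner check:

```python
# Sunlight falls off with the square of distance from the Sun.
# Pluto's eccentric orbit spans roughly 30 to 50 AU (Earth = 1 AU).
for dist_au in (30, 50):
    print(f"{dist_au} AU: sunlight is {dist_au ** 2}x fainter than at Earth")
# 900x at perihelion, 2500x at aphelion -- so anything in that range is
# plausible depending on where Pluto is in its orbit at the time.
```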

  • @thecapacitor1395 · 6 years ago +30

    If I ever get the chance to go into space in my lifetime, now I know the exposure settings for taking a picture of Earthrise. 1/250 at ƒ11 :D
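For what it's worth, 1/250 at f/11 works out to a full-daylight exposure, which fits the earlier remark that the sunlit Earth is brighter than people expect. A sketch assuming ISO 100 purely for the arithmetic (the actual film speed isn't stated in the comments):

```python
import math

# Exposure value implied by the Earthrise settings quoted above: 1/250 s at f/11.
f_number, shutter_s = 11, 1 / 250
ev = math.log2(f_number ** 2 / shutter_s)   # EV at ISO 100, by definition
print(f"EV ~ {ev:.1f}")
# ~14.9 -- essentially the "sunny 16" daylight level (EV 15), so the sunlit
# Earth was metered much like a daytime landscape.
```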

  • @seditt5146 · 3 years ago +1

    If they were taken with modern cameras we would probably still be waiting for all the data to come in half a century later, and it would get interrupted constantly by Google using up all the phone's RAM

  • @Flying_Basset · 6 years ago +29

    They carried 80 and 250 mm lenses for the Hasselblads, but you have to remember that medium format cameras have much bigger frame size than 35mm, so 80 = 43 and 250 = 135 on a 35mm camera. 43 is even slightly wider than a "normal" 50, and 135 is a common portrait lens. They both represent reality very well.

    • @scottmanley · 6 years ago +8

      Yeah, the good news is you can calculate the angular resolution and convert.

    • @johnfrancisdoe1563 · 6 years ago +1

      Andrej Đeneš My usual 24x36 has a 43mm instead of a 50mm.

    • @MarvinCZ · 6 years ago

      43mm is still considered normal. It's not just exactly 50mm

    • @ASJC27 · 6 years ago +1

      43mm is actually closer to a true normal lens than a 50 would be (on 35mm film or a FF DSLR). That is because a normal lens is defined as one that has a focal length equal to the diagonal of the sensor (43mm diagonal for 35mm film). Another similar definition is that a normal lens is one that provides a diagonal angle of view of 1 radian. That translates to a 40mm lens for 35mm film.

    • @Flying_Basset · 6 years ago

      Yes, I'm aware of the diagonal "rule" (I think of it more as a guideline), but 50mm has been _the_ normal lens for a century and I guess people got used to the look (send your complaints to Oskar Barnack). In fact, even longer focal lengths were considered normal - 55 was common in Japan and 58 in Soviet Union. I've never heard of that 1 radian rule, but 40mm definitely looks like a mild-wide angle to me (that just might be me being weird - I also prefer 85 in 6x6 instead of "true normal" 80).
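The 80 → 43 and 250 → 135 conversions quoted at the top of this thread can be reproduced from the frame diagonals. A sketch using the nominal 56 x 56 mm 6x6 frame against 24 x 36 mm film (real Hasselblad frames are a hair under 56 mm, which accounts for the small rounding differences):

```python
# 35mm-equivalent focal length = focal length * ratio of frame diagonals.
def diagonal(w_mm, h_mm):
    return (w_mm ** 2 + h_mm ** 2) ** 0.5

crop = diagonal(36, 24) / diagonal(56, 56)   # ~0.55 for 6x6 -> 35mm film
for focal_mm in (80, 250):
    print(f"{focal_mm} mm on 6x6 ~ {focal_mm * crop:.0f} mm on 35mm")
# ~44 mm and ~137 mm: within rounding of the 43 and 135 quoted above.
```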

  • @nitePhyyre · 6 years ago +1

    I love the dismissive little chuckle Anders lets out when Borman tells him not to take the picture.

  • @alexlandherr · 6 years ago +21

    And now you just gave Marques Brownlee a new video topic.

  • @coolbluelights · 4 years ago +1

    This video could quite possibly be the one to put the flat earth movement to rest for good.

  • @justins21482 · 4 years ago

    Anyone else love the little jingle at the end of most of Scott's videos? As silly as it sounds that might be my fav thing about his channel. It seems to sum up the feeling you get from these discussions in one simple little jingle.

  • @doxielain2231 · 6 years ago +1

    "A tiny small disk of light suspended in a giant black background" is a pretty accurate description of our human experience in the universe

  • @burpleson · 6 years ago +2

    Thanks, Scott. I'd never heard that exchange among the Apollo 8 astronauts. It adds a very interesting background to their iconic photo.

  • @Veptis · 6 years ago +1

    There is an interactive website where you can watch the Apollo 17 mission in real time, with all kinds of live coverage: the TV broadcasts, mission control and their comms, all transcribed, plus all the images taken, shown in real time with maps and such. It's great.

  • @TalismancerM · 5 years ago +8

    Me: "How many drops is this for you Scott?"
    Scott Manley: "One hundred...simulated" 10:04

  • @jacobneesen7869 · 6 years ago +170

    But what if they captured images with an andriod?

    • @edstirling
      @edstirling 6 років тому +1

      yeah! nobody buys an iphone anymore. what about a ZTE phone that trump wants us all to buy? does that do space pictures?

    • @philip1201
      @philip1201 6 років тому +17

      They would get pretty interesting Data.

    • @bomxacalaka2033
      @bomxacalaka2033 6 років тому

      edstirling 😂😂😂😂

    • @ahaveland
      @ahaveland 6 років тому +4

      From an "andriod" they'd get phoots... :-)

    • @Daniel115XD
      @Daniel115XD 6 років тому +2

      with a pixel 2 they could take awesome pictures

  • @natgrant1364
    @natgrant1364 6 років тому

    When I was a kid, you really couldn't just snapshot pictures of stars or even the Moon. At one point, my sister had a camera that could have its shutter held open for long exposures. She took a picture of the Moon with it and because she couldn't follow the moon properly, it turned out very blurry, though it did capture the light just fine.
    It's amazing what even typical phones and tablets and crap can actually do now.

  • @myownidenity4955
    @myownidenity4955 4 роки тому

You can tell those guys were old school Air Force fighter pilots taking in the moment. Truly the greatest generation.

  • @FlaminSquirrel
    @FlaminSquirrel 6 років тому +3

    I have no idea how you find the time to produce all these videos. Great stuff; thanks.

  • @MythicalPhoebe
    @MythicalPhoebe 6 років тому +2

i would kill to see the earth at a distance. until it's only slightly bigger than the moon in the sky. perhaps i'm a bit obsessed, but every time i get a chance to see a clear night sky, i always look up at the moon. to think that such a small spot in our sky is actually a miniature world of its own. to think that people have been there. it's always such a humbling thought. i always wanted to sit on the surface of the moon and do the same looking back at earth.

  • @beyondsingularity
    @beyondsingularity 6 років тому +1

    Oh man. Earthrise. What a magnificent picture. Thanks for the context!

  • @Sinnistering
    @Sinnistering 2 роки тому

    I don't know why, but seeing the size of Earth made it suddenly hit me just how far away the Apollo astronauts were. I had to pause the video and just go "holy shit." Completely awestruck.

  • @OspreyKnight
    @OspreyKnight 6 років тому +1

    So, when I do lunar photography my settings follow the cloudy 8 rule.
    So, cloudy eight is a derivative of the sunny 16 rule in photography, which states:
    On a sunny day set your aperture to 16, and match your shutter speed and ISO.
    f16 100 ISO 100/1 second
    This will give you a "correct" exposure for a sunny day.
    For a cloudy day you follow the same rules, except open your aperture to f8.
The moon is reflected sunlight, much like the light bouncing into the shadow of a building. So, the moon is really bright compared to the landscape you're photographing, unless you catch it just after or before sunset.

    • @bthemedia
      @bthemedia 6 років тому

      OspreyKnight yeah, that makes sense - though @11:40 the video note of ISO 100 and shutter 1/6000 makes no sense... what aperture? Assume 1/100~1/125 for standard shutter speed, ISO 100 gives 1/125 @ f/8 or 1/60 @ f11. For 1/6000 that would be ~f/1.2!

    • @OspreyKnight
      @OspreyKnight 6 років тому

      bwvids yeah he kinda left out the aperture. I think he underexposed the moon. It should come out as a nice grey white with darker splotches.
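
The sunny-16 bookkeeping in this thread can be sketched in a few lines of Python (illustrative only; the ISO and f-numbers are the commenters' figures, not NASA's):

```python
def sunny16_shutter(iso, f_number):
    """Shutter time (s) for a sunlit scene per the sunny-16 rule:
    at f/16, shutter ~= 1/ISO; other apertures scale exposure by
    the square of the f-number ratio."""
    base = 1.0 / iso                      # shutter at f/16
    light_ratio = (16.0 / f_number) ** 2  # extra light a wider aperture admits
    return base / light_ratio

# At ISO 100: 1/100 s at f/16, and roughly 1/210 s at f/11 -- close to the
# 1/250 f/11 "sun" setting on the Apollo film-back sticker quoted above.
```

The cloudy-8 variant isn't a different formula: the scene is about two stops dimmer, so you keep the 1/ISO shutter and open up from f/16 to f/8 to compensate.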

  • @johnnusbaum6149
    @johnnusbaum6149 6 років тому +42

Love taking pictures of Jupiter (and its moons) with my phone pointed through my telescope lens :)

    • @tomvorat4173
      @tomvorat4173 6 років тому

      John Nusbaum that is possible?

    • @faxinspace
      @faxinspace 6 років тому +10

      John Nusbaum me too
      Unfortunately, my best picture is a smear of Jupiter across the screen, but you can see identically smeared points of light around it.
      I was so proud of myself for actually taking a picture of Jupiter.
      Next time, it will be better

    • @joeriff7719
      @joeriff7719 6 років тому +2

      Celestron and a few other companies make smartphone adapters to take photos through your telescope eyepiece. I don't have one, but I've seen a few decent photos taken with them, mostly of the Moon. Jupiter would be tricky.

    • @faxinspace
      @faxinspace 6 років тому

      I may have to go buy one of those

    • @druze3210
      @druze3210 6 років тому

      Jupiter looks great through the telescope, doesn’t it? (Frrantastic)

  • @JosephHarner
    @JosephHarner 6 років тому

    Field of view is one thing worth mentioning, but the lens on a camera can change this dramatically. You can actually buy specialty phone cases that add lenses to a smartphone, and this leads to a very interesting new situation where you are comparing just the sensors in these cameras, not the lenses and magnification through which the image is being focused.
    Modern phone cameras are *amazing*. It really can't be stated enough how far digital cameras have come in the last two decades. With a more stable mounting and an added lens to magnify and focus the image, you could take a truly spectacular picture using a modern phone camera sensor.

  • @ExploringCabinsandMines
    @ExploringCabinsandMines 4 роки тому +1

when a satellite is traveling through space and it makes an attitude adjustment, say 180°, how does it stop at precisely that angle and not keep slowly turning, however slightly??

  • @DJRonnieG
    @DJRonnieG 4 роки тому +2

My coworker keeps saying some of these images "are obviously fake." I used a similar program to prove that the Deep Space Climate Observatory Moon-Earth photo was indeed real.

  • @EspenLodden
    @EspenLodden 6 років тому +7

    Great video. Any avid hobby astronomist finding him/herself in outer space would of course bring some tele attachments compensating for the long distances. This brings up another question: Ignoring the focal length and the quality of the optics. How does the quality of the sensor compare to various space probe cameras? And further, many probe cameras use color filters to catch colors in different spectra. The scientific value of a photo from a consumer sensor would probably be lower, but how much of the colors in planetary photos are due to post processing? I know that pictures of stars and nebulas are frequently highly modified, but how about planets?

    • @johnfrancisdoe1563
      @johnfrancisdoe1563 6 років тому +2

      Harambe's Ghost When you say southern sky, are you referring to the view from your ancestral home near Kilimanjaro, or to your last known residence in that zoo?

    • @K1lostream
      @K1lostream 6 років тому

      Astronomist!

    • @EspenLodden
      @EspenLodden 6 років тому +1

Thanks! I have noticed the red color of Mars myself using a pair of binoculars. Too bad the summers here up north are so bright that we can't see much till way past midnight

    • @Mythricia1988
      @Mythricia1988 6 років тому +1

Yeah, the struggles of a northern amateur astrophotographer... The summers are bright (can barely see anything at all, you can completely forget nebula images during the summer), and the winters are covered in 1.5 meters of snow. I kinda regret investing so much in the hobby considering where I live >.< Should move somewhere close to the equator for a few years and get my money's worth.......
      Edit, to actually answer your questions (be prepared for a wall of text, but you asked, and I shall answer):
      Camera sensors are a science to understand, in many respects. There are always tradeoffs. There's no such thing as "the best one". But, in the sense of astrophotography, I think mobile cameras would fare poorly. They have very high resolution, many megapixels (16MPix and upwards, some over 20MP), compared to most astro cameras that are often not even 1MPix, or at most a couple MPix. High res, color CCD cameras are extremely expensive, I paid over 2000$ for a camera with "only" an 8MPix color sensor, and it's literally just a cylinder with a camera sensor inside it, so it's useless for normal photography - talk about specialized equipment. So in that sense, a mobile camera would seem to be superior, right? But it's *much* more complicated than that. I'll try to elaborate briefly.
The first major difference is pixel size. Sometimes called pixel pitch. This is the literal width / height of a single pixel on the sensor - and it's usually measured in µm (thousandths of a mm). In mobile phones, the entire sensor is tiny! Yet it has so many pixels.... Well, number of pixels divided by size, math says, the pixels must be very small! On a modern phone, the pixel pitch is usually somewhere between 1 and 2 µm. And this is the length of one side of the pixel - so the AREA of the pixel, is .. well, width * length. Almost 100% of sensors use square pixels, so you usually only see one number quoted as the pixel pitch.
      Well, an astronomy oriented CCD, usually very similar if not identical to the types of cameras spacecraft use - have MUCH larger pixels. My "high res" CCD has a pixel pitch of 3.69µm . The area of every pixel is over 13 times bigger than most phone cameras! And that's actually a very low number compared to monochrome lower resolution CCD's - they have pixel pitches in the tens of µ-meters. A popular size is 16µm for some of the really fat Kodak CCD's. That's enormous!
So what's the point? Well, each pixel is like a bucket. And the bucket gathers light. Small bucket.... Can't hold a lot of light. Big bucket? You can gather a lot of light, and more importantly, it's easier to measure accurately. Trying to measure a thousandth of a table spoon is hard - trying to measure 1 tablespoon is pretty easy, even if you are wrong, the result is not skewed by much. And that's why astronomy or science / engineering related CCD's are not only lower res than your phone, they're much larger too - the sensors are often comparable in physical size to what's used in professional DSLR cameras (i.e. they're huge). Because the huge size allows them to have a useable resolution, while keeping those massive huge light-gathering pixels. The problem of low resolution in scientific CCD cameras is easily overcome, so it's not really a big problem anyway, even though people insist on just thinking of a picture as 1 single exposure, even though there are no reasons why that must be the case. Instead of just taking 1 picture, you can take 2, or 10, or 100, or a thousand - because things in space rarely move very fast, and if they do, you probably have a bigger problem. You can take thousands of pictures of a galaxy, over the span of years - and it doesn't matter, as long as you align them correctly. Because it hasn't moved, or changed, in any way, at all, ever, and never will, within our human lifetime.
      The actual benefit of big pixels at the end of the day, are things like light sensitivity due to less wasted space on the sensor chip. In a mega compact sensor like one used in a phone, a very large percent of the area is actually occupied by stuff *other than* the actual pixel elements. Things like signal pathways and other silicon junk takes up a very large percent of the area. But on a large CCD, where everything is much bigger... The space wasted on signal lines and other "junk" is much smaller.
This leads to an improvement in something called quantum efficiency. Sounds fancy, but it's a simple concept - it's the relationship of how many photons you detect, versus how many photons actually struck the sensor. If you have a ratio of 1 (100%) then it means you are 100% likely to actually detect every single photon that hit the sensor. A mobile sensor only has a QE of like ~15-30%. That's incredibly poor. If you throw 1000 photons at a phone camera, only 150-300 of them will be detected. Now let's compare that to an astrophotography CCD.... The QE of my color CCD is well over 50%, with even higher peaks. The monochrome version is even better.. It's almost 80%!! And this is a "high res" camera, if you start looking at slightly lower, and more common, resolutions - you start seeing peaks over 90%. It's an insane difference. It means that, if you expose both sensors for the same amount of time - the monochrome, high QE sensor, will capture over twice as much light. That's an extremely significant benefit. It cannot be overstated. Almost nothing else matters when it comes to capturing information and reducing noise (false information).
      I think this also explains why, in digital cameras, the idea of "ISO" is complete bogus nonsense. You can't change the sensitivity of the sensor. It either captures the photon and converts it into an electron, or it doesn't. There's no changing it. "ISO" is just a post-processor that increases the brightness of the image. You can just do it in photoshop instead and get better results.... As you'll see on any scientific CCD camera, there literally isn't an option for "sensitivity" or "ISO" - because it's bullshit and doesn't exist :)
      The second major difference is sensor type. Almost all phones, and consumer cameras in general (including professional DSLR), use CMOS sensors. They are very fast, very cheap, and can be made very small, and are easily made in extremely high resolution - and they perform very well in good light conditions. It's obvious to see why they are used in consumer electronics, because, well, we humans are daylight creatures, we like to see, and thus take pictures of, things in bright daylight. They perform horribly poorly in low light however... As anyone with a phone can personally attest to, right? They are very noisy, which doesn't matter in most pictures, because most pictures are literally pictures of "noise" and colors... But it matters a lot in astronomy, where a lot of what you are trying to capture, are handfuls of photons on a completely black background. CCD's are great in this sense - a high end monochrome, actively cooled CCD can count literal, single, photons, coming from a point in space. That's pretty cool, right?
      Another little detail, and a fun fact;
      There's no such thing as a color sensor. No matter if it's CCD or CMOS. Doesn't exist. It's all a trick. A "color" sensor is simply a monochrome sensor, with millions of little red, green and blue filters glued onto the surface. When you take a picture, the processor in the camera take the red, green and blue filter pixels and combines them into a color image. And yes, this does mean that the "true" resolution of your color CCD is actually much lower - because any single pixel is only photographing 1 color. They combine nicely and give you a reasonable result - but it's gonna be a "processed" result no matter how you do it.
      This is why most scientific cameras are monochrome, and use filter wheels to automatically capture one shot per filter (or, in reality, hundreds of shots), and then combine these full-frame color sub-frames into one master frame that contains all the colors. As you can imagine, this produces a better result. It's a much more controlled process. Or, you can use no filters at all, and get maximum sensitivity to light. The downside is you won't have any clue what the colors are, but that often doesn't matter. You can also use fancier filters, that select only a single wavelength to capture - as opposed to just "generic red" which is actually not a specific wavelength, rather it's an arbitrarily decided spectrum of wavelengths. But, since CCD's are so sensitive, even to non-visible light such as infrared and ultraviolet - you can use fancy filters to capture "invisible" light, and then simply decide to use that information and present it in a "visible" way in the final photograph. That's why pictures of nebulas and such are so fancy looking - because they take wavelengths that would be invisible to the human eye, and assign them to a visible color.
      ------
      So there you go. A short summary of digital image sensors. It's a science. I understand only a little bit of it, enough to know what I'm doing in my amateur astrophotography.
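
The pixel-pitch and quantum-efficiency argument above reduces to a back-of-the-envelope calculation; the pitch and QE numbers below are the commenter's example figures, not measured values:

```python
def photons_detected(flux_per_um2, pixel_pitch_um, qe):
    """Photons one square pixel actually registers:
    incident flux * pixel area * quantum efficiency."""
    return flux_per_um2 * pixel_pitch_um ** 2 * qe

phone = photons_detected(100, 1.4, 0.25)   # ~1.4 um pixels, ~25% QE
ccd = photons_detected(100, 3.69, 0.80)    # 3.69 um pixels, ~80% QE
# ccd / phone ~= 22: the astronomy CCD banks about 22x the signal per pixel
# for the same incoming light, before any stacking of multiple exposures.
```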

  • @Drew_42
    @Drew_42 6 років тому

    "Point is, you can take these images, and you can learn from them." What a great thesis.

  • @ReneSchickbauer
    @ReneSchickbauer 6 років тому

    Strange to think that you still can't buy a consumer camera to capture an image in the same resolution as the Lunar Orbiter missions. And you still need multiple of the most expensive displays available to show a full frame of those pictures. Yay for old-school analog photography!

  • @JimmyBlimps
    @JimmyBlimps 4 роки тому

    That photo of Saturn is fantastic, I've not seen that one before.

  • @willvergano6761
    @willvergano6761 6 років тому

    Hey Scott, I love your videos, they are so brilliantly put together and informative. The last few you've done, following space probes and their journeys have been fascinating.. i'm wondering if you'd be up for doing more of them, and maybe focusing in more detail on the routes they took around planets for gravity assists etc. I am a space science noob, and I've never actually been on KSP but have watched literally all of your videos religiously, please don't ever stop doing them!

  • @georgegarcia566
    @georgegarcia566 4 роки тому

    Awesome vid. Amazing how the audio of the moon landing was matched with the pictures.

  • @-.._.-_...-_.._-..__..._.-.-.-
    @-.._.-_...-_.._-..__..._.-.-.- 6 років тому

    I kinda wish NASA would take regular pictures along with their high tech pictures just to help regular people understand.

  • @thetheflyinghawaiian
    @thetheflyinghawaiian 6 років тому

    Always on top of it, quality stuff Scott.

  • @Darnokk15
    @Darnokk15 6 років тому

    Wow, I always knew that the Lagrangian points were far away from Earth, but somehow I didn't picture them to be THAT far away. And the fact that New Horizons was that far away from Pluto at its closest approach also blew my mind, amazing, I love this video.

  • @SomeMadRandomPerson
    @SomeMadRandomPerson 4 роки тому

    Very intriguing Scott, nice one mate 😎👍🏻🏴󠁧󠁢󠁳󠁣󠁴󠁿

  • @kain5056
    @kain5056 4 роки тому +1

    Man, now that we're about to colonize the Moon, I dream of one day in the future where I have the opportunity to go there, taking my telescope with me, and look at the Earth through it like I look at the Moon now.

  • @kirkula
    @kirkula 6 років тому

    Easily one of your best videos thus far.

  • @VolkerHett
    @VolkerHett 4 роки тому

    As my grandfather once taught me, 35mm this side of the street, 90mm other side of the street, 50mm when you don’t know what to expect.
    35mm on small format film (what the kids call full frame now) being the 63 degrees
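
The 63 degrees here is the diagonal angle of view of a 35 mm lens on a 36 × 24 mm frame (diagonal ≈ 43.3 mm), which follows from the standard rectilinear-lens relation. A quick sketch:

```python
import math

def fov_degrees(focal_mm, frame_dimension_mm):
    """Angle of view across one frame dimension for a rectilinear lens."""
    return math.degrees(2 * math.atan(frame_dimension_mm / (2 * focal_mm)))

diag = math.hypot(36, 24)   # ~43.3 mm full-frame diagonal
# fov_degrees(35, diag) ~= 63 degrees; fov_degrees(50, diag) ~= 47 degrees
```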

  • @Redsauce101
    @Redsauce101 6 років тому

    Started with an ad for Samsung s9. I'm sold.

  • @dwoodman26
    @dwoodman26 4 роки тому +4

    9:54 'Calm down Lovell'
    Words he would remember in Apollo 13...

  • @EstrellaViajeViajero
    @EstrellaViajeViajero 4 роки тому

Now influencers would just photoshop the earth to be closer to the moon's horizon so it wouldn't look like they missed the shot.

  • @CharlesVanNoland
    @CharlesVanNoland 6 років тому

    It's so surreal imagining the astronauts floating around the moon in a tin can, farther than any man has ever been, taking pictures of the lunar earthrise like that.

  • @Wacoal34d
    @Wacoal34d 6 років тому

    Wow the astronauts looking for color film was brilliant! Thanks!

  • @mickstephenson
    @mickstephenson 6 років тому

What's the white dot to the left of the image at 7:49? Whatever it is must be incredibly bright.

  • @IanZainea1990
    @IanZainea1990 3 роки тому

    I love how Lovell got a bit scolded for being so exuberant.

  • @MarcelHuguenin
    @MarcelHuguenin 2 роки тому

    Excellent video Scott!

  • @fotingomaster
    @fotingomaster 4 роки тому

    This was an awesome video! Thank you very much!!!

  • @CoolDudeClem
    @CoolDudeClem 4 роки тому

    I can only imagine all the rage building up in every smartphone owner's mind right now.

  • @tezza48
    @tezza48 4 роки тому +3

    I'd love to see that lunar landing after being run through a DAIN and increased to 30 FPS

  • @Masterhow101
    @Masterhow101 6 років тому

    Awesome video Scott, thanks for that

  • @TomLeg
    @TomLeg 4 роки тому

You have to consider that way out there, the lighting is drastically lower, needing slow exposures or wide apertures. But a super-telephoto provides only a small (high f-number) aperture.
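
The tradeoff described here is just the definition of f-number: focal length divided by the entrance-pupil diameter. The lens sizes below are hypothetical examples:

```python
def f_number(focal_length_mm, pupil_diameter_mm):
    """f-number N = focal length / entrance-pupil diameter.
    Bigger N means less light per unit of sensor area."""
    return focal_length_mm / pupil_diameter_mm

# A 500 mm telephoto with a 62.5 mm pupil is already f/8; doubling the focal
# length without bigger glass pushes it to f/16 -- four times less light.
```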

  • @A.Lifecraft
    @A.Lifecraft 5 років тому

Smartphone cameras are really the exception in the world of photography. They're mostly wide, near-fisheye lenses which, on any camera other than a smartphone, get mounted far less often than your average 70-300 zoom lens. You could do some decent space photography with a 300mm lens, at least those moon- and earthrise shots.

  • @keddiels9843
    @keddiels9843 6 років тому +1

    Is it true that the rings around planets aren't actually solid like they appear in photos as it's just a side effect of having a slow shutter speed to capture more light?

    • @scottmanley
      @scottmanley  6 років тому +2

      They’re not solid, but it’s not a case of shutter speed, but resolving power, once you get really close you can see individual particles, but from a distance it looks like a thick cloud.
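
Scott's point about resolving power can be quantified with the Rayleigh diffraction limit (a sketch; the ~4 mm smartphone aperture is an assumed, typical value):

```python
import math

def rayleigh_arcsec(wavelength_nm, aperture_mm):
    """Smallest resolvable angle, theta ~= 1.22 * lambda / D, in arcseconds."""
    theta_rad = 1.22 * (wavelength_nm * 1e-9) / (aperture_mm * 1e-3)
    return math.degrees(theta_rad) * 3600.0

# ~4 mm phone aperture at 550 nm: ~35 arcsec -- the rings stay a smooth blur.
# A 200 mm telescope: ~0.7 arcsec -- still nowhere near individual particles.
```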

  • @ReneSchickbauer
    @ReneSchickbauer 6 років тому

    Scott, the correct Sagan quote is "a mote of dust suspended in a sunbeam".

  • @paqx3534
    @paqx3534 6 років тому

Really informative! Best Manley video in a while. I really enjoy these types of videos more than KSP nowadays; once you get past the limitations of KSP, aerospace and everything connected to it just has an infinite amount of learning you could do

  • @MythicalPhoebe
    @MythicalPhoebe 6 років тому

    god the cosmos are so beautiful. it kills me that some people actually think this is all fake.

  • @spacecoyote7706
    @spacecoyote7706 6 років тому +1

    Wow, you really cover everything, love your vids!

  • @schlurf6239
    @schlurf6239 6 років тому

    I love the topics you pick for your videos, awesome stuff!

  • @chris-hayes
    @chris-hayes 6 років тому

    Wow NASA's eyes is really cool, I'm definitely going to try that out. Scott, if you get the chance, I think it would be awesome if you did a livestream of just messing around with NASA's Eyes, I think your commentary would be more informative than I could ever get out of messing around with it on my own.

  • @gamermandan4511
    @gamermandan4511 4 роки тому

    Wow 250 at f/11. You gotta have good light for those settings. Moon is definitely bright. Man how exciting that moment must have been.

  • @Niskirin
    @Niskirin 6 років тому

    What I'd love to know is how much better images you could get now (with state of the art technology) vs 40-50 years ago.

  • @Bill_Woo
    @Bill_Woo 6 років тому +1

    All fascinating. However I thought this would be about resolution/pixels with the inexpensive mega-pixelation in phones nowadays.
Other factors that come to mind are handheld stability of a smartphone; the counteracting effect of the smartphone's image-stabilization feature; and that a smartphone shot (unless on a spacewalk or lunar excursion) would be shot from "behind the windshield" so to speak (though that's exactly what Anders/Lovell/Borman did).

  • @taylor1038
    @taylor1038 6 років тому +1

    What kind of aperture, ISO, and shutter speed did they use for some of these photos? There's a lot less light so far out, did they have to use longer shutter speeds and track the target? I heard Lovell had his camera at 1/250th (makes sense for the telephoto lens) and f/11, I wonder what ISO film they used. I'd think it'd have to be pretty high for such a quick shutter speed and small aperture, or is the moon and earth just that bright?

    • @yyunko7764
      @yyunko7764 6 років тому

      film rolls from that time had a common ISO of 400, 800 was quite high, but some 1600 ISO film was available, all of that could be pushed to over 12800 ISO with some quality degradation, but it's unlikely NASA would have used that process

    • @Flying_Basset
      @Flying_Basset 6 років тому +2

      They were shooting in the daylight so Sunny 16 applies. According to the sticker on one of the film-backs (sterileeye.files.wordpress.com/2009/07/apollo11-magazine.jpg ) which suggests 1/250 f/11 for sun and 1/250 f/5.6 for shade, I would guess it was around 100 ISO.

    • @Flying_Basset
      @Flying_Basset 6 років тому +1

      OK, I found it. They had Kodak Panatomic-X (ISO 80) and high-speed Kodak 2458 (ISO 1600) for black and white, for colour they had Ektachrome SO-68 and SO-121. I don't know if those numbers represent ISO (SO stands for "special order"), but consumer Ektachromes at the time had speeds in the range of 32-160 ISO.

    • @taylor1038
      @taylor1038 6 років тому

Yyunko I know my camera isn't too good over 1600 but it's a smaller sensor. I don't have much frame of reference for film camera ISO and how it compares. I guess the larger medium format film helps mask the noise.

    • @CorwynGC
      @CorwynGC 6 років тому

      Oddly enough Earth is a daylight object as well... :-)

  • @NathanBrayTV
    @NathanBrayTV 6 років тому

    Are these videos/photos available anywhere?

  • @Shogojagamokototo
    @Shogojagamokototo 6 років тому

    This was excellent and explained things very concisely. Big thumbs up from me. Can't wait to see how you blow our minds next.

  • @christianbergman711
    @christianbergman711 6 років тому

    Back in the late 60s (early 70s?) when I was just getting into film photography I learned that the correct exposure of the moon with a 35mm SLR was the same as an exposure at the beach on a sunny day. I remember one exposure of the full moon that I took based on the camera's light meter - it came out looking like a daylight picture of the sun through the palm trees (I grew up on the east coast of Florida). So yeah the moon has the same brightness as the road at midday.

  • @terryh6666
    @terryh6666 4 роки тому +1

    Congrats on 1m

  • @rogerchen128
    @rogerchen128 6 років тому

    Great job on the video! I wish you’d have spent some time talking about brightness/exposure and the use of false color in space photography too.

  • @Kelkschiz
    @Kelkschiz 4 роки тому

    Lovely episode. Especially appreciated the Moon Rise sequence. But the whole topic of space photography interests me a lot as an amateur photographer and space nut.

  • @burtlangoustine1
    @burtlangoustine1 6 років тому

    Excuse me Scott, may i ask you, What would you rather hold?
    1. A computer image/composition of a celestial.
    2. A real photo of a celestial (preferably Earth).
    It'll be taken from space and will have the negative with it too. Cheers

    • @scottmanley
      @scottmanley  6 років тому

      I’d go for whichever provided the best resolution and sampling depth.

  • @KanishQQuotes
    @KanishQQuotes 6 років тому

    It is a very humbling experience
    As a child I saw a video of a space shuttle launch on tv with camera attached to the shuttle, looking down as it leaves earth and reaches space
    I felt puny and helpless
    And all the struggles of politics, religion, greed seemed insignificant

  • @seriousmaran9414
    @seriousmaran9414 4 роки тому

    That is including the lens with the sensor and electronics. The lens on a modern phone is tiny and wide angle so it can take pictures of people close up and landscapes. You could put the same electronics in a larger lens and it would produce decent images. The problem is the cost saving is dwarfed by launch costs.
    Using a mobile phone could produce an adequate picture, a modern superzoom could produce all of the moon images without problem.

  • @pilote111
    @pilote111 6 років тому

    Great video! Plz continue the good job

  • @JETZcorp
    @JETZcorp 4 роки тому

    "The world's first everyone-elsie". I love this phrase so much.

  • @SyNcLife
    @SyNcLife 3 роки тому

    8:29 - wow this sequence seems so natural and real

  • @joost199207
    @joost199207 6 років тому

    I love these kinds of video from Scott, very interesting stuff.

  • @bergonius
    @bergonius 6 років тому +2

    The earth rise reconstruction made me wet my eyes a bit.

  • @JP_Stone
    @JP_Stone 6 років тому +1

    Eyes on the Solar System is such an awesome program and all for free. Its up there with and in some ways better than Space Engine.

  • @stella187
    @stella187 6 років тому

    Love these types of videos!

  • @uss_04
    @uss_04 6 років тому

    Now I want these images in 360 and put in a headset. Especially a 360 view of Cassini in its final hours...

  • @rayceeya8659
    @rayceeya8659 6 років тому

Also need to point out that the only reason you see the curvature of the moon is because it was taken from the command module, not the surface of the moon.

  • @tomiplaz
    @tomiplaz 6 років тому

    Really, really good job with this video. Keep it going

  • @PrinceWesterburg
    @PrinceWesterburg 6 років тому +1

    You forgot one other thing; Out at Pluto the light from the sun is so dim. An iPhone would see very little!

    • @scottmanley
      @scottmanley  6 років тому +2

      Did the math, it’s the same as taking pictures 5 minutes after sunset
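
Scott's reply checks out with a quick inverse-square estimate (the illuminance figure is an assumed order-of-magnitude value, not from the video):

```python
SUN_LUX_AT_EARTH = 100_000    # direct sunlight on Earth, order of magnitude
PLUTO_DISTANCE_AU = 39.5      # Pluto's average distance from the Sun

# Sunlight falls off with the square of distance from the Sun:
lux_at_pluto = SUN_LUX_AT_EARTH / PLUTO_DISTANCE_AU ** 2
# ~64 lux -- dim, but roughly twilight levels a few minutes after sunset
```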

  • @CombraStudios
    @CombraStudios 6 років тому +2

    I'm so drunk that I understand everything twice as good compared to when I'm just a sober space enthusiast

  • @GumMagnum
    @GumMagnum 6 років тому

    Space telescopes got nothing on the field of view settings of new players in Team Fortress 2

  • @JD3Gamer
    @JD3Gamer 3 роки тому

    Tldr camera optics are just as important as any other part of taking a photo.

  • @Masoudy91
    @Masoudy91 6 років тому +1

Well, sir, can you enlighten us about aerospike engines again?
I know you covered them in other videos (big fan btw), and I am curious why they aren't used.

    • @adamdapatsfan
      @adamdapatsfan 6 років тому +1

      Quick answer is that the tradeoff isn't good enough. An aerospike is more efficient than a standard deLaval nozzle, which does result in increased payload - but you could also just make your tanks bigger, and achieve pretty much the same effect. Given the relative complexity of a new engine type vs. bigger tanks, most rocket designers choose the latter. The only place aerospikes are really useful is in Single-Stage-to-Orbit design, where efficiency matters much more - but SSTOs are themselves not really worth the complexity at the moment, and perhaps they never will be.
      As far as I know, since Firefly Aerospace switched to a normal nozzle design, only one serious company, Ripple Aerospace, is considering an aerospike design - and they say they haven't actually decided yet.

  • @Steven_Edwards
    @Steven_Edwards 3 роки тому

    And this is why Cell Phones never get good photos of UFOs. It's not that they don't come and visit occasionally... It's just that our wide angle field of view cameras we all carry around with us really suck for capturing a distant small object violating the laws of physics.