raw infrared frames are basically white. you subtract background, atmosphere, bad pixels, cosmic rays, correct for sensor sensitivity... and then end up with something that actually looks like data.
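For anyone curious what that calibration step actually looks like, here's a minimal numpy sketch of the general idea; the frames are synthetic stand-ins, not real Hubble or JWST data, and the names (science, dark, flat) are just placeholders:

import numpy as np

# Synthetic stand-ins; in practice you'd load these arrays from your own FITS files.
rng = np.random.default_rng(0)
science = rng.poisson(50, (100, 100)).astype(float)  # a raw exposure
dark = np.full((100, 100), 10.0)                      # dark/background estimate
flat = rng.normal(1.0, 0.02, (100, 100))              # pixel-to-pixel sensitivity map

calibrated = (science - dark) / flat                  # subtract background, correct sensitivity

# Flag obvious bad pixels / cosmic-ray hits: anything far above a robust estimate of the spread.
median = np.median(calibrated)
mad = np.median(np.abs(calibrated - median))
bad = np.abs(calibrated - median) > 10 * 1.4826 * mad
calibrated[bad] = median                              # crude repair: replace with the median

print(f"mean level {calibrated.mean():.1f}, {bad.sum()} pixels flagged")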
It's interesting that images show more detail in monochrome photography, regardless of whether it's digital or film stock. I'm not a photographer, dabbled a bit in it when I was younger, but I really appreciate black and white photography for that very reason.
Yeah, it's much easier to discern between different levels of brightness than between different hues or saturations. A grayscale image gives the whole range of brightness to a single signal (either one colour or all colours added together). A colour image theoretically has more information, but the brightness channel then has to be shared to additionally display that information, and you lose some detail in that signal.
With a mono camera, every pixel can capture the light. With a one-shot colour camera, 1/4 of the pixels capture red, 1/4 capture blue, and 1/2 capture green; but with mono, you have to use and change filters.
False color and color coding are great examples of how different forms of perception can yield accurate information about the same object. Of course, we couldn't use them if our senses didn't actually work in the first place. In fact, it is our confidence in our own senses that has motivated us to create technologies that vastly expand on the capacities of our sense-based perception, making it possible to acquire information far beyond what we could experience by means of our senses alone. It is our understanding of our senses and how they work that has enabled us to use the information they provide to expand our perception.
This video is well put together and easy to digest, a random recommendation in my timeline but I enjoyed it very much. I've also had the same questions myself but never really cared enough to think hard about it; even so, I'm really thankful for people like you who share your knowledge in the most digestible way.
Not being able to see these colours with your own eyes is never more evident than when trying to see the aurora from light-polluted areas. During the last high-activity period, if you looked up you could BARELY see faint greens and reds in the sky. But take a long exposure with a camera, even an iPhone, and WOW, the colours are striking.
Not true. They are extremely bright in every color if you go north far enough. Don't do it though, or you will never look at them again, since they will never look anything like that unless you go back up north.
tldr; Hubble images are intended to convey scientific information, and at times be beautiful. They are not intended to show you what you would see if you were in a spaceship looking out the window.
Not really. They look very similar to what you would see. Hubble can see a little bit into UV though, so that part gets added. Other than that, it's the same as what your eyes would perceive.
That's not to say that you wouldn't see anything at all. You probably would, just that it would be far dimmer and nowhere near as vibrant as in those images
So if I was at a point in space relative to the "Pillars" "photo", you're saying the colors might be completely different? It might look completely different?
With insanely huge and sensitive eyes (the size of skyscrapers), you would see it's all pinkish, as most stuff in the universe is. With your own eyes, you would see nothing, regardless of how close you are. You would see individual stars if you're close enough, that's it. The Pillars of Creation are visually, from our vantage point, as big as Mare Imbrium on the Moon. These are insanely huge objects.
@@srvafool You wouldn't see it (at least not clearly), but it is still there. It's basically a very dark object against a pitch black background. The universe wasn't created specifically so we can see it. We can use our ingenuity to see things that we would otherwise not be able to. Basically like putting on night vision goggles to see the details in a dark room
It depends; some of them, yes, but others capture wavelengths outside the visible spectrum, e.g. infrared or X-ray. Some Hubble pictures have those shifted into the visible range, so your eyes wouldn't see all the features that are only visible in these wavelengths in real life.
Great video, but it would be nice if you included a photo that simulated what a human would actually see. You note that we wouldn't see colors and details in the standard Hubble pics, but if we were hovering in space and looking at the Sombrero Galaxy, what WOULD we see? We can see Andromeda from Earth even with atmosphere and lights, so ?? I think that's what people were hoping you would show from your title.
In short, "Like an unedited RAW image with 16bits color depth, 100 stops of Dynamic Range, 16.8millian megapixels before compounding, in several dozen puzzle tile pieces, each frequencies of lights seperately, with cosmic microwave background radiation noise."
I think it's also important to note Hubble did define what the colors should be for image processing, so in many ways, we can thank Hubble for helping to make color images consistent.
Likely grey smudges on the outer edge, with increasing brightness and (potentially) some gentle color closer to the core. I would imagine it would be similar to looking at the Milky Way in the night sky from a dark sky location.
Amazing video!!! By the way, you changed my life! I knew the space photos were colorized artificially, but I didn't know the original photos would look so blank.
Maybe because it's too large? I mean, from my experience, RAW files seem to take up a lot of space, but they're crucial for processing, especially stacking.
I'm not a wedding photographer, so I only know of this issue from my friends who do wedding photography. They say it's because they don't want to give out their unfinished work. It's like a baker handing over an unfinished cake. Handing over RAW files can misrepresent what they do. For example, if the RAW files are awfully edited by someone else, then shown to people without the photographer knowing, their name (and business) is now associated with these terrible final images because they "took" the photo. Besides, dealing with RAWs is often a hassle for customers who may not understand how to handle them or might complain about technical imperfections inherent in the format (lots of blurry photos, over/underexposed shots, bad angles, etc.)
@ Logical, but for us astro folks, RAW files are not alien. Always good to ask in advance because if I ever want a photoshoot, I'd want them RAWs. (I hoard data, yum😋)
@droid_ex1438 depends on the context really. Another decent format is .png, which compresses the image losslessly. Regardless of preferences, amateur astrophotographers want to give you an image you like, not some murky dark images where you can barely even see the difference between nebula and space. One time all my RAWs came out seemingly blank and i wanted to delete them till i raised the contrast and found the nebula 😂.
It's funny how he skipped that part. We would see absolutely nothing. The universe is mostly pitch-black void. If we had huge eyes, we would see dim, cyan, wispy structures. If our eyes were gigantic, or we had a monstrously big telescope to collect enough light, objects would appear pink-magenta from the plasma of hydrogen and helium. That is the actual hue of almost all these diffuse, nebular objects.
If it's a conspiracy, it would also be hilarious that they would offer the raw images and that they'd just appear grey unedited. I can already hear the conspiracy believers: "See, that proves that they fabricated this from grey images with nothing on them out of thin air, dude!1&11!//d?"
I like looking at astro photos and then trying to find whatever object I'm looking at in Space Engine. Being able to travel to and fly around the Pillars of Creation is pretty cool.
People miss the point of how amazing these structures are because their brain can't fathom how large they are... There's a cloud of gas 5 light years long!!! You don't understand how incomprehensibly big that is! Color is the least impressive thing about objects in space!
@@radiangamer_king Oh, you mean how you would do it yourself? If you have the three different images, you can put them into Photoshop using the Channels tool and just put one in each of the red, green, and blue channels.
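Outside Photoshop, the same channel trick is a few lines of Python; this is just a sketch with synthetic arrays standing in for your three exported filter frames (the file names are made up):

import numpy as np
from PIL import Image

# Synthetic stand-ins for three grayscale filter frames; swap in your own exports, e.g.
# r = np.asarray(Image.open("my_red_filter.png").convert("L"), dtype=float)
rng = np.random.default_rng(3)
r = rng.normal(80, 10, (256, 256))
g = rng.normal(60, 10, (256, 256))
b = rng.normal(40, 10, (256, 256))

def to_channel(x):
    # Normalize one frame to the full 0-255 range of a display channel.
    x = x - x.min()
    return (255 * x / max(x.max(), 1e-9)).astype(np.uint8)

# Same idea as the Channels tool: one grayscale frame per colour channel.
rgb = np.dstack([to_channel(r), to_channel(g), to_channel(b)])
Image.fromarray(rgb, mode="RGB").save("combined_rgb.png")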
I have a hard time not feeling scammed after these kinds of videos. Like, where is the real-looking pic then? Show us what it would really look like, please?
I'm confused about your question, Packedburrito. The video states that an unedited, raw image from the Hubble Space Telescope would be shown. The unedited image (a seemingly black square) was displayed, but upon digital manipulation, the encoded image appears. This is the "true image". It's what exists in space, and what we would see (if our eyes had extremely long exposure time and could see the parts of the UV and IR intervals of the electromagnetic spectrum).
Thank you for an excellent video. It should be mandatory viewing for the "science deniers" who constantly shout "fake".... If only their feeble minds could accept knowledge.
So what do these pillars look like in real life? OK, as I understand it, it's grey to us because it's far away. But if we were, like, kilometers from them, would it be grey too?
When talking about raw images being linear, it's true that they will look darker, but they are NOT flatter, they will actually have much higher contrast than they should. Gamma correction results in a natural-looking but much lower contrast image.
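A tiny numerical illustration of that (made-up values, just to show the shape of the curve): a gamma correction lifts the darkest values enormously while leaving the bright end almost untouched, which is why the corrected image reads as brighter and lower-contrast.

import numpy as np

linear = np.array([0.001, 0.01, 0.1, 0.5, 1.0])  # linear sensor values scaled to 0..1
corrected = linear ** (1 / 2.2)                   # a typical display gamma

for lin, cor in zip(linear, corrected):
    print(f"linear {lin:.3f} -> gamma-corrected {cor:.3f}")
# 0.001 becomes ~0.043 (a ~40x lift) while 1.0 stays 1.0,
# so the shadows are brightened far more than the highlights.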
"False color" is akin to saying "transposed" in music. Our eyes can only see about an octave of the EM spectrum, but our ears can hear about 10 octaves of sound frequencies. So, we need to transpose most of the EM spectrum to our visible octave... Just imagine how much musical depth would be lost if you could only hear 1 octave of sound and every sound outside of that octave , whether from music or nature, had to be transposed to between A220 and A440, for instance. My takeaway is that "false color" is not "prettier" or more "artistic" than reality and they're definitely not fake, rather, they are a shallow slice of what we would see if our eyes were sensitive enough to the true frequencies and amplitudes of the EM spectrum.
Enjoyable and educational, but the image around 6:11 looks like it came from JWST.
I also noticed that, but the data set came from the Hubble Legacy Archive as WFC3 FITS files. Would love to know why the diffraction pattern on that bright star looks different!
@@ianlauerastro It's because the X-pattern diffraction spikes come from the trusses that hold the secondary mirror. The vertical spike is actually bloom on the sensor itself, where the accumulated charge bleeds along the read column of the sensor, a common problem with CCDs that can even show up in other technologies, just in slightly different ways and less frequently... but they sacrifice sensitivity to avoid it.
If you look at the angles, you'll see 90 degrees between the X, and then the vertical spike probably 45 degrees off (but I'm just guessing, it could be arbitrary, and would be based on the angle of the sensor's pixel grid to the frame of the telescope). JWST, on the other hand, has a few more sources of spikes... it has hexagonal mirrors, meaning each mirror edge will produce a spike as well, giving you a 6-sided star with 60 degrees between each spike. Then it also has a support arm which will form another line, giving you that 7th and 8th spike, 30 degrees off (again, based on the alignment of the support arm to the hexagonal mirrors).
I've been doing astrophotography for a while as well, but have taken a break for a few years. I clicked because I'd never seen raw Hubble data either, though I've thought about it a few times. I loved how you presented this data in such a clean and simple way, but without dumbing it down. Excellent presentation. And you have an amazing voice for it.
Edit: Looking at a few JWST images, one fun thing to notice is how the spikes have the rainbowing, coming from how the different wavelengths interact with the physical obstructions in slightly different locations... but the character and hue pattern shifts depending on whether it's the mirror diffractions or the support arm. Likewise there's a tighter pattern 60 degrees off from the main one. This may be a corner pattern, from the corners of the hexagonal mirrors. I've had an issue with that in an optical design I was helping with at work. Light reflecting off the corner of the optic flares out based on how the surface was machined.
There's a lot to be learned from those diffraction artifacts.
Also, you'll notice JWST doesn't seem to have pixel bloom like Hubble... likely a much more modern sensor controlling it better. Perhaps one of the (relatively) new sCMOS sensors or something? Or maybe it was cleaned up in the processing. I'll have to do some reading on that and maybe look at some raw JWST images.
@@ianlauerastro That's "blooming", an artifact of CCD cameras without an "antiblooming gate", like the ones used in Hubble's cameras. When pixels are saturated they can "overflow", causing nearby pixels to also get saturated. This effect is more or less only visible on bright stars.
@@jakobsahnerah that makes sense! I've been so spoiled by CMOS, I've forgotten about the issues that CCDs suffer from
@@ianlauerastro Modern astrophotography can hardly count as scientific at all. So many steps of Photoshopping and deconvolving the image that it is mostly a work of art and not an actual characteristic of the light. But we can have a middle ground between astrophoto and scientific data; we can call it a "scientific image", where only scientific data is allowed to be displayed, prohibiting "cutting out the stars" and other manipulations like convolution/deconvolution. I enjoy scientific images much more than astrophotos (which serve the completely different purpose of a "WAH" effect on people who have never touched a telescope).
The 1st Hubble image they took, after repairs, was a 10 day exposure of an empty square of space. That image is the Hubble Deep Field! Thousands of galaxies never before seen.
We know!
@TheTuttle99 I didn't.
There's an old video on UA-cam here that has stuck with me for a dozen years now - the Hubble Deep Field was the most important photograph ever taken. To think that every patch of 'empty' sky contains hundreds or THOUSANDS of galaxies we can't see blows my mind and humbles me.
Not just an empty square of space.
A square as big as you could see through a 1mm by 1mm hole in a sheet of paper 1 meter away from your head.
It would take 13.5 million images of that size to photograph the entire sky. So if you had enough time, you could photograph about 135 billion different galaxies with Hubble. Each one with some 100 million stars. And we know that the universe has to be much, much greater than any telescope could physically see.
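Those figures roughly check out as a back-of-the-envelope estimate; here are the commenter's own numbers plugged into Python (nothing official about this, just arithmetic):

import math

# A 1 mm x 1 mm hole viewed from 1 m subtends roughly this many degrees on a side:
patch_deg = math.degrees(math.atan(0.001 / 1.0))          # ~0.057 degrees
patch_area = patch_deg ** 2                                # square degrees per patch

full_sky = 4 * math.pi * math.degrees(1) ** 2              # ~41,253 square degrees in the whole sky
patches = full_sky / patch_area
print(f"patches needed to tile the sky: {patches:,.0f}")   # on the order of 12-13 million

galaxies_per_patch = 10_000                                # roughly what a deep field reveals
print(f"implied galaxies: {patches * galaxies_per_patch:,.0f}")  # ~10^11, close to the 135 billion figure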
10^22 stars!
i used to talk a lot with "amateur" astronomers with tens of thousands of dollars worth of equipment and software, watching them go through the whole process of starting with raw images and working them into stunning, beautiful pictures. So much work goes into making sure those edits aren't artistic license but true to the data they've gathered, or checked against existing data/the thoughts of other astronomers to make sure they aren't presenting Orion as green or something when they want to render what it might look like if you could actually see all that whisper-thin gas and dust with your naked eye. was also fun to watch them argue about what was "truly" true color lol
@@Ionee-q4f that's relatable as an amateur astrophotographer (on the lower side of the expense spectrum). Especially if you change cameras and at the end you go like "Hmmm, is this what Orion looks like? Lemme check some other images." In my personal opinion, it's worse for planetary imaging, where you might have to select the best frames to stack in order to achieve sharp images of a planet.
are they even amateurs anymore? are they now just freelance astronomers?
I saw colour in M 42 one night when the ambient temp was -38C and humidity bone dry. My 10" scope showed hints of pink and green among the greyish-blue.
@Brick_Wall_quote_Entertainment If you are not making money, you are still an amateur. Some of us DO work at 'Pro' levels. It is amazing how we can now buy tools to get results that took NASA etc. huge $$$$$ to get back in the 70s.
Nobody is beating Hubble as an amateur, but results from a few $1000 worth of kit are still mindblowing. If one already has a DSLR or mirrorless camera, $1000 worth of add-ons will produce remarkable results. If starting with nothing, look up the Seestar 50 or Dwarf 3: scope and camera as one unit. Both under $1000.
@Brick_Wall_quote_Entertainment Nowadays they are usually referred to as "citizen scientists"
“took a stab at it”
* shows one of the best amateur eagle nebula images i’ve ever seen
Thanks! Appreciate that!!
wow, I was surprised that incredible photo can be taken as an amateur
For real. I had to pause and stare at it for a minute.
What kind of telescope did you use to take it?
The term “false color” pre-dates digital imagery, going back to film. Around WWII, ‘infrared false-color film’ was created for camouflage detection.
Silver photographic plates skewed heavily toward blue wavelengths.
The reality is, every photograph is false color to some degree. :)
Cameras see color differently than human eyes do, prints and screens show colors differently again, processing is done to all of them, and most humans see colors differently from one another.
@ No camera can exceed the human eye-brain perception of detail.
@@RideAcrossTheRiver Eagles' eyes do; wish we had them!
@ _No camera_ can exceed the human eye-brain perception of detail.
man i loved that raw black and white photo. i wish someone would put all the hubble and jwst images from their raw set, adjust for brightness, and put them on a website. i'd love to look around. the color ones are great for art etc, but the raw realness of the black/white one is awesome.
Since the space telescopes are publicly-funded and belong to the taxpayers, anyone can in fact go online and freely download the raw data from Hubble, JWST and others. A quick google search should produce the archives.
Totally agree, the raw photos have a unique beauty to them.
Would be a lot more difficult with JWST as it's not really taking photos in wavelengths we can typically see. They could shade them black to white, but it would still be a false colour image rather than an underexposed photo.
I used to be a concert photographer - imagine having to capture good photos in constantly changing, low-light conditions surrounded by drunks, except for the three songs you get to go into the fotog pit. your wedding photography metaphor was _on point_ - I'd capture maybe 3,000 photos in a night for a 3-5 photo publisher agreement. I was lucky enough to learn on film, so I got pretty good with composition and timing for the majority of my shots, but I can't imagine doing something like that outside of a digital medium with the constraints.
Any concert stories?
@@ltjjenkinsi’m here for that too!
3 thousand pics for only 5? the best 5 of course, but how many of those 3 thousand pics were good enough to be used? can't be that low
@ plenty were usable. When you do concert photography or low light conditions, you use a feature called “auto bracketing” that takes many pictures in succession with different light settings. Since the light changes so fast at concerts, you have to do that to increase the likelihood of getting a good shot. You’ll recognize this from press briefings when you hear the cameras go “click click click click” in rapid-fire succession - that’s either auto bracketing or (more commonly for news and especially sports) someone has their rig set to constantly take photos as long as the button is held down. You can quite easily hit 5k+ photos over the course of a 6 hour session that way, of which there may be 600-1000 unique photos, plenty of “usable” photos, and a small handful that meet the needs of the story being told, have lucky timing, good composition, and lighting that played nice.
At least with concert photography, there was always a goal to tell a story of the night. That’s very different than just taking random “good” photos, so more photos = higher likelihood of that story. Thankfully, we don’t pay by the film reel any more, so it’s extremely cheap insurance to go crazy with it.
@@ltjjenkins I was shooting for a VERY famous DJ. We had green room shots scheduled which I guess his manager didn’t time well, because we walked in on him getting ..attention.. from an escort. We waited 5 min and did the interview anyway.
By the end of the night we were all smoking a J in the back of his SUV. Turns out some famous people just want to hang out and not be asked the same 6 questions over and over (our interview was purposefully designed to not follow industry shtick). Helped me though, as I got access for shots that none of the other fotogs could get after that for his shows. I ended up climbing a bunch of (very off limits) scaffolding to get some phenomenal wide shots that nobody else was able to do. It got me a lot more work.
That was the best, thoroughly described explanation that I've heard to date on this whole controversy concerning color images we get from space. So much so, that I believe I actually understand it now.
False colors are used to create fake pictures, why is that a good thing? Is Mars really red?
Glad to hear it! That was the goal with this video 😊
This is legitimately one of the best UA-cam videos I've ever seen. Fascinating subject. Perfectly edited, perfectly narrated. You really can't do much better than this.
The amount of cosmic rays visible in each frame was actually surprising to me. I thought it's like "an event that happens" but they are _everywhere!_
The most powerful ones we've recorded have enough energy that if one hit you, it would be like a baseball hitting you at about 100 km/h (roughly 60 mph); one of them was the “Oh-My-God particle”.
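For anyone curious where that comparison comes from, here's a rough check using the commonly quoted energy of the Oh-My-God particle (about 3x10^20 eV) and a standard ~145 g baseball; back-of-the-envelope only:

energy_j = 3.2e20 * 1.602e-19          # ~3.2x10^20 eV expressed in joules, about 51 J
baseball_mass_kg = 0.145
speed = (2 * energy_j / baseball_mass_kg) ** 0.5
print(f"{speed:.0f} m/s = {speed * 3.6:.0f} km/h = {speed * 2.237:.0f} mph")
# ~27 m/s, i.e. roughly 95 km/h or 60 mph.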
Astronauts in orbit have reported brief flashes of light in their vision, even with their eyes closed. These are cosmic rays hitting their retinas.
This is a tight video, even though it's not super short, all the info is concise and to the point, while posing interesting questions that are answered satisfactorily. 10/10!
You've got one of the best video presentations on UA-cam.
Thanks so much!
The big thing to remember about "False Color Imaging" is that the images aren't meant to be "pretty images for your desktop background" - they're _made to be used by scientists for scientific research._ The "false colors" are there because it makes details stand out more than they would in "true color." The fact that some of these scientific false color images _look good_ is just happy coincidence. (And also the work of people - largely volunteers - taking the raw data and messing with the levels/color-mapping to get images that look at least *close* to how it would look to the naked eye.)
They’re not trying to make images that are close to how it would look to the naked eye, though.
The famous picture of the eagle nebula literally has pink stars.
As a photographer myself, I always thought that if you went to those nebulae, there wouldn't be much to see, as the lens of the Hubble is compressing the image. Maybe taking something millions of miles thick and compressing it to a flat image. Kind of like if I zoom into someone with a 500mm lens about 50 yards down the street and New York City is 5 miles in the distance, it makes it look like the subject is super close to the buildings. But when you go to the scene you truly see how far away the buildings are. And it might also depend on your point of view, and if your position changes, which stars are backlighting the nebulae. It probably wouldn't look like the Mutara Nebula battle in Star Trek II.
NASA - April 24th, 1990. Hubble launch date along with the statement, "I don't know what the heck we are going to see, but it's going to be fantastic."
Having watched man walk on the moon ---- thank you NASA. It still brings me to tears.
…allegedly
@@generalmarkmilleyisbenedic8895 - It's kind of funny. A faked moon walk bringing real tears. I just can't decide if there was a winner to The Space Race.
yeah pretty good movie for sure
why didn't the Russians call out the fake moon landing?
Good job talking about bit depth Ian! Keep up the good work!
Thanks so much for watching and the kind words 🙏
@@ianlauerastro So what would they look like to our naked eye if we saw it in person?
@@gabecollins5585 Almost all of Hubble's photos (there are exceptions) are of something we can't see with our naked eyes because they're too dim.
@ I meant like if somehow we were next to one of them hypothetically. Unless that’s what you meant.
@ No I meant if we looked at them from near Earth, as hubble does. Sure they'd look different if you could get closer but I don't know what you'd actually see.
I think about it the way a photographer thinks about RAW files from the camera. A RAW might look washed out or even under/overexposed, but the data that the final amazing exported photograph looks like is still contained within the file. A program designed to interpret and enhance some of the features from the RAW (for instance, Adobe Lightroom) can find the features that a human eye looking at the RAW can't.
Hubble, Webb, et. al., are multispectral imagers (Webb is heavily IR-dominant, but still), so they can 'see' things that the human eye can't see directly. The folks that generate the amazing final photos we see are interpreting non-visible wavelengths as a color, and then adding that to the final photo. In a way, the edited Hubble photos are like what we could see if the human eye could see beyond the visible wavelength, which I think is an amazing thing to think about.
Yes, and what’s also amazing is when objects that are close enough for us to discern even the start of their details reveal these, such as the Orion Nebula and Andromeda. Being some of the brightest objects, when you can detect that purple hue or milky smudge visually, the edited images of other DSOs make sense.
I love the simple explanation of “stretching” in 30 seconds. Great presentation of the subject matter, now subscribed!
OH MY GOD!!!
You explain it in the FUN, Easy to Understand, and make it Interesting to learn about it. THANK YOU. Trully ❤ it
🙏 Thanks for the love!
your photo of the pillars of creation is blowing me away rn
Another often overlooked aspect of the Hubble photos is they're often multiple exposures of the exact same thing with the exact same filters in place stacked on top of each other. Each doubling of stacked images increases brightness by a full stop while noise is only increased by about 1/2 stop. I believe this technique was pioneered by Hubble himself.
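You can see that trade-off in a quick simulation (toy numbers, not real exposures): summing frames doubles the signal each time the frame count doubles, while the noise only grows with the square root of the count, so the signal-to-noise ratio keeps climbing.

import numpy as np

rng = np.random.default_rng(42)
signal_per_frame = 100.0   # arbitrary "true" brightness collected per exposure
noise_per_frame = 20.0     # arbitrary per-frame noise (standard deviation)
n_trials = 100_000         # repeat many times so the measured spread is reliable

for n_frames in (1, 2, 4, 8):
    stacks = sum(signal_per_frame + rng.normal(0, noise_per_frame, n_trials)
                 for _ in range(n_frames))
    print(f"{n_frames} frames: signal ~{stacks.mean():6.1f}, "
          f"noise ~{stacks.std():5.1f}, SNR ~{stacks.mean() / stacks.std():.1f}")
# Each doubling of frames adds a full stop of signal but only ~half a stop of noise.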
This makes complete sense. Photographers often like to capture images with a high dynamic range. They don't want blown out highlights or shadows, so they turn down the contrast, which makes the image appear flat, but captures all of the details, which is better for editing the image later on and make it look as good as possible.
You just now looked all those words up, didn't you?
@@Telephonebill51 Why? And no, I did photography some years ago and the topic of dynamic range regularly comes up there. And if you pay attention, you notice it very quickly. Just try to take a picture of the inside of a room with a window in the background and try to capture all the details in the room and outside. Most cameras can't do that. So you'd turn down the contrast, and if that doesn't help you might try to take several pictures with different exposures and make an HDR picture. And again, an unedited HDR image will look quite flat, because the display's dynamic range often isn't high enough either.
I’ve been waiting for this video. Been wondering why we can’t just see the raw images. Now I know why we can’t see them. Thank you!
Incredible video BTW
I don't understand anti-intellectualism... why would some random scientist dude want to nefariously trick you into believing in a totally fake image? Like, what could he possibly have to gain from that???
Views, recognition and prestige amongst peers. Did you know how many fossils were found to be faked by some evolutionists?
They see scientific evidence as false teachings created by Satan to make believers doubt what's in their religious books
@@AbuFarouq oh boi, we got an "evolutionist" phrase in the replies; we've caught an anti-intellectual who uses cherry-picked disagreements in the scientific community to fallaciously claim the entire field is wrong.
Lemme guess, young earth creationism?
@@suruxstrawde8322 what did the now-deleted comment say?
It's not anti-intellectual. It's a valid response to opportunist science journalism. Like when the recent enhanced images of Jupiter were touted as 'real' images. Scientists do not gain from this. Opportunist science 'journalists' do.
The actual pictures are shown at 6:06.
Great job on this Ian!
Thanks for tuning in dude 🙏 it was cool seeing raw Hubble images for the first time!
In the context of amateur astrophotography, all images are manipulated to some extent to give the final desired look, everything from stacking to background extraction. It's important to keep in mind that regardless of how much editing is applied, the source of information for all these images is still the galaxies themselves. Stacking can really compensate for not-particularly-good cameras in order to reconstruct what a human eye can see (for example, with the naked eye, you can usually see many details on planets like Jupiter or Mars, but on a phone they aren't exactly visible).
Edit: never mind, i took a look at the channel and realised it is in fact an amateur astrophotographer.
Every photo ever taken is manipulated. Every camera that has ever existed distorts reality in multiple ways. They are distorted geometrically, they detect and show colors differently, just as with the levels of light, the detail, noise, noise reduction, etc. etc...
Your info is still valid. Some viewing this page may not yet understand all that goes into making the images.
Just like phone cameras do
@@Sonnell the main point stands that these images, or other images of space, aren't "fake", but we could say abstracted. It kind of plays on the idea that people don't know what raw images look like, and the possibility that it's too "enhanced". I think the most realistic you can get is with film photography, as there is not much room for editing other than filters and development alterations, though granted, it is also an abstraction of reality.
@@joeshmoe7967 it's hard to tell whether others have stated the same and it therefore adds nothing to the conversation. I am glad my information was useful to some extent.
Cameras can capture all kinds of things that our eyes never could...absolutely. Time to find these raws and play!!! lolol
Good vid, my dude.
Great video - and excellent picture of the eagle btw 😍! I only realized about three years into my astrophotography-journey that this beautiful nebula with its fascinating center sits right „in front of me“ in summer! So two years ago I pointed my Nexstar 6SE at it (on an Alt-Az mount), screwed as always my daily dslr behind it, took many dozens of 15 second exposures, stacked them and BANG there they were, these pillars of creation, right in the middle 🤩!! Nowhere near your picture of course, you know the limitations of a not modified uncooled camera and a mount without guiding, but still! Out of my backyard! Processed on my little MacBook Air! I still love that picture. And as you said, this made these iconic Hubble and now JWST-pictures so much more real to me. Thx!
It’s incredible what you can capture from your backyard! That’s what astrophotography is all about!
I just wish they would offer more ‘naked eye’ images.
'Naked eye' images would all be dark squares, unfortunately. Except planets, those are visible.
@@danzoom He probably means "as it would appear with large enough telescope aperture to trigger our color response". And he's right.
Or maybe he meant it in a perverted way. Makes you think.
The telescope isn't there for pretty pictures. It's there for scientific research. Using false colour you can see where different elements are in a gas cloud and what elements make up the cloud. Shooting in long-wave infrared reveals things that simply can't be seen in visible light, because the expansion of the universe has stretched the light waves beyond the visible spectrum. You would never see that with any visible-wavelength telescope, no matter how big it is.
That wouldn't be very interesting. It would look black with a few specks of light/fuzzy blurs.
This was a great explanation. Years ago I found the Hubble image archive and looked up a particular galaxy I'd never seen presented. The images were there, and they showed the galaxy, but it was faint, about what I might see visually through a large telescope - basically a "faint fuzzy". But they also were full of cosmic ray streaks and stuck or dead pixels. I couldn't do anything with them. If I had processing and stacking software - I didn't - and knew how to use them - I don't - I might have been able to assemble an image from the many subs that were there, but I couldn't.
I think they use an algorithm for taking out the cosmic ray streaks. Basically anything that looks like a bright line a single pixel wide can be erased.
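That's roughly the idea when you have several exposures of the same field: a cosmic ray only lands in one frame, so any pixel that is wildly brighter than the per-pixel median of all the frames gets swapped out. A toy sketch of that rejection step (synthetic data, not the actual Hubble pipeline):

import numpy as np

rng = np.random.default_rng(1)
noise_sigma = 5.0

# Five simulated exposures of the same field: flat sky plus noise.
frames = rng.normal(100, noise_sigma, (5, 64, 64))

# Inject fake cosmic-ray hits: very bright pixels that appear in single frames only.
for f in range(5):
    ys, xs = rng.integers(0, 64, 10), rng.integers(0, 64, 10)
    frames[f, ys, xs] += 5000

median = np.median(frames, axis=0)             # per-pixel median across the exposures
hits = (frames - median) > 10 * noise_sigma    # flag anything way above the agreed value
cleaned = np.where(hits, median, frames)       # replace flagged pixels with the median

print("flagged pixels:", int(hits.sum()), "out of", frames.size)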
Nice one, Ian. It's awesome to see your channel going from strength to strength. 💪💪
What a great idea to edit Hubble images - makes up for the cloudy skies we've had here recently!
Thanks so much bud!
I noticed that when I took a picture of the sky. I saw waaaaaaaay more stars than with my eyes, but when I edited it, woow. It multiplied even more!
Well done as usual, Ian, in explaining the deets buddy
Appreciate it dude!!
Finally someone has posted a video on this. You have had to run across my comments on the actual photos from Hubble and what we are presented with. You even hit the "if you were right there to look at the object, it wouldn't look just like the images we are presented with".
For me, the wonder of astronomy comes into play when the naked eye *can* actually discern the start of the details that are revealed so magnificently in edited astrophotography images. When I look at the Orion Nebula, Andromeda or Messier objects like open clusters through my little 80mm telescope, in dark skies I get visual confirmation that these things *do* exist. That’s the most impressive thing. Even our badly adapted human eyes can see the wonders.
Yes. There are many wonders that are visible to the naked observer.
@@williamdiffin28 You mean naked eye.. 😂
been starving for a video exactly like this, i already heard the dozens of explanations but i've been curious about the process and raw data
great content
Honestly, that last wedding photo looked fire
I watched the video to the end and I'm fascinated by what I learned today.
You answered a long lasting question of mine and in such a beautiful way... thanks a lot for this amazing content!
I’m stoked you enjoyed the video!
Wow that was a super cool video. I liked seeing the black and white pictures rather than the ones where they add color.
In other words, Hubble captures black-and-white images with filters isolating specific colors. These raw images look dark and dull due to their linear brightness data. They’re processed to reveal details and combined into stunning color images. "False color imaging" assigns colors to specific wavelengths, helping visualize structures and elements. While these colors aren't what you'd see with your eyes, they reveal real, detailed data about the universe. The process parallels techniques like medical imaging.
This was a very interesting and well explained video. Thank you for the update; it was a pleasure to watch! 👏
The one fact I am gonna remember from this video is that the human eye threshold sensitivity is 100-150 individual photons. That is insane holy cow
The human eye is actually sensitive enough to detect a single photon under the right conditions.
@@tylerdurden3722 Or neutrinos.
Astronauts can close their eyes and see light streaks: neutrino collisions.
I don't understand all of it but, this is the best explanation I have yet found for how these photos are taken. Thank you.
What a great and informative video.
I really enjoyed it. Stay cool dude
This is so well produced, Ian! Fantastic job. Gonna send this to my friends when they ask about this!
I once had an astronomy professor tell me that the universe, to our eyes, on average is a lovely beige, because the small spectrum of light that we can see only gives us that. Your description of the image compilation and manipulation reinforces his statement.
"false color imaging" is misleading, because it's not supposed to be an image as much as it's supposed to be a map.
False hue would be a better term. Since color is inherently defined through luminance as well, and these objects are extremely dark, literally all their images are in false color. Therefore, false hue would fit so much better.
Finally someone with clear answers on this topic, thanks!
How amazing, if not numinous, is it that we live during a time when we are able to see galaxies billions of light years away?
Great work you're doing here. I'm a professional photographer. The RAW images we take are flat but, if you know what you're doing, they should all be properly exposed, with a few here and there a little under- or over-exposed. When they are somewhat underexposed, they can usually be recovered, but there is always degradation in quality. If the image is dark, as these Hubble shots are, to brighten them will result in a horrendous and insanely noisy image with incredible loss of detail. So, I'm intrigued by your 'stretching' - and in the interests of clarity and transparency (at 6:18, you appear to show the 'stretching' but it's an edit - a jump cut as you say 'wow'), it would be helpful to show us a video of how you did that - and what software you used. Because that is simply not possible in regular photography, with the incredible results being what you have shown.
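Not the creator, but for what it's worth: the stretch works here because the FITS data is linear floating point from a very long, stacked exposure on a cooled sensor, so there is real signal down in those shadows rather than just read noise. A rough sketch of the kind of non-linear (asinh) stretch typically used, assuming astropy is installed and using a placeholder file name:

```python
import numpy as np
from astropy.io import fits

# placeholder file name; any Hubble legacy-archive drizzled frame would do
data = fits.getdata("hst_wfc3_f657n_drz.fits").astype(np.float64)

# clip the background pedestal and normalise to the 0..1 range
lo, hi = np.percentile(data, (0.5, 99.9))
norm = np.clip((data - lo) / (hi - lo), 0, 1)

# asinh stretch: roughly linear for faint signal, logarithmic for bright stars,
# so nebulosity comes up without the stars blowing out
a = 0.02
stretched = np.arcsinh(norm / a) / np.arcsinh(1.0 / a)
```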
wow you answered like every question i had about this. great content!
Appreciate it!
Really amazing video!!
Thanks so much, Jack! I really enjoyed your video with Charlie Duke.
raw infrared frames are basically white. you subtract background, atmosphere, bad pixels, cosmic rays, correct for sensor sensitivity... and then end up with something that actually looks like data.
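In sketch form (the file names are placeholders, and real pipelines do a lot more, e.g. drizzling and charge-transfer corrections), the basic calibration step looks roughly like this in Python:

```python
import numpy as np
from astropy.io import fits

# hypothetical calibration products; real pipelines ship these per detector
raw  = fits.getdata("raw_frame.fits").astype(np.float64)
dark = fits.getdata("dark_frame.fits").astype(np.float64)   # thermal background
flat = fits.getdata("flat_field.fits").astype(np.float64)   # per-pixel sensitivity
bad  = fits.getdata("bad_pixel_mask.fits").astype(bool)     # dead / hot pixels

# subtract the background, then even out pixel-to-pixel sensitivity
calibrated = (raw - dark) / np.where(flat > 0, flat, 1.0)
calibrated[bad] = np.nan   # flag unusable pixels for later interpolation
```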
A question few even think to ask, including myself. Almost embarrassed I didn’t ask it myself. Love the video!
It's interesting that images show more detail in Monochrome photography regardless of it being digital or film stock. I'm not a photographer, dabbled a bit in it when I was younger, but I really appreciate black and white photography for that very reason.
Yeah, it's much easier to discern between different levels of brightness than between different hues or saturations. A grayscale image gives the whole range of brightness to a single signal (either one colour or all colours added together). A colour image theoretically has more information, but the brightness channel then has to be shared to display that additional information, so you lose some detail in the luminance signal.
With a mono camera, every pixel can capture the light; with a one-shot colour camera, 1/4 of the pixels capture red, 1/4 capture blue, and 1/2 capture green. But with mono, you have to use and change filters.
Never ceases to amaze me how scientists take photos and then tell us so much information about them
Didn’t know about the cosmic ray strikes 😮😮😮
False color and color coding are great examples of how different forms of perception can yield accurate information about the same object. Of course, we couldn't use them if our senses didn't actually work in the first place. In fact, it is our confidence in our own senses that has motivated us to create technologies that vastly expand on the capacities of our sense-based perception, making it possible to acquire information far beyond what we could experience by means of our senses alone. It is our understanding of our senses and how they work that has enabled us to use the information they provide to expand our perception.
Appreciate the detailed explanation! Thanks
This video is well put together and easy to digest; a random recommendation in my timeline, but I enjoyed it very much. I've also had the same questions myself but never really cared enough to think hard about it. Even so, I'm really thankful for people like you who share your knowledge in the most digestible way.
Not being able to see these colours with your own eyes is never more evident than when trying to see aurora from light-polluted areas. During the last high-activity period, if you looked up you could BARELY see faint greens and reds in the sky. But take a long exposure with a camera, even an iPhone, and WOW, the colours are striking.
Not true. They are extremely bright in every colour if you go far enough north. Don't do it though, or you will never look at them the same way again, since they will never look anything like that unless you go back up north.
you just earned a subscriber! i’m fascinated with astronomy
tldr; Hubble images are intended to convey scientific information, and at times be beautiful. They are not intended to show you what you would see if you were in a spaceship looking out the window.
Not really. They look very similar to what you would see. Hubble can see a little bit into UV though, so that part gets added. Other than that, it's the same what your eyes would perceive.
That's not to say that you wouldn't see anything at all. You probably would, just that it would be far dimmer and nowhere near as vibrant as in those images
So the images are fake and doctored is what you're saying. @MachineGunX2
Some are, tho.
@@caddystube did you watch the video at all?
GREAT CHANNEL...hope to see this community grow. Keep up the good work.
Fantastic video! Perfectly laid out with such attention to detail, while also clear in conveying the amazing information! Fabulous work! 😄
🙏 Glad you enjoyed it!
This answered some questions I always wondered. Thank you. Now I can sleep tonight.
So if I was at a point in space relative to the "Pillars" "photo", you're saying the colors might be completely different? It might look completely different?
Yes. It would look pitch black mostly.
With insanely huge and sensitive eyes (the size of skyscrapers), you would see it's all pinkish, as most stuff in the universe is. With your own eyes, you would see nothing, regardless of how close you are. You would see individual stars if you're close enough, but that's it.
The Pillars of Creation are, visually from our vantage point, as big as Mare Imbrium on the Moon. These are insanely huge objects.
@@lajoswinkler So if we ever could travel that far, we would never see it? How disappointing.
@@srvafool You wouldn't see it (at least not clearly), but it is still there. It's basically a very dark object against a pitch black background.
The universe wasn't created specifically so we can see it. We can use our ingenuity to see things that we would otherwise not be able to. Basically like putting on night vision goggles to see the details in a dark room
@@lajoswinkler so how is a telescope able to see it? I have a 12” dob and was able to capture some of it
Really great video, thanks Ian!
Fantastic information 😮
🙏❤️
Excellent presentation, which assuaged some of my angst over the artificiality of the Hubble images. Thanks.
So suppose we were really out there floating, then what we would ACTUALLY see is just a fainter version of the true-colour images released by NASA?
It depends. Some of them, yes, but others capture wavelengths outside the visible spectrum, e.g. infrared or X-ray. Some Hubble pictures have those shifted into the visible range, so your eyes wouldn't see all the features that are only visible in those wavelengths in real life.
A long-standing troubling question in my mind has finally been resolved. Thank you!
Great video, but it would be nice if you included a photo that simulated what a human would actually see. You note that we wouldn't see colors and details in the standard Hubble pics, but if we were hovering in space and looking at the Sombrero Galaxy, what WOULD we see? We can see Andromeda from Earth even with atmosphere and lights, so ?? I think that's what people were hoping you would show from your title.
True. I got about 3 seconds of what I came to see and the rest was yapping and telling me stuff I knew already.
you can only see andromeda in near perfect conditions, and even then, it just looks like a fuzzy blob
Well done Ian! I really enjoyed this video!
Many thanks!
In short, "Like an unedited RAW image with 16-bit color depth, 100 stops of dynamic range, 16.8 million megapixels before compounding, in several dozen puzzle-tile pieces, each frequency of light captured separately, with cosmic microwave background radiation noise."
I think it's also important to note Hubble did define what the colors should be for image processing, so in many ways, we can thank Hubble for helping to make color images consistent.
Got to ask: if that is not what you would see looking at a galaxy out in space, what would you see?
Likely grey smudges on the outer edge, with increasing brightness and (potentially) some gentle color closer to the core. I would imagine it would be similar to looking at the Milky Way in the night sky from a dark sky location.
Amazing video!!! By the way, you changed my life! I knew the space photos were colorized artificially, but I didn't know the original photos would be so apparently blank
On a side note, genuine question: why do photographers prefer not to send the RAW files? 🤔
Maybe because they're too large? I mean, from my experience, RAW files take a lot of storage space, but they're crucial for processing, especially stacking.
I'm not a wedding photographer, so I only know of this issue from my friends who do wedding photography.
They say it's because they don't want to give out their unfinished work. It's like a baker handing over an unfinished cake. Handing over RAW files can misrepresent what they do. For example, if the RAW files are awfully edited by someone else and then shown to people without the photographer knowing, their name (and business) is now associated with those terrible final images because they "took" the photo.
Besides, dealing with RAWs is often a hassle for customers who may not understand how to handle them or might complain about technical imperfections inherent in the format (lots of blurry photos, over/underexposed, bad angles, etc.)
@ Logical, but for us astro folks, RAW files are not alien. Always good to ask in advance because if I ever want a photoshoot, I'd want them RAWs. (I hoard data, yum😋)
@droid_ex1438 Depends on the context, really. Another decent format is .png, which compresses the image losslessly. Regardless of preferences, amateur astrophotographers want to give you an image you like, not some murky dark frames where you can barely see the difference between nebula and space. One time all my RAWs came out seemingly blank and I wanted to delete them, till I raised the contrast and found the nebula 😂.
You're a natural storyteller. Great video.
The only thing you left unanswered was what we WOULD see when looking at these objects.
Probably nothing
It's funny how he skipped that part.
We would see absolutely nothing. The universe is mostly pitch-black void. If we had huge eyes, we would see dim cyan wispy structures. If our eyes were gigantic, or we had a monstrously big telescope to collect enough light, the objects would appear pink-magenta from the plasma of hydrogen and helium. That is the actual hue of almost all these diffuse, nebular objects.
@ would you not at least see stars everywhere?
@ A few, but that's about it
This is fascinating, you’re an excellent teacher Ian! Thank you!
Thank you kindly!
If it's a conspiracy, it would also be hilarious that they would offer the raw images and that they'd just appear grey unedited. I can already hear the conspiracy believers: "See, that proves that they fabricated this from grey images with nothing on them out of thin air, dude!1&11!//d?"
They did exactly that. Raw images are just black and grey.
@@JH-6g5 Who did that?
I like looking at astro photos and then trying to find whatever object I'm looking at in Space Engine. Being able to travel to and fly around the Pillars of Creation is pretty cool.
People miss the point of how amazing these structures are because their brain can't fathom how large they are... There's a cloud of gas 5 light-years long!!! You don't understand how incomprehensibly big that is! Color is the least impressive thing about objects in space!
Brilliantly and so entertainingly explained. Thank you for this wonderful piece.
Lol 1:42 the number of times I've introduced a smudge doing the "is this crap on me screen" scratch.
😂
So glad I found this video, I’ve been wondering about this exact question for a long time.
6:14 why was that voice so realistic in my ears
Room reverb and stereo audio, as opposed to the rest of the voiceover which is mono
you learn something new every day on this site!!
great video ❤, but how can we combine 3 different images with different RGB colors into one coloured image?
That’s how all images on every camera are made. They’re just overlaid on each other.
@jaredf6205 no,like if i use a red filter to capture a photo and then green, such photos i want to add.
@@radiangamer_king Oh, you mean how you would do it yourself? If you have the three different images, you can put them into Photoshop using the channels tool and place one in each of the red, green and blue channels.
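If you'd rather do it in code than in Photoshop, here's a minimal sketch with NumPy and Matplotlib, assuming the three filter frames are already aligned and stretched to the 0..1 range (the gradients below just stand in for real data):

```python
import numpy as np
import matplotlib.pyplot as plt

def compose_rgb(r, g, b):
    """Stack three equal-shape 2D frames into an (H, W, 3) colour image."""
    return np.dstack([r, g, b])

# synthetic gradients standing in for three aligned, stretched filter frames
h, w = 256, 256
r = np.tile(np.linspace(0, 1, w), (h, 1))  # red channel: horizontal ramp
g = r.T                                    # green channel: vertical ramp
b = np.full((h, w), 0.3)                   # blue channel: flat fill

plt.imshow(compose_rgb(r, g, b))
plt.axis("off")
plt.show()
```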
thank you
Thank you for making this video, I understand it a lot better now 👍
I have a hard time not feeling scammed after these kinds of videos. Like, where is the real-looking pic then? Show us what it would really look like, please?
Not sure I understand (maybe I’m reading your question wrong) but I showed what the unedited image looks like in this video
I'm confused about your question, Packedburrito. The video stated that an unedited, raw image from the Hubble Space Telescope would be shown. The unedited image (a seemingly black square) was displayed, but upon digital manipulation, the encoded image appears. This is the "true image". It's what exists in space, and what we would see (if our eyes had extremely long exposure times and could see parts of the UV and IR intervals of the electromagnetic spectrum).
This was an incredible video! The YouTube algorithms really know what I like lol! You should make another video about the James Webb Space Telescope.
Thank you for an excellent video. It should be mandatory viewing for the "science deniers" who constantly shout "fake".... If only their feeble minds could accept knowledge.
So what do these pillars look like in real life? OK, as I understand it, it's grey for us because it's far away. But if we were, like, kilometers from them, would it be grey too?
No, they love to boast about their ignorance too much to listen to facts!
Lmao @ "science deniers". Is that the same as "climate deniers" and "vaccine deniers"??
@@caddystube in this house, we believe in soyence!!
@snorman1911 I do too. But this is Soyence fiction, and just entertainment to me. :)
When talking about raw images being linear, it's true that they will look darker, but they are NOT flatter; they will actually have much higher contrast than they should. Gamma correction results in a natural-looking but much lower-contrast image.
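To illustrate what that means in practice (a small sketch, not a universal rule): displays expect gamma-encoded values, so showing linear data directly crushes the midtones toward black and makes the image look harsher than intended, while a standard ~2.2 gamma encoding lifts them back:

```python
import numpy as np

linear = np.linspace(0, 1, 11)     # linear scene luminance, 0..1
encoded = linear ** (1 / 2.2)      # simple gamma-2.2 encoding for display

# mid-grey (18% reflectance) ends up near 0.46 after encoding, which is why
# un-encoded linear data looks dark and overly contrasty on a normal screen
print(round(0.18 ** (1 / 2.2), 2))  # -> 0.46
```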
"False color" is akin to saying "transposed" in music. Our eyes can only see about an octave of the EM spectrum, but our ears can hear about 10 octaves of sound frequencies. So, we need to transpose most of the EM spectrum to our visible octave... Just imagine how much musical depth would be lost if you could only hear 1 octave of sound and every sound outside of that octave , whether from music or nature, had to be transposed to between A220 and A440, for instance.
My takeaway is that "false color" is not "prettier" or more "artistic" than reality and they're definitely not fake, rather, they are a shallow slice of what we would see if our eyes were sensitive enough to the true frequencies and amplitudes of the EM spectrum.
Wow! Great job explaining.