Can I process the JWST data better than NASA?
- Published 14 Jul 2022
- The first round of JWST data was made public just one day after the release of the first images. In this video, I show the best method for downloading the data using the MAST archive, and then give an overview of how I went about processing it! Want to support this channel? Check out my Patreon: / nebulaphotos
This work is based [in part] on observations made with the NASA/ESA/CSA James Webb Space Telescope. The data were obtained from the Mikulski Archive for Space Telescopes at the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS 5-03127 for JWST.
My note on free software in the video was brief and possibly misleading. So here is a method I've now personally tested and can confirm it works!
1. Download and install FITS Liberator: noirlab.edu/public/products/fitsliberator/
2. Drop the jwst..._i2d.fits into FITS Liberator
3. Switch to Image 7 (or whatever the last image is) with the drop down menu
4. Save
5. Open the TIFF file that is saved in Siril, GIMP or whatever free photo editing program you like
Note: I'm still not sure of the best way to crop/scale/register the images. It should be possible in Siril, but I haven't had time to try yet.
Cheers, Nico
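For those more comfortable scripting than clicking, the core of the FITS Liberator steps above (load the image, stretch it, export something a photo editor can open) can be approximated in a few lines of Python. This is a rough sketch, not Nico's workflow: the asinh stretch and the `scale` parameter are illustrative choices of mine, and with a real `jwst..._i2d.fits` file you would read the SCI extension with astropy (`astropy.io.fits.open(path)["SCI"].data`) instead of the synthetic array used here.

```python
import numpy as np

def asinh_stretch(data, scale=0.1):
    """Map linear image data to 0-255 with an asinh stretch,
    similar in spirit to what FITS Liberator applies."""
    d = np.nan_to_num(data.astype(float))
    d -= d.min()                      # shift so the black point sits at 0
    if d.max() > 0:
        d /= d.max()                  # normalize to 0..1
    stretched = np.arcsinh(d / scale) / np.arcsinh(1.0 / scale)
    return (stretched * 255).astype(np.uint8)

# Synthetic stand-in for one filter's SCI image
demo = np.linspace(0, 1000, 64).reshape(8, 8)
out = asinh_stretch(demo)
```

The resulting 8-bit array can then be saved as a TIFF/PNG with any image library and opened in Siril or GIMP as in step 5 above.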
You should find reference photos before coloring stuff
Didn't mean it as harsh as it's written
@@PSwayBeats There is some confusion here from many in the comments. This image HAS to be a false color image since it's an infrared camera/telescope. The infrared light this camera collects is at wavelengths that our eyes cannot perceive, so it doesn't have 'color' in the ordinary sense of the word. Reference images are of no use when colorizing data from multiple filters all outside of the visual spectrum we can see because what is the frame of reference if not our own vision?
@@PSwayBeats Reference to what, more false-color images? This is infrared; we can't see infrared. If you put all these filters at their real wavelengths you would see a black screen.
@@NebulaPhotos From the Hubble telescope, which used normal light, just to get the colors from it to put into that picture
I'll make sure everything I'm saying is fact so if I don't respond it's right
You are correct that the Hubble telescope picture of this region is in normal light (aka visual). However, the photo I've seen of this region from Hubble was taken with narrowband filters (not broadband). The processors used the same logic for narrowband data introduced by Jeff Hester (the Hubble Palette; I have a video on it here: ua-cam.com/video/okEPUA_k2xQ/v-deo.html). The Hubble Palette is also a false-color mapping, but one that the public has gotten quite used to. It is just a convention that caught on for its beauty. True color of this region in the visual spectrum looks like this: en.wikipedia.org/wiki/NGC_3324#/media/File:The_star_formation_region_NGC_3324.jpg
But I don't understand the logic of trying to color infrared data to look like RGB data. For me the true beauty of the infrared data is how different it is from visual with a feeling of X-Ray vision where we can see through the gas clouds to stars we've never seen before.
"Can I process the JWST data better than NASA?"
*Proceeds to immediately ignore all the raw sensor data needed for removing the imaging artifacts*
Then the answer is "no".
It still looks nice, and it's just an artistic image, not scientific paper material.
So, as a newbie here, which files are the raw sensor data? I'd be curious to know how I can make this easier for myself since the video has inspired me to try this
@@arksum6818 see 9:10
@@markmolenaar843 But how do you actually accomplish this? It doesn't look like they would fill in the stars that are blown out or anything. I'm assuming the VAR layers are relevant here
The artefacts on the NASA version were probably removed by subtracting the noise layers/filters which you ignored.
Yes, they provide the raw sensor data, and calibration files so that the data from all the pixels looks similar and becomes smoother, like reality. 30:00
nah it looks highly smoothed
Which you also ignored
Yes, you normally have a darkfield image. The image is otherwise convolved with the sensor response.
Even without a noise layer or low-pass filter (smoothing/blur), it should be easy to remove such patterns since they are specific to these artifacts.
Thanks for unraveling the mystery of acquiring the data, Nico! 'NASA' is actually my friend Joseph DePasquale, lead science imager for the James Webb and, before that, the Chandra X-ray Observatory. He was also an early adopter of PixInsight, and a super guy!
Thanks Warren! Great to know the name of the processor behind the image. Joe did an amazing job! What a reception this image has had. Mission succeeded at showing the public the capabilities of JWST!
@@WhiteGuysMadder what were they expecting? Alien fleets? 8k images of earth like worlds? I think they look incredible!
@@WhiteGuysMadder the images that have been released were from the testing phase. JWST first actual mission has yet to be completed
Joe who?
Sorry I had to do it
@@JustAnotherGuyOnTheInternet the horses name was yesterday
27:50 Those "globules" are freshly ignited baby stars streaking through the nebula at hundreds of miles per second, gathering more dust to form a protoplanetary disk. What we would call "early heavy bombardment" follows as all the primordial material collides to form planets, all of which in that particular star system would be glowing balls of lava.
That's the cutest thing.
Wonderful explanation 🎆
Sounds logical to me.
Btw, the gas is from a dead, exploded star, right? So basically they explode and re-form over millions of years (the gas makes some planets, and other objects too, I guess), forever? They explode, then renew into something else, but not totally different (except us?). Sorry for bad English :)
When a star starts ignition it looks pretty chaotic. I think of it like an old gas engine backfiring and whatnot when it starts.
Thank you Nico, you went over all what I wanted to know about processing with multiple filters and the way to colorize the image.
I've been watching you for 2 years and I still find your videos absolutely amazing. I'm just getting back into astrophotography and I'm excited for my next session. 🤓
I recommend not cropping the images before registering or processing them. Your last step might as well be the cropping.
He did it for a reason. He cropped only a little to cut down on processing time in PixInsight. You don't want to process areas of the image where not all of the layers overlap, as you are going to crop them anyway.
@@hubertanatol9446 they're not talking about the preview selection, but the actual cropping at 14:55
You definitely want to crop out things like integration artefacts and bright/black edges before you do anything else in your linear processing workflow.
Nice job on this!! Looks terrific, Nico. Thanks so much for showing us how to access this information.
You did a pretty good job tbh. And I'm astonished how good it turned out considering you didn't process the dark and noise images.
The pipelines for telescope images can get very complex. Your outcome really shows how amazing the instruments on JWST are.
Ah, should have mentioned, this is level 3 data so already heavily calibrated. I took a look at level 1 data and it was not nearly this pretty. The calibration images are included for scientists who need them for their models. The lines I saw were the overlap of the mosaic tiles. I probably could have used one of those calibration images to reduce them, but the sensor noise was already subtracted.
@@NebulaPhotos
Ok
What I find interesting is that your version is closer to that of Hubble colour-wise. And I definitely prefer HST's images' colour scheme over JWST's. You gave us the detail of JWST and the aesthetics of HST, thanks for that!
Your image looks brilliant. I love the depth and different colors, just like you said.
Absolutely incredible! Thanks for sharing…what an experience! Be safe. Cheers..
I've seen the other critiques of this video in the comments, and I don't know enough to say whether they're valid or not, but I find this to be a very good starting point with a simple, high-level path to follow. I'm interested in making images with the data in color scales aimed more at art than accurate representation, and I feel like I could do that easily now; it's given me a good launching pad to learn more. Thanks for such a good video on what I had felt was a very complex process.
So much to learn from this video, although way over my head! Just knowing the raw data is available stunned me. Seeing the pictures of the different filters almost shocked me by how much they differ. Brave to challenge NASA, but if anyone can do this it's you, and you nailed it! I am sure they over-saturated it on purpose because they knew how many times it would be copied or printed. Let's just not forget the absolutely breathtaking work of everyone involved in even getting this data; JWST is brilliant!
Has NOTHING to do with copying or printing. Like most AMATEURS, they OVER SATURATE EVERYTHING thinking it "looks good". It doesn't. They are rank amateur scientists PRETENDING TO BE ARTISTS.
@@stanislouse4168 but it does look good (at least in this case)! i would even call it absolutly stunning
this is so great thanks for sharing and for all your work in this field!
I'm glad you did this image, so I don't have to, but also I've always wondered how this is done, specifically. Thanks!
Its truly amazing that anyone can take the raw data from essentially the greatest camera ever built, and turn it into an image themselves tailored to exactly how they want.
I've got a lot of skills using Photoshop but have never composited data like this before to create an image. Having watched this video I feel so excited to try this for myself. I also think that once I get comfortable compositing images I'd like to try to use all that extra noise data to clean the image up even further.
I wondered what our more experienced astrophotographers would do with the data if they had it. Great sports at STSci to make it available and give people a fun challenge. Nice work. I'm liking your colors better!
Yes, it was great that STScI dropped the usual six-month wait for data on these initial images. Thanks Danette!
@@NebulaPhotos What about getting rid of the sparks?
@@UA-cam_Stole_My_Handle_Too Why would you want to get rid of the diffraction spikes?
@@jekanyika Because they are consequence of flawed instrument.
@@UA-cam_Stole_My_Handle_Too Without that "flawed" instrument the picture wouldn't exist.
Thanks! It was fun processing this data in PI & PS.
super cool video!! this allowed me to actually try this out with having no prior astrophotography experience. I feel like a master mind!!!
Nicely done, I think you captured more transparency, depth, and contrast within the nebula. You also seemed to better capture the streaking of dust off the "ridge", but NASA might have lost some of that in their de-noising. There are many ways to interpret this data and this was a great educational video.
Thank you for the different perspective!
Thank you for the video, I'll try the process myself!
Very nice rendition of the data, Nico. I think I do prefer the way you have perhaps been able to highlight more of the differences in structure due to the greater variance in colour. I for one would very much like to see a more detailed PS workflow if you don't mind sharing secrets :). I use PI myself but feel I am missing out on the subtleties that can be achieved with a tool like PS (although I have Affinity, I think the methods should transfer).
This is my second try to thank you for doing this. My first comment was deleted (by whom, why?). Your journey through the NASA jargon and lingo was SO HELPFUL. I diverged after 8:00 and developed, aligned and false colored the IR layers in Affinity Photo. I am a beginner's beginner at this, but you have greatly sparked my interest, and perhaps that of others.
Hi John, Thanks for your comment. Apologies - I didn't delete your previous comment, but youtube does have spam blockers that do seem to make strange mistakes occasionally. Affinity Photo seems great, especially for the low price. I'm still learning it. Cheers, Nico
Thank you for showing us your work and that website!!!!
My pleasure. Cheers, Nico
very nice! I just downloaded the data for the southern ring nebula (MIRI and NIRCam) just going to follow the tutorial and try to get all 10 filters in 1 image. thanks man! also i like your processing better!
Aren't the flat image and noise images there to be subtracted from the data to clean up instrument noise and Black levels?
These are Level 3 images so a great deal of calibration has already been done. I looked at the Level 1 images in the bulk download and they were not pretty! But yes, I suspect you are right that some of those images I discarded could be used to further eliminate the mosaic seams I was seeing towards the end of the video. Cheers, Nico
Incredible work! Two questions...
1. Can you find the spread of the filters' light frequencies, and then stretch our visible light over the top so that the highest-frequency infrared filter maps to the highest frequency of visible light? Then do the same for the lowest. Then you have a ruler for where to put all of the middle filters. Would that make everything more to scale in our visible spectrum?
2. Are some of the filters more "strong" than others? Meaning, do some of the frequencies have more light coming in than others? If so, did you/could you compensate for that somehow?
Again, phenomenal processing!
1. Short answer, no. Different objects reflect and absorb different frequencies of light at different levels (yes, that's a lot of "different"). Unless you know what the objects you are looking at are made of, you cannot tell how they would reflect (or absorb, scatter, disperse, diffract, refract, transmit...) visible light based only on reflected infrared (and near-infrared) light. Hence why NASA uses false coloring, although they could use the data from the NIRISS and/or NIRSpec instruments onboard JWST to determine the composition of every object in the image...
2. Yes, filters have different transmission rates, and yes, you can compensate for it. Here, as mentioned briefly in the video, is a link to the official information page on JWST's filters, including a table where you can see the transmission rate of each filter. There are also other calibration parameters to consider, but that's well above my understanding. Here's a very comprehensive paper on infrared calibration in astronomical photometry if you want to learn more on the subject: iopscience.iop.org/article/10.1088/0004-6256/135/6/2245/pdf
@@nic12344 Thank you for the detailed response!
Oh and I forgot to mention that some objects, like dwarf stars or brown dwarfs, emit most, or all, of their light in the form of infrared radiation. Therefore, there would be no way to translate the light emitted by them to visible light. Their light also interacts with other objects so that some are only visible in infrared.
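To make the "chromatic ordering" convention discussed in this thread concrete (shortest wavelength mapped to blue, longest to red, with everything else spread linearly in between), here is a small sketch. The pivot wavelengths are approximate values I've filled in for illustration, and the linear hue ramp is just one possible mapping, not NASA's actual recipe.

```python
import colorsys

# Approximate pivot wavelengths (microns) for six NIRCam filters,
# filled in here for illustration only
filters = {"F090W": 0.90, "F187N": 1.87, "F200W": 1.99,
           "F335M": 3.36, "F444W": 4.40, "F470N": 4.71}

def wavelength_to_rgb(wl, wl_min, wl_max):
    """Chromatic ordering: shortest wavelength -> blue (hue 240 deg),
    longest -> red (hue 0 deg), linear in between."""
    t = (wl - wl_min) / (wl_max - wl_min)
    hue = (1.0 - t) * (240.0 / 360.0)
    return colorsys.hsv_to_rgb(hue, 1.0, 1.0)

wls = list(filters.values())
palette = {name: wavelength_to_rgb(wl, min(wls), max(wls))
           for name, wl in filters.items()}
```

As the replies above note, this ordering is a representational convention, not a physical translation of infrared into visible light.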
After I posted a crapload of comments everywhere, you are the first to address my comment about the blurry (or as you called it, "smoothed") image at the pixel level. Thank you!!!!!!!!
I like how no matter which processed data anyone prefers over others work, it's all fundamentally as accurate as the rest (save for artifact removal).
Considering the size of the full NIRCam spectrum, I think the difference in size could be binning to prevent unnecessary oversampling at longer wavelengths / lower resolution
Ah, didn't think of that. Very interesting
+1 for the algorithm for suggesting something I actually like. More than a few times (many times) I've created my own images from non-user-friendly datasets, bypassing the publicly available visualizations. I really enjoyed this both as a work of entertainment on its own, as well as a reference piece for new information I did not know. It actually made me realize something about myself. Until now, I assumed I chose to make my own visualizations because I had specific esthetic tastes that the available visualizations didn't meet; but now I realize I choose to make my own because I like it. The journey and effort are at least as important as the final result, if not more.
hahaha :)
I totally agree with this sentiment. Thanks for the nice comment Zachary. Cheers, Nico
DUDE! I was looking for 4k images for so long. Can't believe they can't figure that out. Amazing.
Well done. Clearly you are very well acquainted with Photoshop and it shows. Also I agree with your handling of the color saturation and the reluctance to "over-smooth" the final image. I like the minor equipment generated imperfections in that final image. It tells me that when I look at that image I'm approaching the technical limit for what it's possible to see.
Hey, research astronomer here - this is some really nice looking work! I really admire how well thought out and how well executed this was, and I think the result is absolutely breathtaking!
Thanks! What is your research area if you don't mind me asking?
@@NebulaPhotos Oh I love yapping about it 😉 - interstellar gas properties in strongly star forming galaxies, mainly gravitationally lensed ones.
Any upcoming JWST observations you are particularly excited about that could help your research?
@@NebulaPhotos Yeah I'm involved in two programs where we'll use the NIRSpec IFU to observe two gravitationally lensed galaxies, one of them known as the Sunburst Arc. We'll supplement with NIRCam imaging in the six broad filters. Pretty excited about those!
I'm also loosely involved in one of the Early Release Science programs doing something similar, called TEMPLATES. Some of those observations have already come in!
@@thgeremilrivera-thorsen9556 I can't help but want to ask you a question if you don't mind. In the first image from the Webb telescope, we can see some stretched galaxies at different circular levels; apparently it looks like there are two gravitational lenses. My question is: what do you see, is it two or one? Thanks in advance.
I'm now very curious if the NASA processed images have a more "systematic" way of creating the colour, rather than doing it by eye/feel?
Yes, they do. This video is basically clickbait, as just eyeballing Photoshop sliders is great for a hobbyist but not great for accuracy. This video didn't even take into account the noise/calibration files, which are crucial to getting a correct image.
@@leftofzen Ya I thought he was gonna compose a new, maybe more accurate image with all the data, not just play with colors in a way that's even less accurate to reality than what NASA presented.
@@Heyim18bro Accurate to reality? It's in fucking infrared... Whose giant infrared eye are you expecting this person to replicate here? You can't see infrared light with your piddly eyeballs, so an accurate image would be invisible to you. The audacity
@@Heyim18bro Neither image is accurate to what would be seen by the human eye.
@@leftofzen It's not clickbait at all. He goes into how to use FITS files yourself, which is not common outside the astronomy community, and how to start doing edits on your own. If you want the best corrected image, it's already available, you nerd. If you expected someone to create some new super-detailed image beyond what the imaging team at JWST can do, then it's your own fault for believing that.
Love the vids keep them coming 👍
Love this tutorial. Sat down with a plan to try and do it myself; just bought a mono cooled astro cam, so good practice. Not got PixInsight, so it will be with GIMP... let's see what happens. Thanks very much for the great content. Pete
Wonderful work. I prefer your processing on this image, but I also love the NASA version and understand why they went a bit more saturated and cleaned up the blends.
I have processed some Juno images and this info is all really helpful!
I have spent hours trying to find other images on that site. I keep downloading stuff and using that DS9 program. Most are blank. Is there any rhyme or reason to the MAST site? How does one find just images? Would love to see a galaxy that I can edit, but other than what you've shown, I can't find anything.
Keep up the good content! Your directions were good, and got me to the point of getting to where the data is stored, and using a program to look at it
I imagine you would have to enter in very very specific information of a specific image you want to edit yourself. To find images of Galaxies you should probably first checkmark Hubble Space Telescope in the filter and figure out the date of which your image was taken, and additional info too.
@@ET-yc4wb Thanks, I'll try that.
While following (or trying to follow) the video's instructions I'm reminded why the "year of the Linux desktop" has been postponed for the last one or two decades.
The video presenter did a great job with the tools given!
I don't know how I ended up here, but I really liked the official picture and now I am here, my first edit. Awesome video!
Is there a website where users share their own processed images? I'd be interested in seeing different versions.
Flickr?
astronomers use instagram
@@keithancajas4623 that results in roughed up quality though. I'd be interested to see actual size images like the ones released by NASA, but processed by other people like Nico.
r/space
So you made a more esthetically pleasing image... While NASA is more interested in gathering scientifically accurate data. Good job!
It may as well drop some flying pigs for an artistic effect.
good job, good luck bro and keep improving
Thanks Nico…always enjoy your videos ❤️😊🌷🔭
"Can I process the JWST data better than NASA?"
9:24 - "I don't know what this file is"
9:29 - "I don't know what most of these are"
*takes the final composite file and plays with some hue sliders in photoshop*
*adds a ton of noise to the image*
I think you have your answer.
I would be interested in seeing the most realistic image, so I wonder what it would look like if each filter was given the correct wavelength colour, minus the actual redshift of the target
There isn't an actual "correct" wavelength colour, because most of the wavelengths captured by JWST are invisible to the human eye. The most realistic colors would of course be colors in the visible spectrum that we try to match to the invisible wavelengths.
This object is inside the Milky Way, so there is no redshift. What we are looking at is intrinsically infrared and can only ever be seen with an artificial color representation.
Just learning about this option is news to me thank you
This is amazing. Thank you so much!!
The more colorful one from NASA seems more appealing at first but your version really has more depth. The clouds of dust have a more translucent quality. There just seems to be more space. Well done. (27:07 Bravo 👏 exactly!)
No, it does not; it has artefacts all over it
Hi! This is quite nice and fun to do! I just want to point out a little thing about the Photoshop technique you use for giving a color to each layer. Since the color you give to any bandpass is completely arbitrary, it's fine to proceed the way you've done, but it could be even better in terms of color purity. Think of it as wanting a colorscale that goes from black to the purest color you've chosen. The colorize method is the right way, but in your example you haven't obtained a pure black-to-full-color scale for each layer because of the way PS works. Under the hood, PS performs all its operations in the Lab color space, since it is the widest and the most perception-related one. To get a perfect, pure color you have to set saturation to 100% and lightness to 50%; only that way will you get a black-to-pure-color layer. This is down to the Lab color space: if you put any color in the A and B channels but leave the Lightness channel white, the color shifts around the hue circle. Lab has Lightness (which is not luminance) as a +/- variable with the zero point at 50% (medium gray at 128) as the null point. With lightness at 50%, the colors generated by the A and B stimulus pass through untouched (no hue variation); for the same reason, if you set the lightness value in the Hue/Saturation adjustment to 50% you will obtain a perfectly pure color. Sorry if I was too technical, but I thought it was nice to know; maybe it can improve someone's workflow with PS. Thanks anyway for your videos and efforts on this channel, and keep going!!
Please make a video using this technique with the same image! it would be super cool to compare results!
@@sNsReal I still don't have a UA-cam channel but maybe I can try... Anyway, this channel by Nico is extremely good!
@@giovannipaglioli2302 Just download the images, open Photoshop and start recording with OBS! Then upload it to YouTube! That's it!
I never liked Photoshop. None of this looks good to me. Don't tell me numbers or draw a graph. How does it look in the end is what counts.
@@stanislouse4168 That is, for sure, a good approach and the thing that matters in the end. My pleasure is also to understand why. These numbers and parameters relate to simple tools you can use if you need them. It is satisfying to me first to understand how and why we "see" and perceive the outside world the way we do, and then choose tools to realize an idea. Anyone will enjoy this hobby differently, and that is totally fine.
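The "pure black-to-color scale" described in this thread has a simple numeric equivalent outside Photoshop: multiply the stretched grayscale layer by a pure RGB color, so black stays black and full signal becomes exactly the chosen hue. The sketch below (in numpy, my choice, sidestepping Photoshop's Lab internals) also includes a Screen blend, a common way to combine such colorized layers, though not necessarily the exact method used in the video.

```python
import numpy as np

def colorize(gray, rgb):
    """Map a 0..1 grayscale layer onto a black-to-pure-color ramp:
    black stays black, full signal becomes exactly the chosen color."""
    rgb = np.asarray(rgb, dtype=float)
    return gray[..., None] * rgb       # broadcast H x W -> H x W x 3

def screen(a, b):
    """Screen blend, often used to stack colorized filter layers."""
    return 1.0 - (1.0 - a) * (1.0 - b)

# One filter's stretched data, as a toy 4x4 ramp
layer = np.linspace(0.0, 1.0, 16).reshape(4, 4)
red_layer = colorize(layer, (1.0, 0.0, 0.0))
```

Because the color vector is applied multiplicatively, the hue never drifts across the tonal range, which is exactly the property the Lab lightness-at-50% trick above is trying to guarantee inside Photoshop.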
LOVELY....finished images are grandiose !
I like the official image more, but yours is great as well! It really gives me the feeling that there is a "filter" or "tint" to the image
I like your version better! More smooth and wallpaper friendly. Can you provide a link to download the image in full res? I have NASA version as my desktop wallpaper now, but it's too vibrant.
www.flickr.com/photos/2ghouls/52217303292/sizes/l/
@@NebulaPhotos came here to find this - really enjoyed your edit much more than NASA's one which like you said was much heavier on the saturation
It's different. I know it's a matter of preference. But for me NASA blue - orange / gold version looks way more elegant, artistic and striking - the way to go if you're showing first images of JWST to the world if You ask me.
Shame it got laughed off the internet and now its magically broken. Wake up bud.
@@WhiteWishesHD Wow, what a 'typical' internet user you are. I'm not allowed to like it you say? ;) Nothing is broken because you say so, bud.
@@MicInc87 Meteoroid hit has caused 'significant uncorrectable' damage to James Webb Space Telescope
@@WhiteWishesHD Which was compensated for. The telescope is still working well within expected parameters. They addressed this at the same time when mentioning the damage. It's 'significant uncorrectable' in the sense that the damage was enough that JWST is performing ever so slightly worse than before the hit even after making adjustments to the lenses.
You need to stop picking out only what's convenient for your narrative.
I'm so glad you made the point to people that all these image colors are made up, including NASA's versions of them. I tried to explain this to several people on the internet and got some really intense backlash. You did an excellent job on the video and we all appreciate it.
Wow!! 👏 👏 👍👍 I somehow just stumbled on this clip. It's a bit advanced for me but you're so informative. Bet this will help sooo many people. I'm excited to sub & check out the rest of your channel. Will definitely recommend 👌 👍
Similar techniques are used in making trichrome photos with Black and White cameras on Earth, similar to the work of Sergey Prokudin-Gorskii.
One thing worth noting: some of NIRCam's filters are actually in the visible range. The visible range extends to about 0.75 microns, and you can see the last two filters fall within it.
The issue, of course, is that an image made entirely of reds and oranges based on only 2 of NIRCam's 6 channels would not represent a lot of the data Webb captured, and would look kind of bad.
You confirmed what I thought when I saw the NASA release: it looks like the kids at NASA went ham with noise reduction and then sharpening and made it extra crunchy. I do like the more vibrant colors from the NASA version, but I prefer yours overall due to it looking less like a CG construct. Good job, dude.
@@realtraphotography Lots of forums had discussions about how authentic and processed these photos are anyway. People were right; they could see it at first glance. I think seeing how their phones overprocess some photos gave many people, even casuals, a way to spot "fakeness" in their pics.
Very educational. I'd like to try it myself; I am very passionate about space images, and what you show us is kind of a tutorial for making our own interpretations of the universe. Simply fantastic!
Amazing work!!!! ❤
I really like the processing that you did. Giving more depth and separation to the different colors looks really good. It is interesting seeing a "for public" image vs a "for enthusiast" image where Nasa is going for the stronger public wow factor with the high saturation and duo tone style instead of maybe being a bit more true to life. Same thing is done with normal photography where the general public would prefer the higher saturation image even if it is over done compared to a true to life edit showing a more natural image. Both are valid approaches and accomplish the goal but I for one am partial to your edit bringing out more tones.
Good post. This basically acknowledges that altering, colorizing, and manipulating for special effects is taking place. No problem as long as viewers are aware of this...
Just think about it right... somewhere in these pictures... is a bunch of aliens having naughty time
I thought you would say "bunch of aliens wondering at the sky", but....damn was I wrong
this is not true
Excellent work, mate!
Nice video. As an experienced astrophotographer and PI user, your technique and processing goals are pretty much what I would try to do with all those different channels of data.
Thanks Dave. Glad to hear it!
Thanks!
I had no idea that the JWST raw files were available here like this.
Complementary to this video for the non-pros like me out there is a video on how to process the exposure and colours using GIMP:
ua-cam.com/video/w2ME6AXavAo/v-deo.html
No need for other tools, as it handles the FITS format and does colour layers.
Yeah, you should be able to get raw files from other space telescopes like Hubble as well, actually.
Thank you so much for this! I was wondering if GIMP would support this file type because that's the only software I have.
@@megelizabeth9492 not only JWST and HST, but around all other space observatory missions. Spitzer as well btw.
The more choices are made to make the image look "better", the more artefacts are introduced for the viewer unaware of those choices. Of course it's all false colour to begin with, but the false colour can still be a direct, linear representation of what the actual data is. Making the picture prettier doesn't make it "better".
The rendered colour images are always for the aesthetics first and foremost. NASA's image isn't "linear", fyi. Nor are the different layers here linear either. So yes, absolutely moot "point". If you want a direct "linear" representation of the data, just go on, do your idea of linear, and I guarantee it will not only look meh, your representation without any knowledge will just help you get even more wrong ideas.
@@louisvictor3473 I think you understand just fine what my point is and pedantry won't change that. The images are certainly improved for publicity, but the people working on those releases typically have a good scientific understanding of the interpretation of the data represented. Choices they make ought to align with the science, making it easier for laypeople to see what is there and reach conclusions that align with the facts. If you think I meant anything else by giving an oversimplified explanation of how that is done, I don't think your basic mindset is very charitable - you certainly didn't ask.
@@JaapvanderVelde It is not pedantry, and it is not lack of asking. Your point was clear, no need for questions, but it doesn't make your point any less moot, technically wrong in the pedantic or the "charitable" version, and fundamentally meaningless anyway.
To begin with, it is meaningless because the only way the colorized image can give the lay person some insight into "what the fuck am I seeing here besides pretty colors" is with extra information about the coloring process, which expands the realm of usable schemes vastly. So if there is a science communication issue, it is not the one you're complaining about.
But, if you don't believe the above or if we ignore that, what you said is technically outrageously wrong. Because this coloring already _is_ the linear (or "linear" or whatever) representation you say you're asking for, in the "charitable" non-pedantic sense. It has a single color per wavelength, and it preserves the wavelength ordering (shortest source wavelength gets a blue, longest gets a red, etc.). The aesthetic color refinements done after that initial assignment still preserve this relationship. All they do is make each color more distinct and easier to see. So, if we are not being pedantic, this is literally what you claim to wish for. Meaning, you're posturing some virtue while complaining about things you know absolutely nothing about for "reasons", and none of said possible reasons is really a good character trait. Hardly a charitable reading, is it?
The actually charitable reading would be the pedantic one, where you want every pixel color to also be mathematically perfectly proportional in the visible spectrum. As hinted at 22:40 to 23:00, this would be essentially NASA's version but less saturated (they tune it up for aesthetics too). Still plenty wrong, since human color perception doesn't quite work like that (hence all colors get smooshed into a blue or an orange, and all the details you hoped to communicate are lost), but at least it would be correct to say this coloring isn't that and NASA's is more like this, and it doesn't imply as strongly what the other reading implies. Still visibly wrong, but you come out less bad.
So yes, you got a reading exactly as charitable as what you actually wrote deserves.
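To make the "one colour per filter, ordered by wavelength" idea from this thread concrete, here is a small numpy sketch of chromatic ordering. Everything in it is illustrative: the filter names and wavelengths are just rough NIRCam-style values, the image data is random noise standing in for real frames, and the asinh stretch is only one of many possible choices:

```python
import numpy as np

# Illustrative filters with rough pivot wavelengths in microns
# (real names/values would come from the FITS headers).
filters = {"F090W": 0.90, "F200W": 2.00, "F444W": 4.44}
rng = np.random.default_rng(1)
images = {name: rng.random((32, 32)) for name in filters}  # stand-in frames

def stretch(a):
    """One possible nonlinear stretch: asinh, then rescale to 0..1."""
    a = np.arcsinh(10 * a)
    return (a - a.min()) / (a.max() - a.min())

# Chromatic ordering: shortest source wavelength -> blue, longest -> red.
order = sorted(filters, key=filters.get)
b, g, r = (stretch(images[name]) for name in order)
rgb = np.dstack([r, g, b])  # H x W x 3, ready to save or display
```

The point of the sketch is just that the wavelength ordering survives any per-channel stretch: blue still marks the shortest source wavelength and red the longest.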
This was a good way to demonstrate the difference between what NASA shows us and the raw data and pull back the curtain a little bit to separate the science from the fluff.
Thanks for sharing your process. I've been playing with this data since it came out, and I can tell it's going to spoil me with my own data, but whatever :) Also, whenever I watch someone like you using PS, I realize why I stay with PI for all my processing... Nice meeting you at AIC this year.
Great job. NASA has some skilled processors though!
👍 Definite processing taking place. Just wish the masses viewing were aware of this..
I'm confused: why does the JWST take pictures of objects in space in black and white?
bro there ain't no fucking space. space is the distance from here to the end of roamability. never played a video game? there's nothing outside the world. you are in the world, there ain't nothing but the world
@@__________HolySpirit__________ lol no way u actually believe that
@@__________HolySpirit__________ ever looked through a telescope?
JWST is an infrared telescope. It's much harder to produce a high resolution image using visible light for something very far away because of all the dust in space that can block it.
@@Xynic48 There's also red shift.
Thank you, I love learning new things, but most times it's just watching a YouTube vid and I never know how they do it. Very cool, thanks again.
Thanks, great work!
Oh man I feel so silly I've been using the Terminal and downloading 320 GB folders 🤣 Of course there was an easier way, thank you for showing it! And beautifuuuuul final result!
The website confused me too! I thought it meant the bulk download and the stuff on the portal were the same. So I started the bulk download and was trying to figure it out, but I was getting so confused by it that I almost gave up before I found this way. Did you end up stitching the mosaic together yourself, or did you find the same stitched 'i2d' files somewhere in the 320 GB folder structure?
It all depends on what data you are actually trying to visualize. Sure you can make it "look" different (maybe even nicer) but are you then really seeing the data the scientists need? How one processes the data is 100% dependent on what one is researching so you can never say you can do it better as you might ignore the data they were trying to highlight.
This is amazing!
Looks amazing.
Looks amazing, but don't forget that the way they edit the images is from a scientific point of view, so you have to keep that in mind.
Your version is not better than NASA's version, it's just a version you and some other people prefer; but the majority of people, like me, prefer NASA's way.
Very interesting. Subscribed!
Great work!
Short answer: No he can’t
Even though it's false color, what NASA does is take near/middle/far ranges of infrared and translate those sections of wavelengths into RGB. Your processing obviously fails because, while there are gas clouds present, space is black, not blue, even though they obviously don't express it in true black. So your image is just worse in every regard. They may have smoothed out the image a bit to reduce noise, but that doesn't really take away from it, as there's no useful data for the public in those finer details, and there's not much actually lost in the way of overall detail, since contrast and color saturation reinforce the details that are there.
Positivity is spewing out of this comment.
So basically his taste in imaging is wrong and yours is right?
Thank you for this video. I've downloaded a single image from 21 October from JWST, and it's amazing how much detail you have in one single image. It's not as good as the fully false-colored image, but when you see all the stars in the image and realize it's like a very little portion of the sky, I do feel very small.
I can't wait to try this!
No you can't, NASA looks better
Awesome stuff.
Followed your method mostly using GIMP and made a nice wallpaper for my Win11 PC. Looks awesome on my 1440p 165Hz 10bit G-Sync monitor.
Amazing video.
great work awesome idea
Great job! Yours is much more dynamic than the NASA version.
Your version looks great as well, Nico! I too normally appreciate a more varied and delicate color palette. I'm giving it a go myself and have a handle on everything except how to align... in GIMP. Any tips, or do you know of a decent video where this is addressed? Initially, I thought that I could just run the six channels through Autostakkert! or Registax, but that's not going to do it since there's no way to separate the alignment from the stacking with those tools... thanks.
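On the alignment question above: if two channels differ only by an integer pixel shift (a big "if" for real mosaics, which may also need rotation and scaling, where tools like Siril or the `reproject` package are better suited), a plain numpy phase-correlation sketch can estimate the offset without any extra software. Everything here is synthetic; the images are random noise shifted by a known amount:

```python
import numpy as np

def estimate_shift(ref, img):
    """Estimate the integer (dy, dx) by which `ref` was rolled to produce
    `img`, via phase correlation (phase-only FFT cross-power spectrum)."""
    cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(img))
    cross /= np.abs(cross) + 1e-12            # keep phase information only
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shifts = []
    for p, n in zip(peak, corr.shape):
        s = -int(p)                            # peak sits at minus the shift
        if s < -(n // 2):
            s += n                             # unwrap to the signed range
        shifts.append(s)
    return tuple(shifts)

# Synthetic 'channel' shifted by a known amount, then recovered and aligned.
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
img = np.roll(ref, (3, -5), axis=(0, 1))
shift = estimate_shift(ref, img)               # recovers (3, -5)
aligned = np.roll(img, (-shift[0], -shift[1]), axis=(0, 1))
```

For real multi-filter stacks you would pick one channel as the reference and shift each of the others onto it the same way.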
As far as I know, the little fiery yellow-reddish dot (around minute 33 of the video) is thought to be a star that was thrown out of a binary or higher-order system and is heading away at tremendous speed.
Thank you for this great tutorial.
CS Christian
You mean the red ball @ 27:40 right? The entire video is just 33:04
@@ssa7843 Yes. Of course. Sorry 😅
my mind was blown. thank you.
Hi Nico, I loved this video! When the JWST photos came out I was hoping for a way to get the raw data so thanks for that. Is there a possibility you could do another video for the first half of the processing with one of the free alternatives for pixinsight. Definitely seems like pixinsight is a bit more user friendly. Thanks!
Agreed. I tried both of the alternatives mentioned and found them to be virtually incomprehensible.
good job!
Not bad, I definitely wouldn’t say better, but it’s different and cool to see some of the steps to processing the data… Hats off to the team that made that telescope a reality, it’s astonishing the capabilities of that thing especially if you’ve tried astro imaging as a hobby….