Optimizing Your Image Capture with SkyTools | 2023-11-05

  • Published 22 Aug 2024
  • Greg will share the lessons he has learned from his SkyTools Imaging software, which is unique in that it uses a scientific model of the target object, sky, telescope, filter, and camera to predict the signal that will be received on the detector. As you might imagine, this is helpful for optimizing your approach to image capture. Greg's approach embraces new thinking about selecting targets, choosing sub-exposure times, and knowing the best gain to use.
    Greg Crinklaw is an astronomer from Cloudcroft, New Mexico, who is best known as the developer of SkyTools. Greg holds BS and MS degrees in astronomy and an MS in physics. He's worked for NASA as a software engineer in support of the Mars Orbiter Camera, which took thousands of pictures from Mars orbit for a decade. Greg considers himself to be a professionally trained life-long amateur astronomer, who has managed to do just about every kind of astronomy at one time or another.
    Greg's website: observing.skyh...
    🔭 Subscribe to our channel for more astrophotography tutorials: bit.ly/TAIC-su...
    💫 What do you want to learn about? Suggest a topic for a future show: theastroimagin...
    Want more TAIC in your life?
    ✨ Like us on Facebook: / theastroimagingchannel
    ✨ Join our Facebook group: / 1329722943877857
    ✨ Visit our website: theastroimagin...
    Found this helpful? Here are the best ways to support TAIC:
    🌟 Donate: tinyurl.com/Do...
    🌟 Volunteer: theastroimagin...
    Who are we?
    We’re a group of avid astrophotographers who meet online every week to discuss all aspects of astro imaging. From equipment to specialized scripts and everything in between, we connect you with others who are striving to capture and process better images.
    We’re a volunteer-run non-profit, supported by viewers like you.
    #astrophotography

COMMENTS • 14

  • @terrizittritsch745 • 6 months ago

    Interesting software and a nice presentation; thanks for hosting, and this is a fantastic channel!! As for whether the software is practical, or worth $200-$250, for typical backyard imagers, I'm not entirely sure yet. I've purchased the software package and am in the setup phase, which is cumbersome at best. One comment about the suggestion early in the presentation not to image unless you're above 80% of maximum SNR: if that were the case, no one in Vermont would create an image for most of the year. And we have plenty of astrophotographers in Vermont who create pleasing images. Struggling astrophotographers, I might add, given how few clear nights we have. One thing that confuses me is that he shows Chroma LRGB filters in the technical analysis, implying they must be modeled, but they don't seem to exist in the filter pool in the tool, and I'm having to model them myself, picking off numbers from Chroma graphs. My biggest takeaway is that I wonder how much practical considerations outweigh the theoretical ones. Maybe with an observatory in Chile this makes more sense than in a suburb of Detroit, or in Vermont where I am. In the discussion around the PlaneWave telescope, my understanding is that no amount of 'tuning' will fix star movement due to refraction differences across the sky, and that needs to be modeled if you're not actively guiding. My guess is the PlaneWave is running on a sky model that takes these things into account. I'll do a review for our club when I'm done.

  • @ronm6585 • 9 months ago

    Thank you for this.

  • @--Adrian-- • 9 months ago • +1

    Nice presentation. For RGB imaging there are no standard RGB filters with a known central wavelength, filter bandwidth, and extinction coefficient. It is fairly easy to use the CCD equation to compute the exposure time needed for a desired SNR in each filter, with the important distinction that you need to use standard photometric filter parameters to get accurate results. You also need to measure your sky brightness level in each photometric filter. So I would take these computations in SkyTools with a grain of salt. I learned this when I wrote my own Python app for exposure time computations ... easily applicable for science imaging but almost useless for amateur 'pretty pictures'.
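
For reference, the CCD equation Adrian mentions can be sketched roughly as follows; the rates, read noise, and aperture size below are illustrative assumptions, not values from the presentation or from SkyTools:

```python
import math

def snr(t, src_rate, sky_rate, dark_rate, read_noise, n_pix):
    """Classic CCD equation: SNR of a source summed over n_pix pixels
    for a single exposure of t seconds (all rates in electrons/s)."""
    signal = src_rate * t
    noise = math.sqrt(signal
                      + (sky_rate + dark_rate) * n_pix * t
                      + read_noise ** 2 * n_pix)
    return signal / noise

def exposure_for_snr(target_snr, src_rate, sky_rate, dark_rate, read_noise, n_pix):
    """Invert the CCD equation (a quadratic in t) for the required exposure time."""
    a = src_rate ** 2
    b = -target_snr ** 2 * (src_rate + (sky_rate + dark_rate) * n_pix)
    c = -target_snr ** 2 * read_noise ** 2 * n_pix
    return (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)

# Illustrative numbers only: 50 e-/s from the target, 20 e-/s/pix of sky,
# negligible dark current, 3 e- read noise, a 25-pixel measurement aperture.
t = exposure_for_snr(50, 50.0, 20.0, 0.01, 3.0, 25)
print(f"~{t:.0f} s for SNR 50 (check: {snr(t, 50.0, 20.0, 0.01, 3.0, 25):.1f})")
```

As Adrian notes, the accuracy of any such calculation hinges on how well the source and sky rates are known for the actual (non-photometric) filter in use.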

    • @skyhoundastro • 8 months ago

      I disagree. I've been working on this for 15 years. In the software the filters are represented by their full transmission vs. wavelength curve, so there is no need to know the central wavelength or bandwidth. Those values are moot. As for extinction, it can be estimated from simple parameters such as relative humidity and ground-level temperature. What we care about when scheduling is the differential extinction, not the absolute value. Even if the extinction is over- or underestimated, the difference between the extinction at different airmasses is only slightly affected. As for sky brightness, the SkyTools model predicts the spectral distribution of the sky brightness, so it can be applied to any filter in the visible range.
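
The differential-extinction point can be illustrated with made-up numbers; the coefficients and airmasses below are assumptions, and this is not a description of SkyTools' internal model:

```python
# Toy illustration of differential vs. absolute extinction, with assumed values.
k_true, k_est = 0.20, 0.15   # extinction coefficient (mag/airmass): true value vs. a 25% underestimate
x1, x2 = 1.2, 2.0            # two candidate airmasses when scheduling

abs_error = abs(k_true - k_est) * x2   # error in the absolute extinction at airmass x2
diff_true = k_true * (x2 - x1)         # true difference in extinction between the two airmasses
diff_est = k_est * (x2 - x1)           # estimated difference using the wrong coefficient

print(f"absolute extinction error at X={x2}: {abs_error:.2f} mag")
print(f"extinction difference: true {diff_true:.2f} mag vs. estimated {diff_est:.2f} mag")
```

With these made-up numbers the scheduling decision (prefer the lower airmass) is unchanged, and the error in the differential (0.04 mag) is smaller than the error in the absolute extinction (0.10 mag).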

    • @--Adrian-- • 8 months ago

      @@skyhoundastro In the CCD equation the filter bandwidth is a parameter. The wavelength is needed to compute the photon energy, is it not? With regard to sky magnitude, the user should measure it; the more you estimate, the bigger the error.

    • @skyhoundastro • 8 months ago

      @@--Adrian-- Let me just say that I am happy to explain how these things work in my software. But it puts me on the defensive if you make claims that suggest my software can't work. The bottom line is that it has been tested over a wide range of imaging systems and levels of light pollution. So it is better to ask me how it does what it does.
      Computing the photon energy at one point is a gross simplification. It is better to integrate over the full range of wavelengths, taking into account not only the transmission of the filter but also the QE of the detector and all of the other efficiencies that depend on wavelength.
      Your statement that the more you estimate, the bigger the error seems convincing, but in reality it is very often not the case. Errors can combine in many different ways. This is one of the more valuable lessons I learned from physics. In any case, what we often want to know is the difference, rather than an absolute value. And therein lies the trick that makes it all work even when estimates are involved.
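
A sketch of what integrating over wavelength (rather than using a single photon energy) might look like; every curve and constant below is a stand-in assumption, not SkyTools' actual model or data:

```python
import numpy as np

h, c = 6.626e-34, 3.0e8                  # Planck constant (J s), speed of light (m/s)
wl = np.linspace(400e-9, 700e-9, 301)    # wavelength grid, 400-700 nm

flux = np.full_like(wl, 1.0e-8)          # source spectral flux density (W/m^2 per m), stand-in value
filt = np.exp(-0.5 * ((wl - 550e-9) / 40e-9) ** 2)            # toy filter transmission curve
qe = np.clip(0.85 - 1.0e6 * np.abs(wl - 520e-9), 0.0, 1.0)    # toy detector QE vs. wavelength
area = 0.03                              # telescope collecting area (m^2), assumed

# detected e-/s = integral over wavelength of (flux * area / photon energy) * filter * QE
integrand = flux * area / (h * c / wl) * filt * qe
rate = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(wl))   # trapezoidal sum
print(f"detected rate ~ {rate:.0f} e-/s")
```

The single photon energy in the textbook CCD equation is replaced by this kind of integral once full transmission and QE curves are available.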

    • @--Adrian-- • 8 months ago

      @@skyhoundastro I'm not saying it can't work; I don't know how accurate it can be for amateur use. As an example, what would be the surface magnitude of a galaxy in an Astrodon Luminance filter, as an input parameter, for the exposure time computation? I don't think there is such information, simply because nobody does photometry using an L filter, or RGB for that matter. If you don't know the magnitude you can't compute the flux density, and if you can't compute the flux density you can't compute the exposure time.
      I would be more than happy if you could share some documentation or references for the computations involved.
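
For context, the step Adrian describes, going from a surface brightness to a per-pixel signal rate, can be framed with a system-specific zero point; the numbers here are illustrative assumptions, not from SkyTools or Astrodon:

```python
import math

mu = 21.0          # target surface brightness, mag/arcsec^2 (illustrative)
pixel_scale = 1.2  # arcsec per pixel
zero_point = 24.0  # magnitude that produces 1 e-/s through this filter + telescope (assumed)

pixel_area = pixel_scale ** 2                         # arcsec^2 covered by one pixel
mag_per_pixel = mu - 2.5 * math.log10(pixel_area)     # magnitude falling on a single pixel
rate = 10 ** (-0.4 * (mag_per_pixel - zero_point))    # e-/s per pixel
print(f"~{rate:.1f} e-/s per pixel")
```

Adrian's objection is that this zero point (equivalently, the flux density) is not published for L or RGB filters; Greg's position, per the talk description above, is that SkyTools predicts it from a model of the target, filter, and camera rather than from catalogue photometry.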

    • @skyhoundastro • 8 months ago

      I know how accurate it is because, as I have explained, I have tested it extensively. The methodologies I've developed resolve the very issues you're concerned about, even if they might not be immediately apparent to you.
      At this point, I believe further debate won't add value for either of us. We could literally go on like this for 100 questions, because doing this is just that complex. I stand by the capabilities of SkyTools, backed by extensive experience and proven results in the field.

  • @midnightlightning1 • 8 months ago

    Re the SNR example (27 min in) of adding high-SNR L to low-SNR RGB and seeing an overall improvement in SNR: does the L have to be taken with separate subs, or could a super luminance be extracted from the RGB? Instinct says it wouldn't work, but I'm wondering whether the extracted luminance might not be affected by the chromatic noise in the RGB.
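
A back-of-the-envelope way to frame the question, under simplifying assumptions (equal signal and uncorrelated noise in each stacked colour channel); the numbers are illustrative, not anything computed in the presentation:

```python
import math

sig_c, noise_c = 100.0, 20.0   # per-channel stacked signal and noise (e-), assumed

snr_channel = sig_c / noise_c
snr_synthetic = (3 * sig_c) / (math.sqrt(3) * noise_c)   # R+G+B: signals add, noise adds in quadrature

print(f"single colour channel SNR ~ {snr_channel:.1f}")
print(f"synthetic L (R+G+B) SNR  ~ {snr_synthetic:.1f}")   # sqrt(3) x a single channel

# A dedicated L sub passes roughly the whole band at once, so for the same total
# integration time it typically collects more photons than any single colour channel,
# which is why separately captured L can still beat a luminance extracted from the RGB.
```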

  • @davidemancini7853 • 9 months ago

    Fantastic work, Greg! Is there any way to upload your software's schedule into SGP?

    • @skyhoundastro • 8 months ago

      Hello. I didn't think to look at the comments here until today, as this isn't my channel. Yes, SkyTools produces a file that can be imported into the latest Beta version of SGP (I am told).

  • @raeiqmusachi • 9 months ago

    At 42:32 the image on the right shows obviously eccentric stars; there's a tracking error with that long exposure.

    • @davidkennedy3050 • 9 months ago

      Yep, I don't think this comparison shows what he wants it to. Much of the increase in SNR is lost in the blur. What it shows to me is that if an imager is exposure-limited with an entry-level mount, shorter exposures that minimize those errors are better.

    • @skyhoundastro • 8 months ago

      Yes, you are correct. But this was not a measurement of the stars. Measuring or estimating the SNR for a star works very differently than for an extended object. For an extended object, one can assume that even with poor tracking the light is still spread over the pixels being measured, resulting in very little difference in the signal. But it is the noise that really matters! Unfortunately, in this presentation the resolution doesn't allow the noise to be seen easily. But as I said, I measured the SNR directly. The point was to show that the predictions of the model (for the nebula, not the stars) are correct. The other thing this image shows is what I mean by the longest practical exposure time. For this telescope, the ten-minute exposure trailed. This means that, even though a ten-minute exposure is superior to shorter exposures, it is not practical for this telescope. My message was this: the best exposure time is always the longest time practical, and that is true for any amount of light pollution.
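
A small sketch of the "longest practical exposure" argument, comparing stacks of different sub lengths at a fixed total integration time; the per-pixel rates and read noise below are assumptions, not measurements from the presentation:

```python
import math

def stacked_snr(sub_s, total_s, src_rate, sky_rate, read_noise):
    """Per-pixel SNR after stacking total_s / sub_s subs of sub_s seconds each."""
    n_subs = total_s / sub_s
    per_sub = src_rate * sub_s / math.sqrt((src_rate + sky_rate) * sub_s + read_noise ** 2)
    return per_sub * math.sqrt(n_subs)

# One hour total, 0.5 e-/s/pix from the target, 2 e-/s/pix of sky, 3 e- read noise (assumed).
for sub in (30, 120, 600):
    print(f"{sub:4d} s subs -> stacked SNR ~ {stacked_snr(sub, 3600, 0.5, 2.0, 3.0):.1f}")
```

With these assumptions, longer subs always come out slightly ahead, with diminishing returns once the sky swamps the read noise; what caps the sub length in practice is exactly the tracking limit seen in the 42:32 comparison.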