PixInsight Pre-Processing Workflow w/ Blink & Subframe Selector - Plus Faster SSD Setup!

  • Published Feb 3, 2025

COMMENTS • 49

  • @darkrangersinc
    @darkrangersinc  1 year ago

    Join our growing Community for additional help and support including 1 on 1 time if needed!
    www.patreon.com/DarkRangersInc?

  • @tombock336
    @tombock336 1 year ago +1

    Great covering of these topics, Ryan!!!

  • @ekalbkr
    @ekalbkr 1 year ago +1

    You just got one subscriber closer to your goal. I appreciate your technical knowledge and presentation skills. Blink is a great addition to my review process. Subframe Selector will take a while....

    • @darkrangersinc
      @darkrangersinc  1 year ago

      Appreciate it! Just follow it step by step with Subframe and it’s not bad. That’s why I simplified it to just one metric (stars)

  • @simonpepper5053
    @simonpepper5053 1 year ago +1

    Really clear video, very impressed. SubframeSelector is a confusing process and you made it simple. Thanks for sharing, new sub here 👍

    • @darkrangersinc
      @darkrangersinc  1 year ago +1

      Thank you, that's kind of the point of the channel: to make things that are normally difficult approachable. I always try to show how you can go down a rabbit hole if you want, but this is how to get great results with a nominal amount of effort. There's definitely a diminishing return on a lot of these topics when you go super deep into them; for most folks who aren't even using Blink or SubframeSelector at all, simply implementing them at a basic level will lead to big improvements.
      I did watch the PixInsight video on SubframeSelector just to see how they discussed it, and it really didn't delve any deeper. I think it was about a 7-minute video, but only on that one topic, not several like this episode.

  • @deepskydaddy
    @deepskydaddy 5 months ago

    This tutorial is killer. Thank you! Working through my data now, but had an issue with my sulfur. My hypothesis is that there were high clouds that are difficult to see in the subs and made my final image look deepfried. Hoping this process will fix things!

  • @Si-fp2ij
    @Si-fp2ij 1 year ago +1

    Very good topic Ryan! I don't think I scrutinise my lights enough prior to running WBPP, so this was very useful. Also good tips on the use of external HW to store and process the data.
    Have a great Christmas 🎅
    Cheers Si

    • @darkrangersinc
      @darkrangersinc  1 year ago +1

      You too Simon! Yeah, you don't have to go too crazy, but I would at least visually inspect each frame and pick at least one metric to narrow them down in WBPP.

    • @Si-fp2ij
      @Si-fp2ij 1 year ago

      @darkrangersinc Yeah, I always use the default WBPP settings and occasionally Blink, but I think I will add the Blink and SubframeSelector (stars) thresholds to my new workflow 👍🏻

  • @jonrbryan
    @jonrbryan 1 year ago +1

    I do all my file transfers wirelessly. I use a Vonets VBG1200 WiFi bridge at 5 GHz, and my ASIAir+ appears on my home network as a storage device. I can even transfer the frames from my first target and start working on it while the A+ is working on the second target. From my chair in the house, with the scope 100 feet down the driveway. I usually wait and do it all at once in the morning, though.

    • @rvoykin
      @rvoykin 1 year ago

      That's pretty slick! I'm not on that level yet, still walking outside each time. Maybe when I get an Eagle with a camera rotator and an automated flat panel that also acts as the scope cover, I'll have to do this so I never have to leave my house! Thanks for sharing.

  • @ILParr
    @ILParr 1 year ago +1

    Per Adam Block, a few tips I always use in WBPP:
    Create a CosmeticCorrection process icon, e.g. CC_Auto, with Use Auto Detect, Hot Sigma = 3.0 (Cold Sigma not enabled).
    After loading lights, under the Calibration tab / Cosmetic Correction, select the CC_Auto process icon created above and Apply To All Light Frames.
    For satellite rejection, under Lights / Image Integration with Winsorized Sigma Clipping, set Sigma High to 2.2, and for Generalized Extreme Studentized Deviate set ESD Significance to 0.05. Depending on image quality, extreme bright trails either get tediously cloned out or you discard the frame.
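    For anyone curious what Winsorized sigma clipping actually does to a pixel stack, here is a minimal Python sketch. This is a simplification (PixInsight's implementation also applies a variance correction factor and iterates to convergence); the sigma values and sample numbers below are illustrative only.

    ```python
    import statistics

    def winsorized_sigma_clip(stack, sigma_low=4.0, sigma_high=2.2, iterations=3):
        """Tame outliers in a per-pixel stack: repeatedly clamp (winsorize)
        samples beyond k-sigma of the median, then return the mean of the
        clamped samples. High outliers (satellite trails) get pulled in."""
        values = list(stack)
        for _ in range(iterations):
            med = statistics.median(values)
            sd = statistics.pstdev(values)
            if sd == 0:
                break
            lo, hi = med - sigma_low * sd, med + sigma_high * sd
            # Winsorize: clamp outliers to the boundary instead of dropping them
            values = [min(max(v, lo), hi) for v in values]
        return sum(values) / len(values)

    # One pixel across 6 subs; the last sub has a satellite trail through it
    stack = [100.0, 102.0, 98.0, 101.0, 99.0, 400.0]
    robust = winsorized_sigma_clip(stack)   # pulled back toward the ~100 level
    plain = sum(stack) / len(stack)         # heavily skewed by the trail
    ```

    The asymmetric sigmas mirror the tip above: a tight high threshold rejects bright trails aggressively while the loose low threshold leaves faint real signal alone.
    
    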

    • @darkrangersinc
      @darkrangersinc  1 year ago

      Yeah, Winsorized sigma clipping is good for bad trails; I did 2.5 and 4, but for the ones in this image it wasn't needed. For the sake of this video, which wasn't a deep dive into WBPP and just touched on it at the end, the info above might confuse more than help without showing folks how to actually implement it, but for folks that know the settings it could be helpful.

  • @backyardspacedude4238
    @backyardspacedude4238 1 year ago

    Really great to know I’m already following a pre-process workflow that’s recommended by you! Great video.
    Only things I do differently are:
    - Eccentricity and Median as well as Stars on the rejection in Subframe Selector,
    - Don’t worry about the SPCC process for narrowband data as it’s not “correct” anyway,
    - Separate the filters in Subframe Selector (so do separate passes for each filter) because you will have natural differences in star count and median between Ha and Oiii for example.
    - If I’ve weighted the data to Stars in Subframe Selector, there’s no point re-weighting them in WBPP to something else. You can input “SSWEIGHT” (for the Subframe Selector weights) or “WBPPWGHT” (for the WBPP weights) as the FITS keyword when you come to combine the data in the Image Integration process. I’m fairly certain of this but await someone to prove me wrong!
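    For context on what a per-frame weight keyword like SSWEIGHT does downstream: integration scales each frame's contribution by its weight when averaging. A toy sketch of that weighted combine, with hypothetical pixel and weight values:

    ```python
    def weighted_combine(pixels_per_sub, weights):
        """Combine one pixel across N subs as a weighted mean, the way an
        integration step uses a per-frame weight (e.g. read from a FITS
        keyword such as SSWEIGHT)."""
        assert len(pixels_per_sub) == len(weights)
        total_w = sum(weights)
        return sum(p * w for p, w in zip(pixels_per_sub, weights)) / total_w

    # Hypothetical values: three subs, with the low-quality third sub
    # carrying the lowest weight, so it contributes least to the result
    pixels = [1.00, 1.10, 0.90]
    weights = [0.95, 0.80, 0.40]
    combined = weighted_combine(pixels, weights)
    ```
    
    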

    • @darkrangersinc
      @darkrangersinc  1 year ago +1

      Thank you. I think you have to remember I'm just showing how to use the tool for a broad group of people. Not everyone only shoots narrowband, so I mention astrometric for those that don't, some choose different subframe weights and some don't, etc.
      It's really just more of an example. I didn't actually throw out the four lowest-star images, just showed how to do it so people know.
      My job is to show you the door to the rabbit hole and let you jump as far down as you want lol. I typically just use Blink, and I'm monitoring the FWHM of every subframe as they come in and throwing them out before it even hits my computer. I'm eliminating anything that's elongated or has passing clouds, etc.
      But I realize most people don't do that. So I'm just giving an overview of what's possible.
      I typically will load them into SubframeSelector all in one batch because I know on average Ha stars are the smallest and S2 are the largest (for my setup), so I take that into consideration when I'm looking at the data. By that time the data has all been combed through a couple of times, so I'm just looking for any extreme outliers relative to the other images in the same filter set.
      SNR for all the O3 frames was much lower than S2, for example, but that's normal, so I don't throw them out; if there was an O3 that was way off relative to the other O3, then I would, if that makes sense. They're all in there in order by filter, so all the subs from the same filter are next to each other, and I look for spikes within the group.
      It's all going to go through BXT and be sharpened and shrunk as it is.
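      The per-filter outlier check described above can be sketched as: group the measured subs by filter, then flag anything far from its own group's median, so a normally low-SNR OIII frame is only judged against other OIII frames. Metric names, SNR numbers, and the tolerance below are hypothetical.

      ```python
      from statistics import median

      def flag_outliers_by_filter(subs, tolerance=0.5):
          """Flag subs whose SNR deviates from their own filter-group
          median by more than `tolerance` (as a fraction of the median)."""
          groups = {}
          for s in subs:
              groups.setdefault(s["filter"], []).append(s)
          flagged = []
          for filt, frames in groups.items():
              med = median(f["snr"] for f in frames)
              for f in frames:
                  if abs(f["snr"] - med) / med > tolerance:
                      flagged.append(f["name"])
          return flagged

      # Hypothetical measurements: OIII is naturally lower-SNR than Ha,
      # but only O3_03 is an outlier relative to its own filter group
      subs = [
          {"name": "Ha_01", "filter": "Ha",   "snr": 30.0},
          {"name": "Ha_02", "filter": "Ha",   "snr": 31.0},
          {"name": "O3_01", "filter": "OIII", "snr": 12.0},
          {"name": "O3_02", "filter": "OIII", "snr": 11.5},
          {"name": "O3_03", "filter": "OIII", "snr": 4.0},
      ]
      suspects = flag_outliers_by_filter(subs)
      ```
      
      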

    • @backyardspacedude4238
      @backyardspacedude4238 1 year ago

      @@darkrangersinc makes perfect sense! Thanks for replying. I’ve just come across your channel and I’m really enjoying your content, the production value is fantastic! It’s extra cool that you’ve replied so in-depth, too. Thank you!

  • @matadorbeagles1120
    @matadorbeagles1120 1 year ago +1

    Thanks for this description. Apologies if I missed it, but where is the link to what you used to attach the external drive to the ASIAIR and which rugged external SSD are you using with your rig?

    • @darkrangersinc
      @darkrangersinc  1 year ago +1

      No worries, it's in the description.

    • @matadorbeagles1120
      @matadorbeagles1120 1 year ago

      @darkrangersinc Thanks I will look again. I missed it the first time.

  • @joshmanrobertson
    @joshmanrobertson 1 year ago +2

    My workflow is pretty similar. I use the move-files option in the Blink process to move all the bad ones to a trash folder. Then I load what's left into SubframeSelector and measure. I load in my acceptance parameters, etc. I use the Output Subframes option to move the subs to their final location on my NAS. It might take a little bit longer, but I find it simplifies the overall amount of data handling that takes place.
    I'd be interested in knowing more about your folder structure. I've got a system that's working for me (so far), but I'm always keen to see how others do it.

    • @darkrangersinc
      @darkrangersinc  1 year ago +1

      The reason I don't mind leaving a less-than-perfect one on the drive is that it's always nice to have one or two if you want to go back to that target at a later point; you can plate solve off of one of those images so it matches perfectly.
      I do the filing by target name, and then I have the approved folder, and then it automatically does the master folder. I try to keep it pretty simple, so everything is organized by the target. PixInsight automatically creates the registered/calibrated/master/logs folders, so that's all automatic.

    • @joshmanrobertson
      @joshmanrobertson 1 year ago

      Ah, I got it. When I was using the ASIAir, I would create a folder on the external drive called "Targets." I'd drop a single good-quality subframe into this folder for each target. Then I could clean out the Plans and Autorun folders as they became full and still have a record of targets I could plate solve from and go back to as needed.

    • @rvoykin
      @rvoykin 1 year ago

      @joshmanrobertson Yeah, that's a good way as well. On the actual ASIAIR hard drive I have a bunch of targets with 1 or 2 images on there so I can go back to them.
      I may just grab a thumb drive since I have an open USB port (because I have the USB hub), create something, and leave it in there at all times, one of those little tiny ones.

  • @MazzifLOL
    @MazzifLOL 1 year ago +1

    Great workflow example. For subframe selector, I use stars, eccentricity, mean and fwhm. This seems to cull about 20%. Maybe it's too aggressive but I feel the results are quality.
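    A multi-metric cull like this boils down to an approval expression over the measured metrics. A rough Python equivalent, with hypothetical threshold values and metric names (mimicking an expression along the lines of "Stars > 80 && Eccentricity < 0.6 && FWHM < 3.5"):

    ```python
    def approve(frame, min_stars=80, max_ecc=0.6, max_fwhm=3.5):
        """Accept a sub only if it passes every metric threshold,
        like a SubframeSelector-style approval expression."""
        return (frame["stars"] > min_stars
                and frame["eccentricity"] < max_ecc
                and frame["fwhm"] < max_fwhm)

    # Hypothetical measurements for three subs
    frames = [
        {"name": "sub1.fit", "stars": 120, "eccentricity": 0.45, "fwhm": 2.8},
        {"name": "sub2.fit", "stars": 60,  "eccentricity": 0.42, "fwhm": 2.9},  # too few stars
        {"name": "sub3.fit", "stars": 110, "eccentricity": 0.70, "fwhm": 3.1},  # elongated stars
    ]
    kept = [f["name"] for f in frames if approve(f)]
    ```

    Tightening any one threshold raises the cull rate, which is why stacking several metrics, as described above, can reject 20% or more of a night's subs.
    
    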

    • @darkrangersinc
      @darkrangersinc  1 year ago

      Yeah, could be; it depends on the consistency of your data. I don't think it's ever bad to be picky with your data. This was all shot on one night and the end result came out fine. I did end up adding some more Ha and S2 afterwards, but I was happy with the outcome using this method.
      I guess I should say I am always checking the FWHM on most of them as they come in on the ASIAIR and tossing anything that is too high.

  • @JethroXP
    @JethroXP 1 year ago

    Ever since PI and WBPP were updated earlier this year with PSF weighting in WBPP, I've found that SubframeSelector isn't really necessary anymore. I just Blink, toss anything with clouds, trees, or star trails, then put the rest in WBPP. Much simpler. I run Blink with auto animation fully zoomed out looking for big things, then I zoom in to about 3x and just use the arrow keys on my keyboard to cycle through manually, looking for star quality.

    • @darkrangersinc
      @darkrangersinc  1 year ago +1

      I typically monitor them on the way in as they're imaging, check the FWHM of each image, and toss the ones that don't make the cut (and obviously any clouds or trees), so after Blink I'm usually good to go too. I'm just showing folks who have never been exposed how to use the tool.
      I probably use SubframeSelector 10-20% of the time, so while I don't use it often due to improvements in PI, I'm still glad I know how to use it.
      I had several requests, and people like to be able to analyze the data, so I still think it's an important tool to know. Will it make a huge difference in your results? Probably not, but it's still nice to be able to use it, especially over longer multi-night projects during different moon phases, to see their impact on the data or to check up on how any changes to your rig may be affecting your results.
      Once everything is dialed in and you're in a pretty normal routine, it's not super necessary; I would tend to agree.

    • @JethroXP
      @JethroXP 1 year ago

      @@darkrangersinc I've got a permanent observatory in my backyard, so after I start a sequence I usually go to bed and check in the morning. I used to stay up all night and monitor, but that gets hard to do night after night ;-)

    • @darkrangersinc
      @darkrangersinc  1 year ago

      @@JethroXP yeah I’ll usually spot check till bed then take a look on my phone over WiFi with some coffee ☕️. Definitely can’t stay up all night!

    • @JethroXP
      @JethroXP 1 year ago

      @darkrangersinc This is a look at my setup. I also timelapse through a PI workflow. Funny how only a year later the PI workflow I now use is so vastly different.
      ua-cam.com/video/T8DSG1qJylQ/v-deo.html

  • @mikehardy8247
    @mikehardy8247 1 year ago +1

    I'm wondering why such a high-capacity, large drive? The USB ports on the ASIAIR are very close together, making plugging a thumb drive in precarious, so a cable end puts less strain on the A+ ports. Still?
    I use these SanDisk drives and SmallRig clamps on my cinema camera.

    • @rvoykin
      @rvoykin 1 year ago +1

      Just grab a USB 3.0 hub and then you can have plenty of extras 👍🏼. Even with the SanDisk I still have extra spots where I could plug in more.
      I don't know if you heard the part where I explain I used to stack off of the SanDisk drives as well as image onto them; I've only recently added the external NVMe drive. Plus I image a lot, and I like not having a million little thumb drives laying around. A 2TB SSD will fit years of images now that I'm not stacking on it and just imaging. It allows me to have access to so many targets to revisit and add to, all in one place.

    • @mikehardy8247
      @mikehardy8247 1 year ago

      I have a light rig. SW SA GTI. Next year, an AM5!
      Thanks

  • @astrofromhome
    @astrofromhome 10 months ago

    Hi Ryan, great introduction to your workflow. In the two years that I've been using PI, I have never used SubframeSelector. Do you really see any difference if you exclude these few frames with more or fewer stars compared to the full stack? I just exclude via Blink and I get away with good round stars and sharp details.
    As most users already use the RC-Astro processes, I think we don't need to be too picky on the subframe selection. Visual should be OK in that case. By upgrading to a newer notebook and using BXT to correct everything for me, I am down from 6+ minutes per run to about 30 seconds per run to have perfect stars and good sharpness of the object.
    Nevertheless, I admit that SubframeSelector will help everyone who does not use the RC-Astro processes.
    And a big YES to NVMe drives. I have the 850x version of the same drive. It has read/write of 6.9k and above. Going to extend it in a RAID 0 once the second drive has arrived.
    I like the idea of having a portable hard drive connected to the ASIAIR for easy file transfer. Removing the TF card is always a bit of a pain in the mornings. I even had to rescue it from the rain drainage once when I dropped it while still being sleepy.

    • @darkrangersinc
      @darkrangersinc  10 months ago +1

      Good in, good out; garbage in, garbage out. The quality of the data, in terms of the measurements we would be looking at in SubframeSelector, cannot be improved with artificial intelligence. I think it is tricking people into a false sense of security.
      Depending on how methodical your visual inspection is, you can still miss a lot of suboptimal frames that can have a negative impact on your data.
      If you watch my top 10 takeaways for 2023 video, I say that AI is good only if it doesn't cause you to lower your standards on your data collection.
      You forget that if your stars aren't great, that also means the detail in the nebula is going to be soft. Unfortunately, BXT doesn't know the gas structures of every nebula and therefore can only guess what it's supposed to be when sharpening.
      I still think we should be just as picky as we always have been with our data and utilize AI to take that same high-quality data to another level, not lean on it to compensate for lower-quality subframes. But that's just my opinion; folks can do whatever they think is right.

    • @astrofromhome
      @astrofromhome 10 months ago

      @darkrangersinc Thanks a lot, Ryan, for your detailed answer! Sure, garbage in results in garbage out. It has always been like this and always will be.
      Judging from the single frames that you presented, even with YT compression, I at least would have left the 6 or 7 frames in the stack. Sure, the stars had bloated up a bit and some of the nebula areas had weak SNR, but the WSC with the given high/low thresholds would have sorted much of the garbage out, and the rest that was accepted by WBPP would have helped the overall SNR.
      On the one hand, it is true that AI presents a false security if there is real junk. On the other hand, the algorithm can determine from its calculations how structures need to be shrunk, sharpening the structures back to their intended shape.
      Most likely it depends on the object, how good the good frames are, and how many bad frames you have or how bad they are. The frame that you identified visually to be crap I would have sorted out too.
      Maybe my eyes cannot catch these fine details, because at least I cannot see any differences in my photos whether I do my normal visual check or rely on a much stricter mathematical approach. That might be a reason why I sort out bad frames but am not THAT picky on the individual frames.
      Going to check your 2023 learnings video. In there should be a lot to learn for me. 👍🏻

  • @jimangela4589
    @jimangela4589 1 year ago

    I grabbed the Telescope Live data for IC405 and IC410 and the Ha data is bad and won't process. Did you have the same problem?

    • @darkrangersinc
      @darkrangersinc  1 year ago

      Do you know which telescope it was with? Sometimes they shoot it with multiple, so let me know before I check it out. Also, their support is very responsive; usually it'll be Ernesto, and he's awesome to work with, but I can also look into it.

  • @Chiclets1
    @Chiclets1 7 months ago

    Aren't you supposed to calibrate subs before subframeselector?

    • @darkrangersinc
      @darkrangersinc  7 months ago

      @Chiclets1 It depends on what you are trying to evaluate.

  • @nikaxstrophotography
    @nikaxstrophotography 1 year ago +1

    I think I'll stick with Astro Pixel Processor for stacking and everything to do with stacking; it's way more intuitive, actually faster, and I believe, along with others who have used both, that it does a better job. As for processing, PixInsight and Photoshop all the way.

    • @darkrangersinc
      @darkrangersinc  1 year ago +1

      Big fan of APP as well. Love how it basically does the subframe selection for you while it stacks, and then at the end you can pull out any frames that look like they don't make the cut and just stack again, because it's also way, way faster!
      But I don't think most people have both programs, so I had to make one for the majority of people, and people who use APP, like you said, don't really need it!

    • @nikaxstrophotography
      @nikaxstrophotography 1 year ago

      @@darkrangersinc Yep I can see what you are doing and it's fantastic for the community

    • @nikivan
      @nikivan 1 year ago +2

      I wish the APP could save all the selected frames as projects, so they can be opened later to add more data.

    • @darkrangersinc
      @darkrangersinc  1 year ago

      @nikivan I guess the only alternative would be to save them into a separate folder, but that's not as convenient, I guess.

  • @KaidenBainAstro
    @KaidenBainAstro 10 months ago

    Asymmetric solution 😂