Learn how to remove tricky colour fringing from your images, both during RAW development and in the post-production phase of editing. Credits: Photography by James Ritson.
That Defringe filter works like magic! ✨
My muscle memory is still locked into CS5, so the transition to using "Photo" is a bit tricky. However, the speed of delivery and clear explanations are much appreciated. I'm looking forward to seeing more short, bite-sized tutorials that I can come back to.
As always, an excellent tutorial, thank you 👍
Excellent tutorial. You have some of the best ones out there.
Another great tutorial. Thank you, James.
Great video - I am fully switched over to Affinity Photo now!
Thanks for this reminder. Good tutorial, as always.
An excellent video, very well explained.
Are those astrophotography filters also available for Affinity Photo 2 for iPad? The afmacros for astrophotography work great on the iPad Pro M4, but I can't find the filters anywhere to install them.
Very good and succinct.
Very useful! Thanks.
On the moon image, wouldn't it be better to remove the fringe before all the other edits? That way the other edits won't have fringe to amplify and you won't have to use those large values in the defringe filter.
very good!
Is the moon photo suffering from color fringing or chromatic aberration (lens based) ?
I found this tutorial quite useful. But I'm still confused about when to apply the Defringe filter vs. when to apply the Chromatic Aberration filter.
How did you get your AI voice to sound so good?
Trade secret I'm afraid!
so why would you use a destructive filter over a non-destructive one?
Predominantly for performance if you have older hardware such as a weak or unsupported GPU: virtually all operations in Photo are hardware accelerated, but if it has to fall back to CPU (software) compositing, you will find that having multiple live filters in a layer stack can be quite slow. A destructive version is less flexible, but once the filter is rendered it no longer has any effect on performance.
A.K.A. Affinity Photo ASMR
When are we gonna see Generative Fill?
No clue if they even know how to program AI
@@toolazytobeoriginal4587 AI isn't programmed. It's trained. They know how to do it. The question is, is that feature on their roadmap? Another question is, is the time spent on one single feature worth the sacrifice of development time on other features or bugs? You may want generative fill, others may want something else. What do they pick?
I'm sure they will add it at some point, but I'm OK if they never do. With plenty of free tools for this, it's not a priority for those who value this solid and affordable alternative to RENTING software.
@@LV4EVR I have to applaud them for sticking with buy-and-forget when the industry is leaning toward a subscription-based model. As far as pushing the product forward, I think adding some sort of generative AI would be beneficial if implemented correctly, but adding it to the current iteration of Affinity would undoubtedly come at a cost. AI features like what Topaz have on offer might be a better fit (their upscaling is quite good), or some of Adobe's other tricks, like adding colour to black-and-white images or expanding the image canvas.
@@LeBurkaTron I think it's important to call Adobe's "subscription model" what it is: software RENTAL. As soon as you stop renting, you can no longer edit your own files. You are essentially paying to be held hostage. I refuse to ever participate in this evil model, from any company, regardless of features or functionality.