Indeed: not just stat mech ---> biophysics; today we also have stat mech ---> economics (aka "econophysics"). It has yielded more fruitful insights into societies of individuals than traditional macroeconomics has ever been able to assert. Fascinating: the Boltzmann distribution describes the distribution of wealth.
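To make that last claim concrete, here is a minimal sketch (my own illustration, not from the comment above, with made-up parameters) of the standard econophysics toy model: agents exchange random amounts of money in pairwise "collisions", conserving the total, and the stationary wealth distribution relaxes to the Boltzmann-Gibbs exponential exp(-m/T), where the effective temperature T is the average wealth.

```python
# Sketch of the random-exchange econophysics model (hypothetical parameters).
# Total money is conserved, exactly like energy in elastic collisions, and
# the stationary wealth distribution approaches exp(-m/T) with T = mean wealth.
import random

N = 10_000            # number of agents (arbitrary choice)
STEPS = 2_000_000     # number of pairwise exchanges (arbitrary choice)
wealth = [100.0] * N  # everyone starts with the same amount

for _ in range(STEPS):
    i, j = random.randrange(N), random.randrange(N)
    if i == j:
        continue
    # Pool the pair's money and split it at a uniformly random fraction.
    total = wealth[i] + wealth[j]
    frac = random.random()
    wealth[i], wealth[j] = frac * total, (1 - frac) * total

# Crude check: for a Boltzmann-Gibbs exponential with T = mean wealth, the
# fraction of agents below the mean should approach 1 - exp(-1) ~ 0.63.
T = sum(wealth) / N
below_mean = sum(1 for m in wealth if m < T) / N
print(f"T (mean wealth) = {T:.1f}, fraction below mean = {below_mean:.3f}")
```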
Is Maximum Entropy a graduate-level subject? Which course is it usually taught in, if I want to search Google or a course catalog?
Hi, I was reading about independent component analysis (ICA), which is a method for the blind source separation problem. The heart of ICA is finding components for which entropy is maximized. My question is: how is entropy related to the source probability?
+SHUBHAM GUPTA
Hi Shubham, I am not sure I completely understand your question, but I'll try. A source distribution's (i.e., probability's) information content is measured by its entropy; this is true of any distribution. While I am not an expert on ICA, a quick wiki search reveals that the goal of ICA is to identify maximally independent latent variables underlying a distribution. I can see how it would be very reasonable to take a Kullback-Leibler divergence with P(A,B) as p and P(A)P(B) as q, form the relative entropy -sum p*log(p/q), and maximize it over A and B. Maximizing that negative quantity is the same as minimizing the KL divergence itself, which for this choice of p and q is the mutual information between A and B. Essentially, this would give those values of A and B for which the joint distribution is as close to the product of marginals as possible.
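As a rough illustration of the quantity just described (a sketch of my own, not anything from an ICA library), here is how one might compute the KL divergence between a joint distribution P(A,B) and the product of its marginals P(A)P(B), i.e., the mutual information I(A;B):

```python
# KL( P(A,B) || P(A)P(B) ) for a discrete joint distribution: this is the
# mutual information I(A;B). It is zero exactly when A and B are independent,
# which is why driving it toward zero is a reasonable independence criterion.
import numpy as np

def mutual_information(joint):
    """Mutual information (in nats) of a 2-D array of joint probabilities."""
    p_a = joint.sum(axis=1, keepdims=True)   # marginal P(A), shape (n, 1)
    p_b = joint.sum(axis=0, keepdims=True)   # marginal P(B), shape (1, m)
    prod = p_a * p_b                         # product of marginals P(A)P(B)
    mask = joint > 0                         # convention: 0 * log(0/q) = 0
    return np.sum(joint[mask] * np.log(joint[mask] / prod[mask]))

# Independent case: the joint factorizes, so I(A;B) = 0.
indep = np.outer([0.3, 0.7], [0.4, 0.6])
print(mutual_information(indep))   # ~0.0

# Dependent case: mass concentrated on the diagonal, so I(A;B) > 0.
dep = np.array([[0.45, 0.05],
                [0.05, 0.45]])
print(mutual_information(dep))     # ~0.37 nats
```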
Nice
Could you describe how max entropy is used to classify pixel values in a digital image, given a training sample?
+Devin Johnson
Hi Devin, suppose each pixel intensity comes with some uncertainty. Then, accounting for all possible pixel values across all pixels, there is a wide range of acceptable images. MaxEnt lets you converge to the least biased of those intensity assignments across all pixels; the result is the "deconvoluted image". It's a fascinating field, and I recommend googling Skilling / image processing / MaxEnt for more info. Hope this helps!
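For anyone curious what that looks like in practice, here is a highly simplified 1-D sketch of my own (not Skilling's actual algorithm; the trade-off weight LAM and all parameters are hypothetical): among images consistent with blurred, noisy data, climb by gradient ascent toward the one maximizing the entropy S = -sum f log f, traded off against a chi-squared data misfit.

```python
# Toy MaxEnt deconvolution: maximize S(f) - LAM * chi2(f) over pixel
# intensities f, where S = -sum f log f and chi2 is the misfit between
# the blurred candidate image and the observed data.
import numpy as np

rng = np.random.default_rng(0)

def blur(img):
    """Toy 1-D point-spread function: average each pixel with its neighbors."""
    return (np.roll(img, -1) + img + np.roll(img, 1)) / 3.0

# Synthetic "truth" with two point sources, then blurred, noisy data.
truth = np.zeros(64)
truth[20], truth[40] = 5.0, 3.0
data = blur(truth) + rng.normal(0, 0.05, truth.size)

LAM = 50.0                                       # hypothetical trade-off weight
f = np.full(truth.size, max(data.mean(), 1e-3))  # flat (maximum-entropy) start

for _ in range(5000):
    resid = blur(f) - data
    grad_chi2 = 2 * blur(resid)    # our blur kernel is symmetric (self-adjoint)
    grad_S = -(np.log(f) + 1)      # gradient of S = -sum f log f
    f += 0.01 * (grad_S - LAM * grad_chi2)
    f = f.clip(min=1e-6)           # intensities must stay positive

# The two brightest reconstructed pixels should land at or near 20 and 40.
print("two brightest pixels:", np.argsort(f)[-2:])
```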