Very similar to Khan Academy, not just because of the colours but the way he talks. Good stuff!
Totally, he and Sal Khan both have a similar teaching style, love it!
Example first and then the formal proof: that always works in algorithm analysis. Good job!
Simple way to present a basic comp sci algo. This brought back sad memories: my first practical quiz in uni was traversing this algo... I was in first year, and Dustin Hoffman was the closest thing to a "Huffman" I knew then. Later, I was taught via the entropy-principle approach, a much steeper learning curve for beginners... I got a grade C for this (that was 10 years ago and was devastating for me at the time). Of course I have long since familiarised myself with this algo, as any comp sci student should; still a great video...
You are awesome! Most of the videos on YouTube are pre-staged and won't get into the "crossover situation"; now I know how to handle it, thanks!
An absolutely brilliant video. A big thanks to you, sir!
Amazing technique... can't thank you enough... will pass my exam now :))
"Ohhh look at me. I precomputed entropy, and I am not going to tell you how to do it!"
Thanks, understood the code this way.
Please can you make a video on Hamming codes for detection and correction of errors?
Man, your lessons are amazing. Thank you for this, but I have to ask: do you know how to do this in MATLAB?
Thanks a lot this really helped me understand how the code works!
Thank you for the video. It really helped me recall the algorithm.
Thank you... nice explanation!
What's the point of perfectly showing the method of building the Huffman tree but not showing how to find the end result?
What should I do if the sum of two frequencies equals another frequency? Which one, the sum or the other frequency, do I prioritize in the next sum?
Example: frequencies .2 and .8, and the result of a former sum is .2. Do I sum the former sum with .8, or the frequencies .2 and .8?
Hi, I didn't understand how you got H(P) = 2.2016 bits. Can someone explain it to me?
Regards,
Ranjith K K.
Ranjith Kumar Kakumanu the formula for entropy (H) is: H = sum from k=1 to m of P_k * log_2(1/P_k)
+khan sameer ahmed don't forget to take the absolute value |H| at the end (needed if you write the formula with log_2(P_k), since every term then comes out negative).
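For anyone checking the 2.2016 figure, a minimal sketch in Python, assuming the five probabilities discussed further down the thread (.35, .2, .2, .15, .1) are the ones from the video:

from math import log2

# Probabilities mentioned elsewhere in this thread (assumed to match the video).
probs = [0.35, 0.2, 0.2, 0.15, 0.1]

# H = sum over k of P_k * log_2(1/P_k); written this way every term is
# non-negative, so no absolute value is needed.
H = sum(p * log2(1 / p) for p in probs)
print(f"H(P) = {H:.4f} bits")  # prints H(P) = 2.2016 bits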
How can we apply this to text-based (alphabetical) compression? Let's say I have a .txt file with some text in it. How can I apply Huffman compression to that, since it also contains alphabetic characters?
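A minimal sketch, assuming Python: treat every character in the file (letters included) as a symbol and count how often it occurs; those counts play the role of the frequencies in the video. "sample.txt" is a hypothetical file name.

from collections import Counter

# Hypothetical input file; any .txt file works.
with open("sample.txt", encoding="utf-8") as f:
    text = f.read()

counts = Counter(text)  # character -> number of occurrences
total = sum(counts.values())
probs = {ch: n / total for ch, n in counts.items()}
# Feed probs into the same Huffman tree construction shown in the video.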
This may be a little tricky to write here:
H(p) = sum over all symbols x of P(x) * log_b(1/P(x))
I love this stuff! There is just one part I don't get: you're not taking advantage of all the 2-digit codes. Wouldn't it be more optimal if .15 was 01?
01 leads to .25. If 01 were .15, there would be no way to reach .1, since each probability has to be a leaf node.
James Mc Fabulous Hi, thank you for your response. I understand the way this process works, but if you just arbitrarily changed the code of the fourth character to 01, you would reduce your average length from 2.25 bits to 2.1 bits (which somehow is lower than the entropy). Wouldn't that code be more optimal?
Tristan Slater no problem :)
so if the fourth character was changed to 01 you would have
.35 with code 00
.2 with code 10
.2 with code 11
.15 with code 01
.1 with code 011
this conflicts with the prefix property and creates ambiguity with decoding.
For example, say you wanted to decode 011011011011:
there's no single answer.
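To make the ambiguity concrete, a small sketch (assuming Python) that enumerates every way that bit string can be split into codewords under the modified code:

codewords = {"00", "10", "11", "01", "011"}  # the non-prefix-free code above

def parses(bits, prefix=()):
    # Yield every decomposition of bits into codewords.
    if not bits:
        yield prefix
        return
    for w in codewords:
        if bits.startswith(w):
            yield from parses(bits[len(w):], prefix + (w,))

for p in parses("011011011011"):
    print(" ".join(p))
# Prints several distinct decodings, e.g. "01 10 11 01 10 11" and
# "011 011 011 011", so a receiver cannot decode unambiguously.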
James Mc Fabulous Ahhhh...that's the missing link. That's why it didn't add up. Thank you for your help.
Very easy to follow thank you so much ^_^
Excellent! Very helpful thank you!
Great story of Huffman...
Pizza delivered at 7:19. Some interesting info.
Thank you.. Great work!
Just an idea... how about connecting .35 with .4 = .75, then .75 with .25 = 1?
:) In that case the code is: 00 010 011 10 11
Would this be wrong? Thank you, sir!
Cristi Marin You'd want to connect the two currently lowest-value nodes, because you want a shallower tree for your high-probability characters. Deeper into the tree = more nodes = more bits to represent that character. The very common character with 0.35 probability you'd want to represent with 2 bits rather than 3 bits, as it occurs more often.
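A minimal sketch of that merge order, assuming Python's heapq: always pop the two lowest-probability nodes and merge them, which is what keeps the high-probability characters shallow (symbol names a-e are hypothetical; the probabilities are the ones from this thread).

import heapq
from itertools import count

probs = {"a": 0.35, "b": 0.2, "c": 0.2, "d": 0.15, "e": 0.1}

tie = count()  # tie-breaker so nodes with equal probability compare cleanly
heap = [(p, next(tie), {sym: ""}) for sym, p in probs.items()]
heapq.heapify(heap)

while len(heap) > 1:
    p1, _, left = heapq.heappop(heap)   # lowest-probability node
    p2, _, right = heapq.heappop(heap)  # second-lowest node
    merged = {s: "0" + c for s, c in left.items()}   # left subtree gets a 0
    merged.update({s: "1" + c for s, c in right.items()})  # right gets a 1
    heapq.heappush(heap, (p1 + p2, next(tie), merged))

codes = heap[0][2]
print(codes)  # the 0.35 symbol gets a 2-bit code; .15 and .1 get 3 bits

Running this gives an average length of 2.25 bits, matching the figure discussed above.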
What's the software that you are using for recording and sketching?
Thanks a lot... :-) this will definitely help me score good marks...
I have to create some math videos to post on youtube. I have a HD video camera. What software do you use to make these videos?
Thanks sir, that is very helpful :D
What is the formula for H_2(p)?
H = Sum(p(x)*log_2(1/p(x))), for each symbol x
Woah, what? Entropy? Doesn't that have to do with thermodynamics?
Thank you!
The last value of c(x) will be 110.
Search "Entropy (information theory)" on Wikipedia.
thank you sir
Thanks dude
thank you
A chicken that brought his duck soup !
Who was at your door? 7:19
H_2(p) = sum over i of p_i * log_2(1/p_i) (entropy)
How do you calculate H(p)? 😑
A little bit of effort and you could figure it out with ease. Although the OP should have mentioned it.
en.wikipedia.org/wiki/Huffman_coding
thank you