(IC 4.1) Huffman coding - introduction and example

  • Published Nov 1, 2024

COMMENTS • 53

  • @littlebigadventures
    @littlebigadventures 10 years ago +4

    Very similar to Khan Academy, not just because of the colours but the way he talks, good stuff

    • @mdrasel-gh5yf
      @mdrasel-gh5yf 3 years ago +1

      Totally, he and Sal Khan both have a similar teaching style, love it!

  • @anurocksification
    @anurocksification 13 years ago

    example first & then the formal proof -- that always works in algorithm analysis, good job

  • @ganbade200
    @ganbade200 12 years ago

    Simple way to present a basic comp sci algo. This brought back sad memories when my first practical quiz in Uni was traversing this algo... I was in first year, and Dustin Huffman was the only Huffman guy I knew then. Later, I was taught via the entropy-principle approach, a much steeper learning curve for beginners... I got a grade C for this then (that was 10 yrs ago and was devastating for me then). Of course I have long familiarised myself with this algo, as any comp sci student should; still a great video...

  • @bilokenneth
    @bilokenneth 9 years ago

    You are awesome! Most of the videos on youtube are pre-staged and won't get into the "crossover situation"; now I know how to handle it, thanks!

  • @jusefina88
    @jusefina88 12 years ago

    An absolutely brilliant video. A big thanks to you, sir!

  • @gdso90
    @gdso90 13 years ago +1

    amazing technique... can't thank you enough... will pass my exam now... :))

  • @DGNG-dd3to
    @DGNG-dd3to 10 years ago +8

    "Ohhh look at me. I precomputed entropy, and I am not going to tell you how to do it!"

  • @freetorontodatingca
    @freetorontodatingca 11 years ago +1

    Thanks, understood the code this way.

  • @edijemeni
    @edijemeni 12 years ago

    Please can you make a video on Hamming codes for detection and correction of errors?

  • @MrDoicopaci
    @MrDoicopaci 8 years ago

    man your lessons are amazing. thank you for this, but I have to ask: do you know how to do this in matlab?

  • @Meemzz15
    @Meemzz15 11 years ago

    Thanks a lot this really helped me understand how the code works!

  • @dextervirus
    @dextervirus 11 years ago

    Thank you for the video. It really helped me remember the algorithm.

  • @hit8339
    @hit8339 11 years ago +1

    Thank you ... Nice Explanation !

  • @xSkillzeh
    @xSkillzeh 8 years ago

    What's the point of perfectly showing the method of the Huffman tree but not showing how to find the end result?

  • @Fanaro
    @Fanaro 12 years ago

    What should I do if the sum of two frequencies equals another frequency? Which one, the sum or the other frequency, do I prioritize in the next sum?
    Example: frequencies .2 and .8, and the result of a former sum is .2. Do I sum the former sum and .8, or the frequencies .2 and .8?

  • @ranjithkumarkakumanu8869
    @ranjithkumarkakumanu8869 9 years ago

    Hi, I didn't understand how you got H(P) = 2.2016 bits. Can someone explain it to me?
    Regards,
    Ranjith K K.

    • @khansameerahmed8642
      @khansameerahmed8642 9 years ago +2

      Ranjith Kumar Kakumanu the formula for entropy (H) is H = sum from k = 1 to m of P_k * log2(1/P_k).

    • @kk100kk100
      @kk100kk100 8 years ago

      +khan sameer ahmed don't forget to take the absolute value |H| of the result at the end
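The H(P) = 2.2016 bits value asked about above can be checked directly from the entropy formula in the reply. A minimal sketch in Python, assuming the probabilities .35, .2, .2, .15, .1 quoted elsewhere in this thread are the video's example:

```python
import math

def entropy(probs):
    """Shannon entropy H(p) = sum of p * log2(1/p), in bits."""
    return sum(p * math.log2(1 / p) for p in probs)

# Probabilities quoted in the prefix-code discussion further down the thread.
probs = [0.35, 0.2, 0.2, 0.15, 0.1]
print(round(entropy(probs), 4))  # 2.2016
```

With log base 2 the result comes out directly in bits, matching the value on screen.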

  • @FaizanZahidNustian
    @FaizanZahidNustian 12 years ago

    How can we apply this to text-based (alphabetical) compression? Let's say I have a .txt file with some text in it. How can I apply Huffman compression to that, since it also contains alphabetic characters?
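To the question above: the algorithm extends to text by first counting character frequencies, then building the tree from those counts exactly as in the video. A hedched sketch (`huffman_codes` is an illustrative helper, not from the video):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code table for the characters of `text`."""
    freq = Counter(text)
    # Heap entries: (frequency, tiebreak, tree); a leaf is a char,
    # an internal node is a (left, right) pair.
    heap = [(f, i, ch) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # two lowest frequencies
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, tiebreak, (left, right)))
        tiebreak += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix or "0"  # single-symbol text edge case
    walk(heap[0][2], "")
    return codes

codes = huffman_codes("abracadabra")
encoded = "".join(codes[ch] for ch in "abracadabra")
```

The resulting table is prefix-free, so `encoded` can be decoded unambiguously by walking the tree bit by bit.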

  • @benjaminberlin
    @benjaminberlin 12 years ago

    This may be a little tricky to write:
    H(p) = sum over all symbols x of P(x) * log_b(1/P(x))

  • @atrus3823
    @atrus3823 10 years ago +1

    I love this stuff! There's just one part I don't get. You're not taking advantage of all the 2-digit codes. Wouldn't it be more optimal if .15 were 01?

    • @richardblack1588
      @richardblack1588 10 years ago

      01 leads to .25. If 01 were .15, there would be no way to reach .1, since each probability has to be a leaf node

    • @atrus3823
      @atrus3823 10 years ago

      James Mc Fabulous Hi, thank you for your response. I understand the way this process works, but if you just arbitrarily changed the code of the fourth character to 01, you would reduce your average length from 2.25 bits to 2.1 bits (which somehow is lower than the entropy). Wouldn't that code be more optimal?

    • @richardblack1588
      @richardblack1588 10 years ago

      Tristan Slater no problem :)
      so if the fourth character was changed to 01 you would have
      .35 with code 00
      .2 with code 10
      .2 with code 11
      .15 with code 01
      .1 with code 011
      this conflicts with the prefix property and creates ambiguity when decoding.
      For example, say you wanted to decode 011011011011:
      there's no single answer

    • @atrus3823
      @atrus3823 10 years ago

      James Mc Fabulous Ahhhh...that's the missing link. That's why it didn't add up. Thank you for your help.
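The ambiguity described in this exchange is easy to check mechanically: a usable code must be prefix-free, i.e. no codeword may be a prefix of another. A quick sketch; the second five-codeword set is an illustrative prefix-free assignment with the same lengths (2, 2, 2, 3, 3) as the video's, not necessarily the video's exact codewords:

```python
def is_prefix_free(codes):
    """True iff no codeword is a prefix of another codeword."""
    codes = sorted(codes)
    # After sorting, any prefix pair ends up adjacent.
    return all(not b.startswith(a) for a, b in zip(codes, codes[1:]))

# Moving .15 to 01 makes 01 a prefix of 011 -> decoding is ambiguous.
print(is_prefix_free(["00", "10", "11", "01", "011"]))   # False
# Same code lengths, but prefix-free:
print(is_prefix_free(["00", "01", "10", "110", "111"]))  # True
```

This is why the shorter-looking code in the question above doesn't actually beat the entropy bound: it isn't decodable.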

  • @gioco8329
    @gioco8329 6 years ago

    Very easy to follow thank you so much ^_^

  • @Yuqiandygao
    @Yuqiandygao 8 years ago

    Excellent! Very helpful thank you!

  • @VikashChandola
    @VikashChandola 9 years ago

    great story of Huffman...

  • @tcpaa
    @tcpaa 12 years ago

    Pizza delivered at 7:19. Some interesting info.

  • @sameerachandimal7613
    @sameerachandimal7613 9 years ago

    Thank you.. Great work!

  • @cristimarinro
    @cristimarinro 10 years ago

    Just an idea... how about connecting .35 with .4 = .75, then .75 with .25 = 1?
    :) In this case the code is: 00 010 011 10 11
    Would this be wrong? Thank you, sir!

    • @HolyGarbage
      @HolyGarbage 9 years ago

      Cristi Marin You'd want to connect the two currently lowest-value nodes, as you want a shallower tree for your high-probability characters. Deeper into the tree = more nodes = more bits to represent that character. The very common character with 0.35 probability you'd want to represent with 2 bits rather than 3, as it occurs more often.
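The reply above can be made concrete: always merging the two lowest-probability nodes (the Huffman rule) gives an average length of 2.25 bits on these probabilities, while the merge order proposed in the parent comment yields lengths 2, 3, 3, 2, 2 and averages 2.4 bits. A sketch, assuming the probabilities .35, .2, .2, .15, .1 from the thread above:

```python
import heapq

def huffman_lengths(probs):
    """Code lengths from repeatedly merging the two lowest-probability nodes."""
    # Heap entries: (probability, tiebreak, indices of symbols under this node).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tiebreak = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:        # every symbol under the merged node
            lengths[s] += 1      # sinks one level deeper in the tree
        heapq.heappush(heap, (p1 + p2, tiebreak, s1 + s2))
        tiebreak += 1
    return lengths

probs = [0.35, 0.2, 0.2, 0.15, 0.1]
opt = huffman_lengths(probs)                                         # [2, 2, 2, 3, 3]
print(round(sum(p * l for p, l in zip(probs, opt)), 2))              # 2.25
print(round(sum(p * l for p, l in zip(probs, [2, 3, 3, 2, 2])), 2))  # 2.4
```

Promoting the low-probability .15 and .1 symbols at the expense of the two .2 symbols, as the proposed merge does, costs 0.15 bits per symbol on average.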

  • @MMarcuzzo
    @MMarcuzzo 10 years ago

    What's the software that you are using for recording and sketching?

  • @AbhishekSingh-xd3iz
    @AbhishekSingh-xd3iz 10 years ago

    Thanks a lot... :-) this will definitely help me score good marks...

  • @helenjones8779
    @helenjones8779 12 years ago

    I have to create some math videos to post on youtube. I have a HD video camera. What software do you use to make these videos?

  • @nourhamdan1977
    @nourhamdan1977 7 years ago

    Thanks sir, that is very helpful :D

  • @newsun157
    @newsun157 12 years ago

    What is the formula for H_2(p)?

  • @MrGalassi
    @MrGalassi 11 years ago

    H = Sum(p(x)*log_2(1/p(x))), for each symbol x

  • @bmgag19
    @bmgag19 12 years ago

    Wooah, what? Entropy? Doesn't that have to do with thermodynamics?

  • @YoshiBoshiPoochy
      @YoshiBoshiPoochy 9 years ago

    Thank you!

  • @rajnaidu4291
    @rajnaidu4291 10 years ago

    c(x) last value will be 110

  • @MrGalassi
    @MrGalassi 11 years ago

    search Entropy (information theory) on wikipedia

  • @halavathnareshkumarhalavat203
    @halavathnareshkumarhalavat203 9 years ago

    thank you sir

  • @0909PRINCE
    @0909PRINCE 11 years ago

    Thanks dude

  • @maxnoish
    @maxnoish 11 years ago

    thank you

  • @42Siren
    @42Siren 11 years ago

    A chicken that brought his duck soup !

  • @GlennDaytonIV
    @GlennDaytonIV 12 years ago

    Who was at your door? 7:19

  • @cueto303
    @cueto303 11 years ago

    H_2(p) = sum over i of p_i * log_2(1/p_i), the entropy

  • @harshalpatil2312
    @harshalpatil2312 8 years ago

    how to calculate H(p)😑

    • @StonedJessus
      @StonedJessus 7 years ago

      A little bit of effort and you could figure it out with ease. Although the OP should have mentioned it.
      en.wikipedia.org/wiki/Huffman_coding

  • @ramchoudhary7943
    @ramchoudhary7943 9 years ago

    thank you