Lecture 4, Convolution | MIT RES.6.007 Signals and Systems, Spring 2011

  • Published 26 Dec 2024

COMMENTS • 186

  • @otsilekgaladua5485
    @otsilekgaladua5485 8 years ago +70

    this man is a legend. He is a legend. MIT thank you for your generous offer of education. #bravo!!

  • @aderants
    @aderants 11 years ago +59

    The best part of this series is that it was given in 1987, and I am referring to it in 2013. Wondering whether, 20 years from now, my son will visit this site and find my comments ..
    Cheers !!

  • @ConsciousnessIsMyGod
    @ConsciousnessIsMyGod 10 years ago +261

    The famous Alan Oppenheim. Thumbs up if you noticed his mustache is the sinc function.

    • @glabka333
      @glabka333 10 years ago +8

      Hah, and I naively thought it was a shifted cosine :D

    • @boughouabdou605
      @boughouabdou605 10 years ago +2

      glabka333 lol nice jokes guys :)

    • @klam77
      @klam77 7 years ago +3

      he's got a nice 70s groove goin on. Funk! Disco! O wait.....this was uploaded in 2011, but RECORDED in 1970! during Peak Disco Inferno......burn baby burn. wooohoooo...love it!

    • @javijee_
      @javijee_ 7 years ago +3

      Ha ha. That's a good one.

    • @leandrogage9409
      @leandrogage9409 3 years ago

      Instablaster

  • @goharkay
    @goharkay 12 years ago +52

    the cameraman/men for these videos probably got a very good idea of the subject

  • @bandar1606
    @bandar1606 10 years ago +102

    6 dislikes. I think they are some professors because no one attends their lectures and students instead watch this guy.

    • @univuniveral9713
      @univuniveral9713 5 years ago

      You are smart.

    • @manideepp2229
      @manideepp2229 4 years ago

      @Beyond Oblivion your comment is platinum my friend.

    • @unknownuser927
      @unknownuser927 4 years ago +1

      Ironically my professors recommended us to follow this lecture series.

    • @amit-mishra
      @amit-mishra 2 years ago

      @c_a Online classes allowed us to do exactly that. Should I say thanks to corona? lol

  • @lokeshm2583
    @lokeshm2583 3 years ago +6

    Alan sir, you're great. Nobody on the internet has explained convolution in this way; you are an ideal person. You are awesome. Now I'm interested in doing convolution. Thank you, sir. Your textbook is very nice. 😀

  • @samuelleung9930
    @samuelleung9930 4 years ago +3

    We should learn from the best people in that field if we can... That's what MIT OCW keeps reminding me. High respect to Prof Oppenheim.

  • @rgseven6557
    @rgseven6557 7 years ago +5

    I consider myself fortunate to have access to such useful lecture videos. I have been struggling to comprehend this topic but now I have a better grasp. Sincere thanks for uploading this video. Regards from Singapore.

  • @boughouabdou605
    @boughouabdou605 10 years ago +7

    Thanks to you, Sir Oppenheim. State-of-the-art courses, well done.

  • @josechemist
    @josechemist 11 years ago

    I will not be surprised. This is a video that will never die. Comprehensive and straight to the point.

  • @georgeyu7987
    @georgeyu7987 6 years ago +3

    He goes through everything so fast... this is indeed MIT speed

  • @OmarChida
    @OmarChida 4 years ago

    Prof. Alan V. Oppenheim

  • @VarunKumar-ir6wd
    @VarunKumar-ir6wd 3 years ago +4

    I'd like to leave you with the fun and opportunity of doing that at your leisure.

  • @imjisooimok1826
    @imjisooimok1826 7 years ago +15

    That opening music😂
    Makes me happy😂
    Thanks for the video though

  • @Lycheeee11
    @Lycheeee11 8 years ago +3

    Finally I understand convolution!! THANK YOU MIT!!!

  • @probono2876
    @probono2876 7 years ago +5

    Prof Oppenheim, many thanks for your great teaching.

  • @LennyGrayGuiltless
    @LennyGrayGuiltless 11 years ago +4

    Thanks for posting; I'm an EE major at SFSU and I found this very helpful

  • @EngBandar1
    @EngBandar1 10 years ago +3

    I should call this guy the father of signals and systems. His book is the best as far as I know, and these videos made the book more popular. I feel sad for other authors in the same field; they need to double their work to catch up with this guy. Also, thanks to the cameraman, he deserves credit. Well done MIT.

    • @drtgyjk
      @drtgyjk 10 years ago

      .

    • @dopier12
      @dopier12 10 years ago

      Bandar I'm just finding him and having a glimmer of hope of passing my signals and systems class, as my professor is on the terrible side. The guy is a walking book. Where are the professors that created the last great generation when you need them?

    • @Bluddyshadowhell
      @Bluddyshadowhell 10 years ago +4

      The book is terrible and should be used as a reference book, not as an actual textbook meant to teach a person new to the coursework.

  • @storgerbenevolent5678
    @storgerbenevolent5678 4 years ago +1

    This is taking me a huge amount of time to wrap my head around the concept; even though it is explained in the nicest fashion there is, it is still taking time.

  • @nikshepbangera5416
    @nikshepbangera5416 5 years ago

    Can't believe this was made even before I was born... some people are just ahead of their time

  • @dabulls1g
    @dabulls1g 6 years ago +2

    my-my, if 36:19 isn't the charging discharging of a capacitor then idk what is.

  • @RomyvanEs
    @RomyvanEs 10 years ago +6

    the man's a hero.

  • @glaurung78
    @glaurung78 1 year ago

    Does someone know, technically, how the convolution integral is being calculated from 34:00 until after 36:00? Is there some sort of analog computer being used?

  • @frankliou3609
    @frankliou3609 4 years ago +1

    what a charming smile before the ending!

  • @roros2512
    @roros2512 7 years ago +3

    Around minute 44:50 the solution to the sum of alpha^(-k) appears; I think there is a mistake, could someone put a comment with the right answer please?
    The book shows a similar example with that final solution, but there the sum is actually over alpha^(k), without the minus sign.
    Thank you very much, these lessons are extremely useful

    • @ze2411
      @ze2411 4 years ago

      I have the same question!

    • @storgerbenevolent5678
      @storgerbenevolent5678 4 years ago

      yes i feel same!

    • @beytulk
      @beytulk 1 year ago

      I feel the same too; I think multiplying by alpha(k) was forgotten.
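
A quick numerical check of the result questioned in this thread; this is a minimal sketch (NumPy assumed, and alpha chosen arbitrarily) with x[n] = u[n] and h[n] = alpha^n u[n], the signals used in the lecture's example. The convolution sum matches the closed form (1 - alpha^(n+1)) / (1 - alpha), so the alpha^n factor sitting in front of the sum of alpha^(-k) is indeed accounted for.

```python
import numpy as np

alpha = 0.5                      # assumed 0 < alpha < 1, as in the lecture's sketch
N = 20                           # number of output samples to check
n = np.arange(N)

x = np.ones(N)                   # x[n] = u[n] for n >= 0
h = alpha ** n                   # h[n] = alpha^n u[n] for n >= 0

# y[n] = sum_k x[k] h[n-k]; the first N samples of the full convolution are n = 0..N-1
y = np.convolve(x, h)[:N]

# Closed form quoted in the thread at 44:56: y[n] = (1 - alpha^(n+1)) / (1 - alpha)
y_closed = (1 - alpha ** (n + 1)) / (1 - alpha)

print(np.allclose(y, y_closed))  # True
```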

  • @karankhatwani8987
    @karankhatwani8987 7 years ago +1

    The greatest, Alan Oppenheim.

  • @saswatisil8886
    @saswatisil8886 1 year ago

    @28:58 should not the output become zero, as soon as the h[k] crosses the extreme right point of the rectangle?

  • @emrahtokalac1721
    @emrahtokalac1721 4 years ago

    Thank you so much Prof.Alan V. Oppenheim

  • @cpeter9569
    @cpeter9569 8 years ago +1

    best explanation of the convolution integral I have found!!

  • @rishavkumar9288
    @rishavkumar9288 4 years ago +1

    He is the father of signals and systems 😀 cheers 👍

  • @Gman2486
    @Gman2486 8 years ago

    This guy speaks good English. I can actually learn from this.

  • @RohitPandey127
    @RohitPandey127 7 years ago

    Can someone explain to me how the expressions for x(t) and y(t) are the same even though they are input and output? At 18:18.

  • @LucasAmorimPlus
    @LucasAmorimPlus 12 years ago +2

    I can't help but imagining Magnum P.I. giving a lecture on Convolution when I see that stache.

  • @chinnu349
    @chinnu349 12 years ago

    It represents everything about the rectangle, i.e., it gives you the magnitude, area, and position of the rectangle. In other words, that equation and the rectangle diagram are interchangeable.

  • @masonhung7061
    @masonhung7061 5 months ago

    I love the way he said "strategy", like we are solving a problem together, instead of "this is just how the equation works, eat this shit".

  • @khaben6986
    @khaben6986 5 years ago +50

    Who is watching this video in 2019 to understand convolution? 😂

  • @vivekrai1974
    @vivekrai1974 10 months ago

    7:16 Shouldn't x[n] be Sigma(n=minus infinity to plus infinity) sigma( k = minus infinity to plus infinity) x[k] delta [n-k]?
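
For the 7:16 question above: only the single sum over k is needed, because n is a free index (the identity holds separately for each n), not a second summation variable. A minimal sketch (NumPy assumed, with a hypothetical test signal) rebuilding a signal from weighted, delayed impulses:

```python
import numpy as np

x = np.array([3.0, -1.0, 4.0, 1.0, 5.0])   # a short test signal x[k], k = 0..4
N = len(x)

def delta(m):
    """Unit impulse: 1 when m == 0, otherwise 0."""
    return 1.0 if m == 0 else 0.0

# x[n] = sum_k x[k] * delta[n - k], evaluated separately for each fixed n
x_rebuilt = np.array([sum(x[k] * delta(n - k) for k in range(N)) for n in range(N)])

print(np.array_equal(x, x_rebuilt))        # True
```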

  • @area51xi
    @area51xi 3 years ago +1

    I'm not an engineer, just a surgeon, so bear with me. I was following this until 23:48. I would have thought that h[n-k] is h[n] shifted to the right by k. I just see that h[n] graph and imagine that it's just shifted over to the right by k. Why say that h[n] is h[k] and that h[-k] is h[k] flipped over, when you could equivalently say it's just the h[n] shown with a time shift of k?

    • @jacobvandijk6525
      @jacobvandijk6525 3 years ago +2

      I like the idea ;-) Looks a lot simpler. But k isn't the shifting factor here. The factor k is just an integer on an infinite time-line where you can place the values of h(-k). It is the value of n that determines the shift of h(-k), via h(n-k), over this time-line. As you perhaps know, convolution is all about the overlap of 2 functions: keep one in place and shift the other one over it. Of course, it's up to you which of the two functions is being shifted.

    • @mridulk81
      @mridulk81 2 years ago

      @@jacobvandijk6525 so does this mean that the visualization given at 11:05 and the other visualization at 23:48 are just two different perspectives of looking at the convolution sum based on which function we choose to time shift?

    • @jacobvandijk6525
      @jacobvandijk6525 2 years ago

      @@mridulk81 I like this example very much: 27:56. Instead of reflecting h in the y-axis (what's done here), you could reflect the step-function in the y-axis and make it shift to the right. Same result.
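
For the flip-and-shift discussion in this thread, here is a minimal sketch (NumPy assumed) that evaluates the convolution sum exactly as drawn at 23:48, and confirms that flipping and shifting either signal gives the same output, since convolution is commutative:

```python
import numpy as np

def conv_flip_shift(x, h):
    """y[n] = sum_k x[k] * h[n - k]: flip h, shift it by n, multiply and sum."""
    N = len(x) + len(h) - 1
    y = np.zeros(N)
    for n in range(N):               # n fixes the shift of the flipped signal
        for k in range(len(x)):      # k is the running summation index
            m = n - k                # index into h after flipping and shifting
            if 0 <= m < len(h):
                y[n] += x[k] * h[m]
    return y

x = np.array([1.0, 2.0, 3.0])
h = np.array([1.0, 1.0, 1.0, 1.0])

print(np.allclose(conv_flip_shift(x, h), conv_flip_shift(h, x)))  # True (commutative)
print(np.allclose(conv_flip_shift(x, h), np.convolve(x, h)))      # True
```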

  • @stevenan93
    @stevenan93 9 years ago +65

    this guy is such a gangster

  • @el_witcher
    @el_witcher 4 years ago +3

    The guy is gold, but learning S&S from his book is extremely difficult. I had a look at S&S by MJ Roberts and quite liked it. I wonder if I'd be too far behind if I learned from this book instead.
    Does any of you guys use the book by MJ Roberts?
    Thanks.

    • @guruG509
      @guruG509 3 years ago

      Hey, I am from India, and this semester the official book followed here is his official book, but it is too cluttered for my understanding. Fortunately, in the huge library, I found that book and immediately borrowed it; it is pictorially easier to understand. And after that I read your comment.

  • @thesecrethero9901
    @thesecrethero9901 9 years ago

    Could anyone please explain why h[n] at 23:00 is decaying? I think decay is only possible if 0 < α < 1, but there is no such interval in the figure.

    • @dawitmureja2228
      @dawitmureja2228 8 years ago

      +Akis Stavridis
      The time interval is for "n", not for "α ". He just assumed α to be between 0 and 1 for this particular example.

    • @thesecrethero9901
      @thesecrethero9901 8 years ago

      +Dawit Mureja Thank you for the answer. Yes, he probably assumed α to be between 0 and 1, but since he did not mention or write this assumption, I was confused.

  • @oskarmeister
    @oskarmeister 10 months ago +1

    The background buzzing noise sure needs some signal processing

  • @alimousvi9846
    @alimousvi9846 9 years ago +3

    Thank u MIT, u help us learn.

  • @TheReligiousCrap
    @TheReligiousCrap 7 years ago

    23:45 Great explanation! My teacher didn't explain this integral thoroughly.

  • @DF-ss5ep
    @DF-ss5ep 2 years ago

    Very nice visualization of the convolution integral

  • @superparko1
    @superparko1 5 years ago

    Just a recent comment passing by. This is gold.

  • @fardeszx
    @fardeszx 12 years ago

    It's awesome... This lecture provides an easier-to-understand elaboration than his textbook.

  • @storgerbenevolent5678
    @storgerbenevolent5678 4 years ago

    At 44:56 I think there is an error; it should be (a^(n+1) - 1)/(a - 1)

    • @giacomodemarie2497
      @giacomodemarie2497 4 years ago

      The two expressions are the same. You may multiply the numerator and denominator by -1 and get the expression in the lecture
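
In symbols, the equivalence the reply describes is just a matter of multiplying the numerator and denominator by -1:

$$\frac{a^{\,n+1}-1}{a-1}\;=\;\frac{-\left(1-a^{\,n+1}\right)}{-\left(1-a\right)}\;=\;\frac{1-a^{\,n+1}}{1-a}.$$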

  • @abdoaboganima
    @abdoaboganima 4 years ago

    This man is really awesome :D

  • @aggressivetourist1818
    @aggressivetourist1818 3 years ago +1

    OMG this lecture is amazing

  • @TheGi0gio
    @TheGi0gio 8 years ago +2

    this guy is da bomb! thanks for the tutorial kind sir! :)

  • @Dr2quan
    @Dr2quan 4 years ago +2

    20:03 is the wow moment

  • @TANVEER991164
    @TANVEER991164 12 years ago

    Very well explained, especially
    the dynamic explanation of convolution

  • @mohamedessam1397
    @mohamedessam1397 2 years ago

    Old but Gold

  • @jerusheng
    @jerusheng 4 years ago

    Is the demonstration done on an analog oscilloscope? A genius idea for visualization, given what they had at that time.

  • @makishimashogo1804
    @makishimashogo1804 8 years ago +1

    I did not understand what h_k is in the video. Can anybody please tell me?

    • @vg5028
      @vg5028 7 years ago

      h_k is the impulse response corresponding to the delta[n - k] impulse input, where k goes from -inf to +inf

  • @moatacemaskar7313
    @moatacemaskar7313 4 years ago

    Thanks for the great explanation. I just wonder what an impulse response is, how we could generate such an impulse, and what the amplitude and width of this pulse are.

  • @damny0utoobe
    @damny0utoobe 4 years ago

    Love this dr oppenheim lecture

  • @pyrocolada
    @pyrocolada 4 years ago

    Why would you ever want to sum any of these functions? Does the sum notation actually represent the whole signal as one formula, rather than just the sum of each sample?

  • @shashibhushansharma1383
    @shashibhushansharma1383 7 years ago

    In the case of a time-invariant system, can we write h(n-k) = h(n)?

  • @socratesuffer2765
    @socratesuffer2765 2 years ago

    Where does the alpha come from after getting rid of the unit steps?

  • @AnasAhmad7
    @AnasAhmad7 9 years ago

    This is really beautiful .

  •  9 years ago +16

    convolution :D . finally :D

    • @vg5028
      @vg5028 7 years ago +2

      i feel exactly the same. I've been trying to understand it for a while now, but I think I finally get it :)

    • @nagarajuchukkala9538
      @nagarajuchukkala9538 6 years ago

      Exact same feeling

  • @abdelrahmanyasser5720
    @abdelrahmanyasser5720 5 years ago

    Well... I got bored in the middle of the video, so I went to another video,
    and I didn't get it; then I came back here to continue and I understood everything.
    Thanks very, very much

  • @elmotivoso85
    @elmotivoso85 11 years ago +6

    Ah ok, the book I use was written by him

  • @SmartSula
    @SmartSula 12 years ago

    Where's the lecture that he discussed the properties of systems? I can't find it.

  • @haoyuan92
    @haoyuan92 1 year ago

    If only my professor could explain C-T convolution as clearly as he does. Credit to Prof Oppenheim
    P.S. he has a very calming voice

  • @LuckFx
    @LuckFx 11 years ago +12

    I think everybody studying Signals and Systems uses this book lol, we're using it here at Universidad Politecnica de Madrid too

    • @omega7377
      @omega7377 7 years ago +4

      We are using it too, at Istanbul Technical University.

    • @maheryagub
      @maheryagub 7 years ago +1

      University of Patras Greece too

    • @manoelnt0
      @manoelnt0 7 years ago +1

      The omnipresent Oppenheim, even in Brazil (Federal University of Ceará)

    • @imjisooimok1826
      @imjisooimok1826 7 years ago +2

      India too😂😂

    • @ПетрИванович-л6ж
      @ПетрИванович-л6ж 7 years ago +2

      Nazarbayev University, Kazakhstan too

  • @azstudioproductions
    @azstudioproductions 12 years ago +1

    Perfect lesson , thank you so much !

  • @tintindear000
    @tintindear000 12 years ago

    At 14:09, I'm a little confused by the words "represent the rectangle" -- does that mean the area of the rectangle or the magnitude of the rectangle? It seems to me all the work in this part is to introduce the delta into the expression.

    • @superparko1
      @superparko1 5 years ago

      It represents the magnitude of the rectangle, because the impulse function equals (1/delta) at one particular time. If you multiply x(t)(1/delta)(delta), where t represents a value at which the impulse function is equal to (1/delta), you will obtain x(t), which is the magnitude of the rectangle.

    • @superparko1
      @superparko1 5 years ago

      Sorry for being the only person after 6 years who has the courage to answer this question. I'm studying this for the first time and enjoyed reading the comment section.
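
For the 14:09 question in this thread: as the first reply says, the expression stands for the whole rectangle (height, width, and position), not just its area. Writing out the staircase approximation the lecture uses (in the notation of the Oppenheim–Willsky text), with δ_Δ(t) = 1/Δ for 0 ≤ t < Δ and 0 elsewhere, each term x(kΔ) δ_Δ(t − kΔ) Δ is a pulse of height x(kΔ) and width Δ, and

$$\hat{x}(t)=\sum_{k=-\infty}^{\infty} x(k\Delta)\,\delta_{\Delta}(t-k\Delta)\,\Delta \;\longrightarrow\; x(t)=\int_{-\infty}^{\infty} x(\tau)\,\delta(t-\tau)\,d\tau \quad\text{as }\Delta\to 0.$$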

  • @FahimKhan-vd8yp
    @FahimKhan-vd8yp 5 years ago

    ladies and gentlemen, convolution is no longer convoluted!

  • @SameerSk
    @SameerSk 4 years ago

    Thank you MIT.

  • @whodaFru4551
    @whodaFru4551 7 years ago

    Shouldn't Interval 2 be t >= 0?

  • @adarshabbigeriadarshabbige8004
    @adarshabbigeriadarshabbige8004 7 years ago

    Thank you sir, this video helps me a lot. Love you MIT

  • @klam77
    @klam77 7 years ago

    THE guru.......respect!

  • @ManojKumar-el9bq
    @ManojKumar-el9bq 8 years ago

    Why is n increasing when we do the summation over k in h(n-k)?

    • @aSeaofTroubles
      @aSeaofTroubles 8 years ago

      n is not increasing. It is held constant based on our input.
      Remember, we are now treating h(.) as a function of something; in this case, it is actually a function of k.
      Let's take n = 0 (interpreted as time 0):
      we have h(n-k) = h(-k), which is clearly a function of k since the n disappeared.
      Now we sum across all k indices to yield what the system would output:
      y(0) = sum of x(k)h(-k) over all k
      Notice that the sum would be very boring if the response wasn't a function of k
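
A tiny sketch of the point made in this reply (Python with NumPy assumed): for each fixed n, y[n] is a single number obtained by summing over the running index k; n only changes from one output sample to the next.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])        # x[k], k = 0, 1, 2
h = np.array([0.5, 0.25, 0.125])     # h[k], k = 0, 1, 2

def y_at(n):
    """y[n] = sum_k x[k] * h[n - k] for one fixed n."""
    return sum(x[k] * h[n - k] for k in range(len(x)) if 0 <= n - k < len(h))

print([y_at(n) for n in range(len(x) + len(h) - 1)])
print(np.convolve(x, h))             # same values
```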

  • @chaitanya.shankar
    @chaitanya.shankar 4 years ago

    Where exactly is convolution used?

    • @amber1862
      @amber1862 4 years ago +1

      Used a lot in real-time audio applications such as convolution reverb, guitar amp modelling and physical modelling of acoustic instruments.
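
To make the reply above concrete, here is a minimal convolution-reverb sketch (NumPy and SciPy assumed; `dry` and `ir` are placeholder arrays standing in for a recorded signal and a measured room impulse response at the same sample rate):

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(0)
dry = rng.standard_normal(48000)          # placeholder: 1 s of "dry" audio at 48 kHz
decay = np.exp(-np.linspace(0.0, 8.0, 24000))
ir = decay * rng.standard_normal(24000)   # placeholder: toy exponentially decaying IR

wet = fftconvolve(dry, ir)                # the reverb is just y = x * h
wet /= np.max(np.abs(wet))                # normalize so the result doesn't clip
```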

  • @makishimashogo1804
    @makishimashogo1804 8 years ago

    What is the meaning of a "weighted, delayed impulse"?

    • @MoutasemMohammad
      @MoutasemMohammad 8 years ago +1

      Well, think of it this way: the weight is the coefficient, and the delayed impulse is the delta function shifted to the right/left ("delayed"). So you can think of a signal as individual components of x at `k` (the "weights"), each multiplied by a unit impulse delta(n-k) ("delayed").

  • @appleraja
    @appleraja 12 years ago

    How come I can't open this video in full screen?

  • @SatyamMishraBEE
    @SatyamMishraBEE 3 years ago

    Great lectures

  • @jacobvandijk6525
    @jacobvandijk6525 3 years ago +1

    @ 33:57 In my opinion, the lowest graph should not begin at 0 (but at 1 (= e^0)). The same thing here: 35:33. Just compare it with the correct graph of the discrete case: 26:58. Even Mr. Oppenheim isn't flawless ;-) Nice animation though.

    • @AlexAlex-fo9gt
      @AlexAlex-fo9gt 2 years ago

      ua-cam.com/video/SNdNf3mprrU/v-deo.html
      This is an example of the calculation, and in this example the values start from 0.
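
For what it's worth, assuming the animation at 33:57 shows x(t) = u(t) driving h(t) = e^(-t) u(t) (the "capacitor charging" example noted elsewhere in the comments), the convolution integral does start at 0:

$$y(t)=\int_{-\infty}^{\infty}x(\tau)\,h(t-\tau)\,d\tau=\int_{0}^{t}e^{-(t-\tau)}\,d\tau=1-e^{-t},\qquad t\ge 0,$$

so y(0) = 0. Unlike the discrete-time sum at 26:58, which includes the k = n term and therefore starts at 1, the integral over a single point contributes nothing.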

  • @AllHailAkemi
    @AllHailAkemi 12 years ago +2

    Mr Oppenheim, I have your book!

  • @venkatasaketramgoteti8726
    @venkatasaketramgoteti8726 3 years ago

    great lecture

  • @Giesel47
    @Giesel47 5 years ago

    what a legend...

  • @TheAllboutwin
    @TheAllboutwin 12 years ago +2

    What a stache!

  • @andrewdavis6191
    @andrewdavis6191 8 years ago +1

    Amazing!

  • @fazlanpera
    @fazlanpera 12 years ago

    Very well done! Viva la MIT

  • @strayon7333
    @strayon7333 7 years ago +1

    captain price liked this.

  • @TheRonaldinho80R10
    @TheRonaldinho80R10 12 years ago

    Great video

  • @RAGHAVENDRASINGH17
    @RAGHAVENDRASINGH17 4 years ago

    I think Howard from The Big Bang Theory will look like this professor when he starts teaching (P.S. it's meant in a good way). Nice lectures 👍

  • @Captain_Rhodes
    @Captain_Rhodes 9 years ago +59

    Signals is interesting, but in my experience it is the subject with the worst literature out there. I've never read a book that actually explained things in a way that didn't assume mountains of prior knowledge. This guy's book is terrible to learn from, but his lectures are good. If anyone has ever found a text that isn't a total pile of shit, let me know!

    • @adityatyagi4009
      @adityatyagi4009 9 years ago +8

      +Captain Rhodes I'm glad I'm not the only one who believes this about the signals literature! If you're into DSP, check out the Lyons book! All the best to you.

    • @Captain_Rhodes
      @Captain_Rhodes 9 years ago

      ***** I will, thanks. There is a website called complex to real dot com that has some great PDFs. Unfortunately the examples are full of mistakes, but the text is pretty great. Check that one out

    • @atifmmahmud
      @atifmmahmud 8 years ago +1

      Try the Lee Varaiya book. It is available for free online, and the practice problems, if not the chapters, are very helpful

    • @ThatOneHandsomeGamer
      @ThatOneHandsomeGamer 4 years ago +2

      I agree. I think Oppenheim's book is very hard to understand, but his lectures are amazing!

    • @Captain_Rhodes
      @Captain_Rhodes 4 years ago +1

      @@ThatOneHandsomeGamer Yeah, he's a talented teacher, but sadly his book is probably written to impress his friends

  • @owaismansoori1498
    @owaismansoori1498 8 years ago +1

    he's great

  • @faroukelkiouas7828
    @faroukelkiouas7828 4 years ago

    Who is watching this video in 2020 to understand convolution? 😂

  • @osamaasif9601
    @osamaasif9601 5 years ago

    Thumbs up to Alan V. Oppenheim

  • @jj1221ify
    @jj1221ify 10 years ago +7

    fan-frickin-tastic

  •  12 years ago

    Very good!

  • @ranjeetmishra8600
    @ranjeetmishra8600 11 years ago

    Nice video........great work!!!!!

  • @zf164
    @zf164 5 years ago

    O.G. Alan Oppenheim

  • @computerdynamo
    @computerdynamo 5 years ago

    Something about his delivery reminds me of Christopher Walken.

  • @VividlyVicious
    @VividlyVicious 12 years ago

    it's lecture 3