gradientDescent.m Gradient Descent Implementation - Machine Learning

  • Published 7 Jan 2025

COMMENTS • 75

  • @ozzyfromspace
    @ozzyfromspace 5 years ago +17

    The fact that you wrote this "by hand" using a mouse = respect. Good video Dino!

  • @emcpadden
    @emcpadden 5 years ago +18

    Again, thank you! This is extremely helpful! I was about to give up before I stumbled over your videos. Great stuff!

  • @kushpeter8433
    @kushpeter8433 3 years ago

    Three years after its release and this is the ONLY video which helped me with the vectorization part. THANK YOU MAN!!!

  • @advaitathreya5558
    @advaitathreya5558 4 years ago +3

    This is exactly what I was looking for. Thanks for showing the vector math, and the subsequent code. Brilliant stuff.

  • @ozzyfromspace
    @ozzyfromspace 5 years ago +6

    Every time you do a transpose of a matrix, it takes time. A faster implementation of ((h-y)' * X)' is X' * (h-y), which is one less transposition. For a real-world situation, this might help. In general, A' * B' = (B * A)' ☺️. Well presented video!
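
The transpose identity in the comment above, A' * B' = (B * A)', and the cheaper gradient expression X' * (h - y) can be checked numerically. A minimal NumPy sketch (the 97x2 shape mirrors the exercise's design matrix; the random values are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((97, 2))   # design matrix, as in the exercise
h = rng.standard_normal((97, 1))   # hypothesis values
y = rng.standard_normal((97, 1))   # targets

# ((h - y)' * X)'  -- two transposes
a = ((h - y).T @ X).T
# X' * (h - y)     -- one transpose, same result
b = X.T @ (h - y)

assert np.allclose(a, b)
```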

  • @cristianavalos5879
    @cristianavalos5879 5 years ago +4

    Dude, this is amazing! I have been breaking my head understanding how to vectorize this part! Thanks a lot!

  • @WOKfucboi
    @WOKfucboi 3 years ago

    Thank god, finally found someone who can explain how the function works in plain text, much appreciated man

  • @lividpudding8565
    @lividpudding8565 4 years ago

    Thanks a lot for making this explainer video! I was having a hard time trying to tear the vectorized formula apart. This helped a ton!

  • @easy_learner
    @easy_learner 5 years ago +3

    Brilliant. Thanks for making it very easy to understand.

  • @MattCitrano
    @MattCitrano 4 years ago

    Thank you so much for these videos. I'm trying to transition from music to working with data and machine learning, so it's been rough. Thanks for helping me learn more about the content I'm exploring!!

  • @zhangjie5781
    @zhangjie5781 4 years ago

    Thank you so much, I was very confused about each term, and you explained it really well.

  • @pranavgandhi9224
    @pranavgandhi9224 4 years ago

    Great man! Really needed this! Thanks... Best explanation of this on YouTube

  • @Terran5992
    @Terran5992 4 years ago

    Your video saved me from dropping out of the course, thank you so much :)

    • @AladdinPersson
      @AladdinPersson  4 years ago +2

      Glad I could be helpful, keep at it bro, it'll be more rewarding in the end ;)

  • @samarthrao3793
    @samarthrao3793 4 years ago

    Thank you, I was struggling so hard to figure this out. Great videos

  • @NicoleQuimper
    @NicoleQuimper 4 years ago

    you are AMAZING! thank youuu i had no idea how to even begin tackling this exercise and now I even have some ideas for the next ones.. you're the best (=

  • @kenbinner
    @kenbinner 3 years ago

    Thanks a lot man, I spent so long on this thinking what have I done wrong... until realising that h needs to be included in the for loop.
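
The point above, that h must be recomputed inside the loop with the current theta, can be sketched as follows. This is a NumPy rendering of the vectorized update discussed in the video, not the original MATLAB; the toy data fitting y = 2x is made up:

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    """Batch gradient descent; h must be recomputed every iteration."""
    m = len(y)
    for _ in range(num_iters):
        h = X @ theta                               # hypothesis with the CURRENT theta
        theta = theta - (alpha / m) * (X.T @ (h - y))
    return theta

# tiny made-up check: fit y = 2*x, intercept 0
X = np.c_[np.ones(5), np.arange(5.0)]               # column of ones + one feature
y = 2.0 * np.arange(5.0)
theta = gradient_descent(X, y, np.zeros(2), alpha=0.1, num_iters=2000)
```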

  • @mrunaldivecha4090
    @mrunaldivecha4090 4 years ago +1

    You got my basics cleared. Cheers!

  • @ribhusengupta9967
    @ribhusengupta9967 4 years ago

    The one video that explained a lot to me, thank you... I got how to convert equations into vectorized code

  • @pragyan394
    @pragyan394 5 years ago +2

    Thank you man, I was forgetting to take the transpose of (h - y).

  • @almostworthy2973
    @almostworthy2973 4 years ago +1

    Hi, I did not understand: in the theta part (6:53), how is X 97x2? In the previous part it was 97x1

  • @anshulanilgaur1118
    @anshulanilgaur1118 4 years ago +2

    Hi, what is the use of J_history in gradientDescent.m?
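
To the J_history question above: it typically stores the cost after each iteration so you can plot it and confirm gradient descent is converging. A NumPy sketch of that bookkeeping (the squared-error cost is the standard one from the course; the toy data is made up):

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    """Returns theta plus the cost at every iteration (J_history)."""
    m = len(y)
    J_history = np.zeros(num_iters)
    for i in range(num_iters):
        h = X @ theta
        theta = theta - (alpha / m) * (X.T @ (h - y))
        # record the cost so convergence can be inspected or plotted
        J_history[i] = (1.0 / (2 * m)) * np.sum((X @ theta - y) ** 2)
    return theta, J_history

X = np.c_[np.ones(5), np.arange(5.0)]
y = 2.0 * np.arange(5.0)
theta, J_history = gradient_descent(X, y, np.zeros(2), 0.1, 500)
# with a well-chosen alpha, J_history should never increase
```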

  • @combardus9309
    @combardus9309 3 years ago

    thank you so much this has helped immensely !!

  • @pranjalsingh1287
    @pranjalsingh1287 5 years ago +1

    Thanks a lot. You resolved it for me 100%.

  • @shadowsong911985
    @shadowsong911985 4 years ago

    Thank you so much. This is extremely helpful. I wish the best for you.

  • @gazisalahuddin9030
    @gazisalahuddin9030 2 years ago

    Thanks, it's a lot of help to me.

  • @hasiburbijoy007
    @hasiburbijoy007 4 years ago

    Thanks man, you are a genius! I was just hopeless!

  • @SajidSalim_RetardedJoker
    @SajidSalim_RetardedJoker 4 years ago

    Thank you. You explained it really well.

    • @AladdinPersson
      @AladdinPersson  4 years ago +1

      Thank you, I appreciate the kind comment

  • @sabyabera
    @sabyabera 5 years ago

    Thanks for the awesome explanation.

  • @abdulwaris_kenue
    @abdulwaris_kenue 4 years ago

    Here you have the life saver! :D
    Thanks ...

  • @sergioqs3687
    @sergioqs3687 4 years ago

    Oh thanks, it was very useful and clear!

  • @deepak2071
    @deepak2071 6 years ago +2

    Hey... Thanks for the videos... your videos are helpful... please upload more soon...

  • @Sarthak631
    @Sarthak631 3 years ago

    Brilliant. This video helped me a lot. Thank you :D

  • @malepatirahul7339
    @malepatirahul7339 4 years ago

    Thank you. It helped me in defining a function in a Python implementation

  • @hieungotrung5411
    @hieungotrung5411 5 years ago

    Thank you a lot! You saved my day

  • @mwaleed2082
    @mwaleed2082 4 years ago

    Thanks so much!!! Earned a subscription + like

  • @pranayteja8330
    @pranayteja8330 4 years ago +1

    Great job , Mate :)

  • @shafifarooq
    @shafifarooq 4 years ago

    After following the instructions, I get this error:
    "error: index (-566.396): subscripts must be either integers 1 to (2^63)-1 or logicals". Help!
    And also h=0

  • @iMohammedSu
    @iMohammedSu 4 years ago

    Thanks, I almost gave up on this!!

  • @omolereabove-all2072
    @omolereabove-all2072 2 years ago

    Very helpful thank you, I was lost in matlab.

  • @AdnanAmmanUllah
    @AdnanAmmanUllah 3 years ago

    Good work, helped a lot. I was taking a sum while doing the matrix multiplication and that was not giving the desired results...

  • @nikhilshukla2832
    @nikhilshukla2832 4 years ago

    What is the value of numiters in the problem?

  • @jessephiri7834
    @jessephiri7834 3 years ago

    Why, when I was running this code, was I having problems with my m = length...........

  • @MythicWarrior120698
    @MythicWarrior120698 5 years ago

    God bless you! Thanks a lot

  • @notagamer32
    @notagamer32 4 years ago

    Thank you so much my man

  • @yogeshwarshendye4857
    @yogeshwarshendye4857 2 years ago

    I am really stuck on the programming assignments..
    It's making me feel really down!

  • @Islam101_Uganda
    @Islam101_Uganda 3 years ago

    bro, what's the title of the book?

  • @BeardedBong
    @BeardedBong 4 years ago

    Can you please do this for multivariate gradient descent too?? Please? As far as I can tell, for multivariate the MATLAB code should be the same, right??

    • @anelm.5127
      @anelm.5127 4 years ago +1

      It is exactly the same
      The only difference is that the size of your theta vector will increase
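
The reply above can be illustrated: the vectorized update is shape-agnostic, so the exact same line of code handles one feature or many, and only the sizes of X and theta change. A NumPy sketch with made-up data:

```python
import numpy as np

def gd_step(X, y, theta, alpha):
    """One vectorized gradient-descent step; works for any number of features."""
    m = len(y)
    return theta - (alpha / m) * (X.T @ (X @ theta - y))

# univariate: X is m x 2 (column of ones + 1 feature), theta has 2 entries
X1 = np.c_[np.ones(4), np.arange(4.0)]
t1 = gd_step(X1, np.ones(4), np.zeros(2), 0.01)

# multivariate: X is m x 4 (column of ones + 3 features), theta has 4 entries
X3 = np.c_[np.ones(4), np.random.default_rng(1).standard_normal((4, 3))]
t3 = gd_step(X3, np.ones(4), np.zeros(4), 0.01)
# identical code path; only the shapes changed
```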

  • @nandanvarma4888
    @nandanvarma4888 4 years ago

    Thank you so much, your videos are so helpful!!

  • @ramonnepomuceno5299
    @ramonnepomuceno5299 3 years ago

    Very good!

  • @lorenzoo.4319
    @lorenzoo.4319 4 years ago

    Why is X 97*1??

  • @joaquincibeira9231
    @joaquincibeira9231 2 years ago

    you are a GOD

  • @dragon_warrior_
    @dragon_warrior_ 4 years ago

    Thanks brother

  • @prachibindal3065
    @prachibindal3065 4 years ago

    I have a question: I want to know whether doing these exercises is worth it or not. How are these exercises useful? I am just a beginner in machine learning; it will be very helpful if you tell me :)

    • @AladdinPersson
      @AladdinPersson  4 years ago +1

      If you want to learn about machine learning then the course is a great introductory course. For me doing the assignments was worthwhile and I learned a lot from them.

    • @prachibindal3065
      @prachibindal3065 4 years ago

      @@AladdinPersson ok, thanks 😄

  • @MrJaszi
    @MrJaszi 4 years ago

    Why the X is 97x2?

    • @AladdinPersson
      @AladdinPersson  4 years ago

      If I remember correctly I think it's because we've added a column of ones which will be for the bias term. So we have one extra dimension than what we would think

    • @MrJaszi
      @MrJaszi 4 years ago

      @@AladdinPersson Oh yeah, the h equation assumes x0 is all 1s. Thanks man, great videos!

    • @AladdinPersson
      @AladdinPersson  4 years ago

      @@MrJaszi Exactly
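
The exchange above, about why X is 97x2 rather than 97x1, comes down to the prepended column of ones for the bias term. A NumPy sketch with a hypothetical single feature:

```python
import numpy as np

# hypothetical data: 97 examples with a single feature
x = np.linspace(0.0, 1.0, 97)
print(x.reshape(-1, 1).shape)      # (97, 1): just the feature column

# prepend a column of ones so theta_0 acts as the bias/intercept (x0 = 1)
X = np.c_[np.ones(97), x]
print(X.shape)                     # (97, 2): now h = X @ theta includes the bias
```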

  • @devhalder7799
    @devhalder7799 4 years ago

    RESPECT

  • @hailuu625
    @hailuu625 4 years ago

    Why don't you use the "sum" function in your program?

    • @AladdinPersson
      @AladdinPersson  4 years ago

      Why do you think I should use it?

    • @M.BilalAhmad
      @M.BilalAhmad 4 years ago

      @@AladdinPersson Please correct me if I am wrong @Aladdin. We didn't do the sum because on the left side of the minus there is a vector theta rather than a scalar variable, as we have when calculating the cost function.
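
To the sum discussion above: the matrix product X' * (h - y) already performs the sum over training examples, one inner product per parameter, which is why no explicit sum call is needed in the vectorized gradient. A NumPy check with made-up values:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((97, 2))   # design matrix
h = rng.standard_normal(97)        # hypothesis values
y = rng.standard_normal(97)        # targets

# explicit sum over training examples: one partial derivative per theta_j
grad_sum = np.array([np.sum((h - y) * X[:, j]) for j in range(X.shape[1])])

# the matrix product performs the same sum implicitly
grad_vec = X.T @ (h - y)

assert np.allclose(grad_sum, grad_vec)
```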

  • @laodrofotic7713
    @laodrofotic7713 3 years ago

    Why can you just transpose the h - y? Care to explain? This is not clear

  • @catherineli6110
    @catherineli6110 4 years ago

    Thank you so much ! This helped me a lot !