Matrix Norms: Data Science Basics

  • Published 25 Dec 2024

COMMENTS • 54

  • @florawang7603
    @florawang7603 4 years ago +13

    Thank you this is the clearest video on matrix norms I've watched so far

    • @ritvikmath
      @ritvikmath 4 years ago

      Wow, thank you!

    • @dansantner
      @dansantner 3 years ago

      Agree. I have been watching Strang's lectures and he skips so many conceptual steps sometimes it's hard to follow. This filled in the gaps.

  • @hba3415
    @hba3415 3 years ago +1

    I was struggling so hard and not finding anything on the internet. Thank God I found your video.

  • @mosca-tse-tse
    @mosca-tse-tse 3 years ago +13

    We’re saying “OK, matrix, you’re allowed to have...” 🤣🤣🤣🤣 so stressful for her 🤣🤣

  • @angelmcorrea1704
    @angelmcorrea1704 4 years ago +13

    Thanks for all these lectures, very clear.

  • @teodorvijiianu41
    @teodorvijiianu41 1 year ago

    I swear to god I watched the uni lecture 3 times and had no idea what they were talking about. In less than 10 minutes it now makes sense. Thank you!

  • @jtm8514
    @jtm8514 3 years ago +6

    You make studying fun! Thank you so much, I loved watching this. It wasn't a chore after a bit, I was in bliss from how cool the math was.

  • @jacob_dmn
    @jacob_dmn 3 years ago

    This Channel Changed My way of thinking.. THANK YOU MAN

  • @sofiyavyshnya6723
    @sofiyavyshnya6723 4 years ago +6

    Amazing video! Super brief and to the point and super clear! Thanks so much for all your help!

  • @juneshgautam8655
    @juneshgautam8655 1 year ago +1

    4:22 Could you please point me to where I could find a proof of that?

  • @haimteicherteicher4227
    @haimteicherteicher4227 3 years ago +1

    very focused and clear explanation, much appreciated.

  • @tyflehd
    @tyflehd 3 years ago +2

    Thank you for your clear and intuitive descriptions :)

  • @emmanuelamankwaaadjei3051
    @emmanuelamankwaaadjei3051 3 years ago

    Precise yet detailed explanation. Great work.

  • @ЕгорРудица
    @ЕгорРудица 4 years ago +2

    Thank you bro! Huge Respect from Ukraine!

  • @Posejdonkon
    @Posejdonkon 1 year ago

    Applicable and accent-free study material. Greatly appreciated!

  • @kamelismail3730
    @kamelismail3730 3 years ago

    this is an absolute gem!

  • @youlongding
    @youlongding 3 years ago +2

    Could you give me a detailed discussion of the property you mentioned?

  • @maciejgarbacz8785
    @maciejgarbacz8785 1 year ago +1

    I think this video does not give a good, human-level explanation. I eventually figured this out from other sources, so I will leave my attempt at an explanation here.
    For vectors, you usually want to know how big they are, based on their length. Applying the Pythagorean theorem gives ||v||2 = sqrt(x^2 + y^2 + z^2 + ...), which is the 2-norm (or Euclidean norm) and measures the length of a vector in nD space. As it turns out, this is not the only way to measure the length of a vector: there are other norms (and a generalized formula for a p-norm), like the 1-norm (Manhattan norm), which gives the total distance you would walk in straight lines along the axes: ||v||1 = |x| + |y| + |z| + ... . The last important norm is the infinity-norm, which, loosely speaking, tells you how many moves a chess king on a plane would need to reach the tip of the vector: ||v||inf = max(|x|, |y|, |z|, ...). Each of these norms has a visualization, which is just the graph you obtain after setting the length of the vector to 1 and rearranging the formula. For the 2-norm in 2D:
    1 = sqrt(x^2 + y^2)
    1^2 = x^2 + y^2
    y^2 = 1 - x^2
    y = sqrt(1 - x^2) or y = -sqrt(1 - x^2)
    (Fun fact: that is the equation of a circle, and you can integrate it to get its area and hence the value of pi, which is essentially how Newton computed pi far more precisely than anyone before him - watch "The Discovery That Transformed Pi" by Veritasium.)
    As it turns out, it is also useful to measure how big the output of a matrix can get! For example, if the maximum length of an output vector were zero, you would know that the matrix sends every vector it gets to 0, so it is useless in many cases. That is literally what a matrix norm is. The matrix 2-norm means that the length of a vector is measured by the 2-norm method I explained above. So to get that matrix norm, just plug in every possible vector of length 1 and find the output vector with the maximum length.
    I hope I have helped someone in despair. I was just really frustrated by how everyone on the internet just reads out the formulas and hopes the viewer will memorize them without any understanding. If there is something wrong in my comment, don't hesitate to reply here!
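
    Here is a minimal numpy sketch of the norms described above (the matrix A and the brute-force sampling estimate are purely illustrative choices, not anything from the video):

        import numpy as np

        v = np.array([3.0, -4.0])

        # The vector norms described above
        print(np.sum(np.abs(v)))      # 1-norm (Manhattan): 7.0
        print(np.sqrt(np.sum(v**2)))  # 2-norm (Euclidean): 5.0
        print(np.max(np.abs(v)))      # infinity-norm: 4.0

        # Matrix 2-norm: the largest ||A x|| over all vectors x of length 1
        A = np.array([[2.0, 0.0],
                      [0.0, 0.5]])

        rng = np.random.default_rng(0)
        xs = rng.normal(size=(100_000, 2))
        xs /= np.linalg.norm(xs, axis=1, keepdims=True)      # scale each sample to length 1
        estimate = np.max(np.linalg.norm(xs @ A.T, axis=1))  # max output length over the samples

        print(estimate)              # close to 2.0
        print(np.linalg.norm(A, 2))  # exact spectral norm: 2.0 (the largest singular value)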

  • @pulkitnijhawan653
    @pulkitnijhawan653 3 years ago +1

    Awesome intuitive explanation :)

  • @KorayUlusan
    @KorayUlusan 2 years ago

    10/10 video!

  • @邱越-n6u
    @邱越-n6u 3 years ago

    Thank you! I am just wondering if a non-square matrix has a norm as well. Why did you use the matrix [ui ui ... ui] rather than the n x 1 matrix [ui]?

  • @srs.shashank
    @srs.shashank 2 years ago

    This would be applicable only to square matrices, right? How do we calculate 2-norms for rectangular (non-square) matrices? Thanks!

  • @bobbysokhi7296
    @bobbysokhi7296 3 months ago

    Great explanation.

  • @SeidelMatheus
    @SeidelMatheus 3 years ago

    Great lesson!

  • @doxo9597
    @doxo9597 3 years ago

    This was great, thank you!

  • @manhhungnguyen4270
    @manhhungnguyen4270 3 years ago

    Thank you for the explanation. I have a question:
    does this spectral norm (2-norm) show the "size" of the matrix just like the Frobenius norm does?
    I know that the 2-norm of a matrix is its maximum singular value
    and the Frobenius norm shows the "size" of a matrix,
    but I am confused when you use the 2-norm to compare matrices.
    When I want to compare 2 matrices, which one is better to use?
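
    A quick numpy comparison of the two norms mentioned above (the matrix A below is a made-up example, not one from the video):

        import numpy as np

        A = np.array([[3.0, 0.0],
                      [4.0, 5.0]])

        sigma = np.linalg.svd(A, compute_uv=False)  # singular values of A

        spectral = np.linalg.norm(A, 2)       # 2-norm: the largest singular value
        frobenius = np.linalg.norm(A, 'fro')  # Frobenius norm: sqrt of the sum of squared entries

        print(spectral, sigma.max())                 # these two agree
        print(frobenius, np.sqrt(np.sum(sigma**2)))  # Frobenius also equals sqrt(sum of sigma_i^2)

    Roughly speaking, the 2-norm measures the worst-case stretch of a single input vector, while the Frobenius norm aggregates every entry; which one is "better" for comparing matrices depends on whether you care about worst-case amplification or overall size.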

  • @sandrasyperek1945
    @sandrasyperek1945 2 years ago

    Great video. I like it. Concrete examples would have been nice. Thank you for making the video. :)

  • @user-or7ji5hv8y
    @user-or7ji5hv8y 4 years ago

    Awesome and clear

  • @Pukimaxim
    @Pukimaxim 4 years ago

    What do you mean by decay when talking about negative 9 to the power of a large number?

    • @jako276
      @jako276 4 years ago +1

      I believe that he said "point 9", meaning 0.9, and by decay he means that 0.9^2 would be 0.81, 0.9^3 would be 0.729, thus "decaying" to zero as the exponent gets large.
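
      A tiny sketch of that decay (the exponents below are arbitrary illustrative values):

          # Powers of 0.9 shrink toward zero; powers of a number above 1 blow up instead.
          for k in (1, 2, 3, 10, 50, 100):
              print(k, 0.9**k, 1.1**k)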

  • @EmilioGarcia_
    @EmilioGarcia_ 4 years ago

    Hi, I really enjoy your content! Quick question here: what do you mean by "bigger output"? Perhaps that with a matrix A you can span most of the y that belong to R^m with a vector x that belongs to R^n? A bit confused here, thanks for your help.

  • @sirengineer4780
    @sirengineer4780 2 years ago

    Great! Keep it up, bro

  • @fatihsarac662
    @fatihsarac662 1 year ago

    Perfect

  • @thomasheirbaut6612
    @thomasheirbaut6612 7 months ago

    insanneeeeeee! Legend

  • @noway4715
    @noway4715 4 years ago

    Could the definition of the matrix norm come with an example?

  • @potreschmotre1118
    @potreschmotre1118 3 years ago

    thank you!

  • @tanvirkaisar7245
    @tanvirkaisar7245 1 year ago

    Could you please give me a detailed proof of the ||A|| ||B|| inequality?
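
    Assuming the inequality in question is the standard submultiplicative property ||AB|| <= ||A|| ||B|| for induced (operator) norms, a short argument applies the defining bound ||Mv|| <= ||M|| ||v|| twice:

        ||A B x|| <= ||A|| ||B x|| <= ||A|| ||B|| ||x||   for every vector x,

    so taking the maximum over all x with ||x|| = 1 gives ||AB|| <= ||A|| ||B||.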

  • @박병현-g2e
    @박병현-g2e 3 years ago

    u r really cooool~~~~~

  • @cahitskttaramal3152
    @cahitskttaramal3152 2 years ago

    Couldn't understand most of it, but thanks anyway

  • @epsilonxyzt
    @epsilonxyzt 4 years ago +3

    Solving an example is better than so much talk.

  • @bartlmyy
    @bartlmyy 3 years ago

    thanks, kisses

  • @Fat_Cat_Fly
    @Fat_Cat_Fly 4 years ago +1

    The editing has jumpy cuts and looks uncomfortable.
    The course is superb!

  • @aviator1472
    @aviator1472 5 months ago +1

    Didn't understand anything.

  • @broda680
    @broda680 3 years ago

    Has anyone ever told you that you look like Kumar from the movie Harold and Kumar? :D

    • @psychwolf7590
      @psychwolf7590 3 years ago +2

      I was thinking of the same actor!! They are soo similar omg

  • @seankeaneylonergan1859
    @seankeaneylonergan1859 3 months ago

    Tldw, he doesn’t show how to calculate the norm of a vector.

  • @maksymfigat
    @maksymfigat 2 years ago

    Thanks!