Spectral Graph Theory For Dummies

  • Published 22 Dec 2024

COMMENTS • 124

  • @ron-math
    @ron-math  7 months ago +10

    To try everything Brilliant has to offer, free, for a full 30 days, visit brilliant.org/Ron. You'll also get 20% off an annual premium subscription.

  • @yewdimer1465
    @yewdimer1465 6 months ago +20

    For anyone confused about the plot at 12:20, it displays the component number on the x-axis and the corresponding value on the y-axis. For example, one of the eigenvalues for the Laplacian matrix is 2, which corresponds to the eigenvector (0.5, -0.5, -0.5, 0.5). With this in mind, the meaning of the plot should become self-explanatory.

    • @duduwe8071
      @duduwe8071 5 months ago

      Thanks @yewdimer1465. I should add that the eigenvectors of the Laplacian matrix shown in the plot are scaled copies of the usual eigenvectors, and the scaling factor is not applied consistently across the eigenvalues.
      For the 1st and 3rd eigenvalues, you reproduce the video's plot by scaling the eigenvectors by -0.5 and +0.5 respectively.
      The plot at 12:20 is therefore inconsistent in the scaling it applies to its eigenvectors.
      You can notice this most easily at the 1st and 3rd eigenvalues. Despite that, all of the eigenvalues themselves are correct.
      ==========================================================
      For example:
      1st eigenvalue = 0, with eigenvector (1, 1, 1, 1).
      Scaled by -0.5 it becomes (-0.5, -0.5, -0.5, -0.5), the eigenvector shown in the plot at 12:20.
      2nd eigenvalue = 2 - sqrt(2) ≈ 0.59, with eigenvector (-1, 1 - sqrt(2), -1 + sqrt(2), 1) ≈ (-1, -0.414, 0.414, 1).
      Scaled by -0.5 it becomes (0.5, 0.207, -0.207, -0.5).
      3rd eigenvalue = 2, with eigenvector (1, -1, -1, 1).
      Scaled by +0.5 it becomes (0.5, -0.5, -0.5, 0.5), the eigenvector shown in the plot at 12:20.
      4th eigenvalue = 2 + sqrt(2) ≈ 3.41, with eigenvector (-1, 1 + sqrt(2), -1 - sqrt(2), 1) ≈ (-1, 2.414, -2.414, 1).
      Scaled by -0.5 it becomes (0.5, -1.207, 1.207, -0.5).
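
A minimal numpy check of the numbers in this thread, assuming the graph at 12:20 is the 4-vertex path graph (its Laplacian spectrum matches the eigenvalues quoted above). Note that np.linalg.eigh returns unit-norm eigenvectors whose overall sign is arbitrary, which is exactly the ±0.5 scaling discussed here:

    import numpy as np

    # Adjacency matrix of the assumed 4-vertex path graph 1-2-3-4
    A = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]])
    L = np.diag(A.sum(axis=1)) - A      # Laplacian L = D - A
    vals, vecs = np.linalg.eigh(L)      # ascending eigenvalues, orthonormal columns
    print(np.round(vals, 3))            # [0.    0.586 2.    3.414]
    print(np.round(vecs[:, 2], 3))      # ±(0.5, -0.5, -0.5, 0.5) for eigenvalue 2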

  • @Mr.Nichan
    @Mr.Nichan 7 months ago +9

    10:00 Lf = λf implies λ = (f^t)Lf?
    Why? Shouldn't we also have to divide by the dot product of f with itself (the square of the length/2-norm of f)? Shouldn't (f^t)λf = λ(f^t)f = λ||f||^2?

    • @ron-math
      @ron-math  7 months ago +5

      Great catch! The assumption that f is normalized is missing.
      When I write a draft I go back and forth. You can see that later, when I make the connection with PCA, f is normalized: that's when I started assuming that all f in the video are normalized, and I forgot to state this explicitly at 10:00.
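
A quick numeric companion to this exchange (a sketch; any small Laplacian works): for a unit eigenvector f, f^T L f equals the eigenvalue, and for an unnormalized eigenvector you divide by f^T f:

    import numpy as np

    L = np.array([[ 1, -1,  0],
                  [-1,  2, -1],
                  [ 0, -1,  1]])                   # Laplacian of the path graph 1-2-3
    vals, vecs = np.linalg.eigh(L)
    f, lam = vecs[:, 1], vals[1]                   # eigh's columns are unit norm
    assert np.isclose(f @ L @ f, lam)              # f^T L f = lambda when ||f|| = 1
    g = 3.0 * f                                    # an unnormalized eigenvector
    assert np.isclose((g @ L @ g) / (g @ g), lam)  # general Rayleigh quotient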

  • @Zicrus
    @Zicrus 3 months ago +2

    27:05 Why do you specifically choose the first 3 eigenvectors here? Does this perform better than including all n eigenvectors and running k-means in n dimensions? Or is it just to make it easier to visualize?

  • @丹尼-m2l
    @丹尼-m2l 3 months ago

    Awesome!! Please make more videos on spectral graph theory. Can't wait!

  • @HamidGolahmadi
    @HamidGolahmadi 6 months ago +2

    I really like it. There are some confusing parts after 21:20 (Fiedler vector). Could you please discuss the important lemmas below as well:
    Perron-Frobenius Theorem
    Rayleigh Quotient
    Courant-Fischer Theorem
    Cheeger’s Inequality
    Spectral Radius
    Algebraic Connectivity (Fiedler’s Theorem)
    Weyl’s Inequality
    Eigenvalue Interlacing
    Lovász Local Lemma

    • @adashao8134
      @adashao8134 5 months ago

      maybe you should provide some useful tutorials for him

    • @adashao8134
      @adashao8134 5 months ago

      or references

  • @jordanmoore7340
    @jordanmoore7340 7 months ago +16

    Love your videos! Wanted to point out a typo at 3:40. Element 4,5 of the adjacency matrix A should be 1, not 0.

    • @ron-math
      @ron-math  7 months ago +4

      You are right! Thank you for pointing this out!

  • @Mr.Nichan
    @Mr.Nichan 7 months ago +5

    9:58 I think the f(i) and f(j) (which you accidentally wrote as f(i) in the first form) was not the most confusing part of that notation. Rather, I think the most confusing part was the use of "j, i~j" to mean "all j such that i is connected by an edge to j" (based on the fact that "~" often represents a binary relation from a set to itself, which is often represented with a directed graph) - that and the accidental use of f(i) instead of f(j) when it was the only thing in the summation.

    • @hannahnelson4569
      @hannahnelson4569 6 months ago +1

      Thank you so much, I had no idea what that meant.

    • @paul_tee
      @paul_tee 6 months ago

      i think it's standard in graph theory to denote i~j as i,j are neighbors. i've certainly seen it more than once.
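
A small sketch of what the "j, i~j" sum computes, on an assumed toy graph: summing f(i) - f(j) over the neighbors j of vertex i reproduces the i-th entry of Lf:

    import numpy as np

    A = np.array([[0, 1, 1],
                  [1, 0, 0],
                  [1, 0, 0]])    # assumed toy graph: vertex 0 joined to 1 and 2
    L = np.diag(A.sum(axis=1)) - A
    f = np.array([1.0, 2.0, 4.0])
    # (Lf)(i) = sum over all j with i~j of (f(i) - f(j))
    Lf = np.array([sum(f[i] - f[j] for j in range(3) if A[i, j]) for i in range(3)])
    assert np.allclose(Lf, L @ f)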

  • @Mr.Nichan
    @Mr.Nichan 7 months ago +2

    19:45 If anyone's confused like I was, the following discussion is entirely dependent on the fact that these squares must all be non-negative, so each addend must be zero (so each difference between values of f at adjacent vertices must be zero) for the sum to be zero. No "cancelling out" is allowed here.
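
In symbols, the identity behind this comment (the video's quadratic form) is

    f^T L f = \sum_{\{i,j\} \in E} \bigl( f(i) - f(j) \bigr)^2 \ge 0,

and a sum of non-negative terms vanishes only if every term vanishes, i.e. f(i) = f(j) across every edge, so f is constant on each connected component.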

  • @gonzacont4975
    @gonzacont4975 7 months ago +14

    Good video! I did miss a discussion of the Perron eigenvector of the adjacency matrix, something essential in graph centrality. Hopefully in the future you can do a continuation of this video!

    • @ron-math
      @ron-math  7 months ago +7

      Thank you!
      I skipped some content (there is just too much) and tried to create a consistent topic: the meaning of the eigenvalues and eigenvectors of L. I may work on another video focused on random walks in the future :)

  • @liyi-hua2111
    @liyi-hua2111 7 months ago

    15:00 How do you discuss "smoothness" when the dimension of the eigenspace is greater than 1? Even if you require them to be mutually orthogonal, all the eigenvectors in it can mix and create a new set of eigenvectors.

    • @ron-math
      @ron-math  7 months ago

      Good question.
      When the eigenspace has dimension 3, you are right that we can create a new set of eigenvectors, and the inter-connected-component smoothness will be different, since the only requirement is that the entries of the eigenvector corresponding to a connected component are constant.
      The point is that 1) those eigenvectors all have smaller intra-component variance (0) compared to other eigenvectors associated with lambda > 0.
      2) If we try to find a basis with as many 0 entries as possible (high sparsity), which is usually a good convention, then our basis choice is good. It also seems that this is how numpy does it.
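
A small sketch of the degeneracy point, on an assumed toy matrix with a repeated eigenvalue: any rotation within the eigenspace yields another valid orthonormal set, so the basis numpy returns is one choice among many:

    import numpy as np

    M = np.diag([1.0, 1.0, 2.0])   # assumed example: eigenvalue 1 has a 2-D eigenspace
    _, vecs = np.linalg.eigh(M)
    t = 0.3                        # mix the two degenerate eigenvectors
    R = np.array([[np.cos(t), -np.sin(t)],
                  [np.sin(t),  np.cos(t)]])
    mixed = vecs[:, :2] @ R
    assert np.allclose(M @ mixed, 1.0 * mixed)   # still eigenvectors for eigenvalue 1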

  • @yaqichen-d8k
    @yaqichen-d8k 1 month ago

    03:43 Is the adjacency matrix wrong? Since v4 and v5 are connected, the corresponding entry should be 1?

  • @t.gokalpelacmaz584
    @t.gokalpelacmaz584 7 months ago +1

    At 12:39, I don't understand what the graph is. After checking your sources, I still could not find where it came from. The "Vertices in a Row" headline did not make sense to me. If the graph were better labeled, it would be much clearer. If anyone could explain it, I would really appreciate it.
    It was a nice introduction otherwise. Keep them coming 😉.

    • @ron-math
      @ron-math  7 months ago +1

      "Vertices in a row" is just a line graph. It was an early draft title that sneaked into production.
      Thanks buddy, will do.

  • @Infraredchili
    @Infraredchili 7 months ago

    Interesting and clear presentation. There is an error: the matrix shown at 3:45 should be symmetric (as you pointed out), but A(4,5) = 0 and A(5,4) = 1.

    • @ron-math
      @ron-math  7 months ago

      Yes. You are right. Never trust copilot too much!
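
A one-line guard that would have caught this typo (a sketch, assuming the adjacency matrix is a numpy array; numpy indices are 0-based, so the video's A(4,5) is A[3, 4]):

    import numpy as np

    def check_symmetric(A: np.ndarray) -> None:
        """An undirected graph's adjacency matrix must equal its transpose."""
        bad = np.argwhere(A != A.T)
        assert bad.size == 0, f"asymmetric entries at index pairs: {bad.tolist()}"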

  • @hannahnelson4569
    @hannahnelson4569 6 months ago +1

    This was simultaneously an incredibly useful and informative lesson and incredibly confusing and difficult to understand.
    Thank you for educating us!

  • @abhalla
    @abhalla 1 month ago

    Great video, thank you so much. It would be great if you could make more videos on this topic, maybe on the Perron-Frobenius theorem and mixing times / Cheeger's inequality. It is a very visual topic. Thanks so much!

  • @hideyoshitheturtle
    @hideyoshitheturtle 7 months ago

    Thanks for this amazing video! 19:23 I didn't quite understand the logic here. How do eigenvectors with nonzero entries on each cluster produce eigenvalue 0?

    • @ron-math
      @ron-math  7 months ago

      Hey!
      Each "block" is like a smaller Laplacian matrix itself. As long as the entries, associated with a block, are constant, the eigenvalue is 0.

  • @turnerburger
    @turnerburger 7 months ago

    20:10 this does not make sense to me, why are the entries all nonzero for this eigenvector in particular? Specifically why can't the "neighbors" be nonzero?

    • @ron-math
      @ron-math  7 months ago

      "why are the entries all nonzero for this eigenvector in particular?"
      In general, c1, c2, c3 can't be all 0: at least one of them must be non-zero. I was trying to make the point of linear dependency so I can write f4=c1f1 + c2f2 + c3f3. But yes, they don't have to be non-zero at the same time.
      At 20:10, c is not 0, so other entries in component teal must also be c. But since you asked this question, I assume you got it.
      Did I answer your question?

  • @weijiejiang2436
    @weijiejiang2436 2 months ago

    9:50, the second term should be f(j), right?

  • @xbz24
    @xbz24 7 months ago +21

    keep it up with the good stuff ron

  • @LucasSilva-ut7nm
    @LucasSilva-ut7nm 7 months ago +1

    Pretty good video bro! I'm now excited to see your other videos.
    I would only like to add that, in my opinion, it would've been better if you had given more time to the end part. Your pacing is not constant through the video; you spent more time explaining the easier parts than the harder ones lol.

    • @ron-math
      @ron-math  7 months ago +3

      You are right bro. I should keep the pace more consistent for sure.

  • @blacklistnr1
    @blacklistnr1 7 months ago +11

    Seems like an interesting video, but I had to drop it midway since it's more homework than presentation.
    It watches like a semester's worth of material, chapter after chapter to remember, with no goal in sight.
    I feel like a lot of work is expected of me to join the dots that you present.
    I truly mean this as feedback; it's the best I could do to explain how the video structure feels off. I hope it's useful for future improvements!

    • @ron-math
      @ron-math  7 months ago

      Can't thank you enough for confirming my doubt (along with the folks who liked this comment). I would happily pin this comment if I could.
      Do you think an introduction built around the example that is currently at the end of the video (25:40) would be better?
      Storytelling is THE most difficult part of making videos for me. I will try a different approach for my next video (and a fun one).
      In this video, I actually like the Laplacian matrix naming justification and the Fiedler examples the most. Sadly they are in the second half of the video...
      Thanks buddy.

    • @blacklistnr1
      @blacklistnr1 7 months ago

      @@ron-math That's actually such a cool and simple-to-understand application!
      Yes, I would suggest starting with a more abstract version of that as the motivational problem and building towards it in the video.
      Drop the matrix/Laplacian specifics and keep the

    • @ron-math
      @ron-math  7 months ago +1

      @@blacklistnr1 Thanks!
      "It would also be really helpful if you stick to the concentric circles(or part of them) for visualising the graph things I've seen in the first part half of the video."
      I went through different basic types of graphs in the hope that they would be beneficial, but I agree that from the sentiment in the comments it seems that I overdid it. Well... I also have comments saying that they like this pace. I guess it will take more videos and trial-and-error to find the sweet spot :)

    • @blacklistnr1
      @blacklistnr1 7 months ago

      @@ron-math The circles suggestion is more to help me have one unifying structure in my head where I can put the ideas you present.
      If you can draw a line or color a dot/group of dots or something for each idea, it becomes much easier to remember and frame them within the context.

    • @radadadadee
      @radadadadee 7 months ago

      Yeah, I stopped at the 11 min mark, I couldn't follow anymore

  • @drdca8263
    @drdca8263 7 months ago

    This was nice, I certainly hadn’t expected the technique at the end.
    Do I understand correctly that you took the 3 eigenvalues with the greatest magnitude, then the corresponding eigenvectors, and sent each graph vertex to the triple of that vertex's entries in those 3 eigenvectors?
    If so, how was 3 chosen? Was it chosen based on the knowledge that there should be 3 components, or was there a different reason?

    • @ron-math
      @ron-math  7 months ago +1

      Yes, your understanding is correct.
      For this question, 2 is actually enough (so it doesn't depend on the knowledge that there are 3 components). In the real world it is usually trial and error and depends a lot on experience, like picking the number of clusters in the k-means algorithm.
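
A minimal spectral-clustering sketch along these lines (assumptions: scikit-learn is available, k is picked by the user as discussed above, and the embedding uses the eigenvectors for the k smallest Laplacian eigenvalues, the usual convention):

    import numpy as np
    from sklearn.cluster import KMeans

    def spectral_clusters(A: np.ndarray, k: int) -> np.ndarray:
        """Embed vertices via k Laplacian eigenvectors, then run k-means."""
        L = np.diag(A.sum(axis=1)) - A
        _, vecs = np.linalg.eigh(L)    # eigenvalues in ascending order
        coords = vecs[:, :k]           # row i = k-D coordinates of vertex i
        return KMeans(n_clusters=k, n_init=10).fit_predict(coords)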

  • @gingerderidder8665
    @gingerderidder8665 5 months ago

    Please make more graph theory videos. The production value on this video matches 3B1B. Great stuff!

  • @yandrak6134
    @yandrak6134 7 months ago +1

    That soft music in the background is top notch!

  • @chadx8269
    @chadx8269 7 months ago +12

    Please increase the Audio Volume for the elderly Mathematician fans.

    • @ron-math
      @ron-math  7 months ago +2

      Noted! Thank you for your feedback!

  • @wangzhe5948
    @wangzhe5948 1 month ago

    The best course on Spectral Clustering for beginners ever!

  • @paul_tee
    @paul_tee 6 months ago

    eventually, i really want to see someone make a comparison between spectral theory on closed manifolds and on compact graphs. there are too many similarities to be a coincidence. as a baby example, theorem one also holds on manifolds.
    as for the video, i think the strongest block is the one starting at 9:10. the introduction was too fluffy, and the end was too unfocused. too much fluff risks alienating the audience. it would've been better to split it into different videos.
    nonetheless, thank you for making a video on a very interesting subject

  • @blahch11111
    @blahch11111 7 months ago

    this is amazing. I hope there are more of these for different theories!

  • @dominiquelaurain6427
    @dominiquelaurain6427 7 months ago

    Very interesting. I have found a motivation for studying the subject, mainly the clustering aspects in TDA (Topological Data Analysis). Maybe there is work to be done with spectral graph theory on graphs representing simplicial complexes. You talked a little about weighted graphs (the Laplacian changes but the ideas remain, as in the paper you quoted) but not about "directed graphs"... only undirected graphs, because then the adjacency matrix is always symmetric. Why? Does spectral theory exist for these extensions? The motivation in your video is fuzzy: I can deduce some applications in clustering. Are there others? In graph theory, the adjacency matrix M can have symbolic entries (not real constants), for example for graph paths: M^n represents the sets of paths of at most n steps, and the union M^0 + M^1 + ... + M^n + ... represents all paths between a pair of vertices (i, j). Is there a spectral theory there?

    • @ron-math
      @ron-math  7 months ago

      Thank you for commenting.
      There are a lot of motivations. This is a huge topic and this video is just a slice of it. And you are right, powers of M are connected with graph paths.
      I was trying to frame different motivations as different stories. This video focuses on "clustering", as you said.
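
A quick check of the M^n point (a sketch on an assumed toy graph; for a 0/1 adjacency matrix A, the (i, j) entry of A^n counts the walks of length n from i to j):

    import numpy as np

    A = np.array([[0, 1, 0],
                  [1, 0, 1],
                  [0, 1, 0]])               # assumed path graph 1-2-3
    A2 = np.linalg.matrix_power(A, 2)
    print(A2)                               # [[1 0 1]
                                            #  [0 2 0]
                                            #  [1 0 1]]
    # e.g. A2[1, 1] = 2: two length-2 walks from the middle vertex back to itself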

  • @happi5159
    @happi5159 7 months ago

    I don't understand what you mean by plotting the eigenvectors. How do you plot them? What do the axes mean? Why do equal eigenvalues produce different eigenvectors on the graph? It seems so hand-wavy to me.

    • @drdca8263
      @drdca8263 7 months ago +1

      I think: "for each vertex of the graph, associate to it a point in n-D space (here n = 3 was chosen) where the i-th coordinate is the coefficient of that vertex in the i-th eigenvector (with i ranging from 1 to n)".
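
Concretely, a sketch of that mapping on an assumed toy graph, using numpy's unit-eigenvector convention:

    import numpy as np

    A = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]])    # assumed toy graph (4-vertex path)
    L = np.diag(A.sum(axis=1)) - A
    _, vecs = np.linalg.eigh(L)
    points = vecs[:, :3]            # row i is the 3-D point for vertex i
    print(points.shape)             # (4, 3)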

  • @CaiodeOliveira-pg4du
    @CaiodeOliveira-pg4du 7 months ago

    Absolutely brilliant! Thank you so much for this video!

  • @alextrebek5237
    @alextrebek5237 7 months ago +1

    This is exactly what I needed. Thank you!

  • @michaeln.8185
    @michaeln.8185 7 months ago

    Loved the video! Keep up the great work Ron!

  • @0MVR_0
    @0MVR_0 7 months ago

    been using an old cosine similarity algorithm to accomplish the task of metric distance
    yet directly manipulating a matrix seems much more efficient

  • @gabrielsembenelli7633
    @gabrielsembenelli7633 7 months ago +4

    Did you use Manim for the animations? Really cool video!

    • @ron-math
      @ron-math  7 months ago +1

      Yes. Thank you buddy.

  • @purpleAiPEy
    @purpleAiPEy 7 months ago +19

    I was at a combinatorics conference.. and as soon as they went to eigenvalues of graphs I melted into a puddle of shame.
    Well. I felt that for the whole conference but, I appreciate the video

  • @BalthazarMaignan
    @BalthazarMaignan 7 months ago +4

    I love this kind of video, it switches things up a bit from my uni classes haha

  • @benjiusofficial
    @benjiusofficial 7 months ago

    Man, what a stellar video. Explained stuff that I didn't even know I wanted from it. Great job, dude

  • @MechanicumMinds
    @MechanicumMinds 7 months ago +1

    I never knew graph theory could be so relatable. I mean, who hasn't had a 'genuine friendship' on social media, only to realize the other person doesn't follow them back? And don't even get me started on the 'most popular guy in your social bubble' - I'm pretty sure that's just my high school ex. Anyway, on to the adjacency matrices...

  • @RandomBurfness
    @RandomBurfness 7 months ago +1

    Perhaps I didn't quite catch this detail, but why are we even interested in the "smoothness" of eigenvectors to begin with?

    • @ron-math
      @ron-math  7 months ago +1

      My fault in storytelling. You can check the Fiedler eigenvalue example and the spectral clustering example in the later half of the video: the eigenvector signs and magnitudes can be used to perform tasks like graph partitioning and clustering.

  • @jaf7979
    @jaf7979 6 months ago

    You have a great teaching style.

  • @jorgelombardi169
    @jorgelombardi169 6 months ago

    Is a graph matrix always symmetric, regardless of the type of graph?

    • @ron-math
      @ron-math  6 months ago

      The graph has to be undirected in order for the matrices to be symmetric.
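
A tiny illustration on an assumed 2-vertex example of why direction matters:

    import numpy as np

    directed = np.array([[0, 1],
                         [0, 0]])              # a single directed edge 0 -> 1
    undirected = directed + directed.T         # add the reverse edge
    print((directed == directed.T).all())      # False: not symmetric
    print((undirected == undirected.T).all())  # True: symmetric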

  • @jacksonherron2889
    @jacksonherron2889 6 months ago

    Thanks for the educational video!! :)

  • @IkhukumarHazarika
    @IkhukumarHazarika 7 months ago

    It was a spiritual experience to watch this math video. Make more videos on ML and optimization, and please make them free and available for everyone.

  • @andyhughes8315
    @andyhughes8315 7 months ago

    Any arbitrary pair of eigenvectors are not always orthogonal to eachother. Only eigenvectors corresponding to the same eigenvalue are orthogonal.

    • @ron-math
      @ron-math  7 months ago +1

      You mean different eigenvalues?

    • @huhneat1076
      @huhneat1076 7 months ago

      Yeah, I was confused by this too. I've rarely seen eigenvectors be perpendicular to each other, unless there's essentially a whole plane of eigenvectors (such as for the matrix 2 * Identity).

    • @drdca8263
      @drdca8263 7 months ago

      @@huhneat1076 For a self-adjoint linear operator H, if u, v are eigenvectors with eigenvalues \lambda and \mu respectively, and \lambda ≠ \mu, then
      ⟨u, Hv⟩ = ⟨u, \mu v⟩ = \mu ⟨u, v⟩,
      but also ⟨u, Hv⟩ = ⟨Hu, v⟩ = ⟨\lambda u, v⟩ = (\lambda)^* ⟨u, v⟩,
      and because \lambda is real valued, \lambda = (\lambda)^*,
      so (\lambda - \mu) ⟨u, v⟩ = 0,
      and as \lambda ≠ \mu, therefore ⟨u, v⟩ = 0.
      So eigenvectors of a self-adjoint linear operator with different eigenvalues are orthogonal.
      If all the matrices here are self-adjoint (and they seem to be real and symmetric, so they should be), then the eigenvectors for different eigenvalues should be orthogonal.
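
A numeric companion to this argument (a sketch; any real symmetric Laplacian works): eigh on a symmetric matrix returns mutually orthonormal eigenvector columns:

    import numpy as np

    L = np.array([[ 2, -1, -1],
                  [-1,  1,  0],
                  [-1,  0,  1]])                  # real and symmetric, hence self-adjoint
    _, vecs = np.linalg.eigh(L)
    assert np.allclose(vecs.T @ vecs, np.eye(3))  # columns are orthonormal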

  • @LunizIsGlacey
    @LunizIsGlacey 5 months ago

    Awesome video!!

  • @wargreymon2024
    @wargreymon2024 7 months ago

    excellent, I don't think we need that bgm tho, keep it up🔥💯

  • @isaac10231
    @isaac10231 6 months ago

    Was this made with Manim?

  • @quocanhnguyen7275
    @quocanhnguyen7275 3 months ago

    Hi, I think you went too fast from 9:51 to 10:47. I had no idea what was happening.

  • @ZheZhang-yi2qn
    @ZheZhang-yi2qn 7 months ago +2

    Brilliant!

  • @MDNQ-ud1ty
    @MDNQ-ud1ty 7 months ago +5

    My suggestion is not to interrupt the flow of explaining things with tangential information such as notational conventions, and not to make a big deal out of them unless they are very confusing or very non-standard. Breaking the flow for those things is worse than not mentioning them at all. If you do mention them, keep it short, use a footnote on the screen, or mention them before going into the discussion.
    You make a big deal about using function notation rather than subscript notation when it isn't one. There is literally no difference between f_i and f(i) except symbolically. Another solution would have been to start out with "vector notation" and switch over to function notation when it became useful. Anyone confused due to lack of knowledge will quickly get over it or learn their lesson at some point, and then it won't be a problem.
    It's important to avoid unnecessary things that undermine the learning process, as that is self-defeating. Also, make sure you do not cut off a section too soon, or it requires the viewer to rewind the video to see what just happened. Finally, you might want to run your audio through a sibilance remover, as your recording system/environment/voice is a bit sibilant.

    • @ron-math
      @ron-math  7 months ago

      Thank you for the suggestion.
      I was trying to stay consistent with the Laplace operator that applies to an ordinary function f(x), using f_i to represent the i-th eigenvector and f(i) to represent an eigenvector's i-th entry. It seems I did not do a good enough job here.
      But I guess I made it too "tedious". Will keep your suggestions in mind!

    • @MDNQ-ud1ty
      @MDNQ-ud1ty 7 months ago +2

      @@ron-math The video was pretty good in terms of explaining some cool things about how graph theory works and the details, and it seems to flow better after that.
      The problems are minor, and obviously everything can be improved (nothing is perfect). So don't take it personally. I just thought I'd mention it so you are aware of such things for future videos (and ultimately it's your choice whether to listen to me or not).
      I understood what you were doing, but for someone for whom such things are obvious and unbothersome, it becomes a little tedious to be told how it works and then to hear you apologize for it (there is no need, and the apology takes up extra time).
      The principle should generally be: explain things as effectively and efficiently as possible. Say as much as possible, in as little space as possible, as effectively as possible.
      Also remember the "fan out principle". E.g., if you have a node (you) that reaches a lot of other nodes (your viewers), then any "mistakes" you make will be amplified across those nodes (viewers). What this means is that any improvement in effectiveness is amplified by N, where N is the number of viewers. Of course you can't be perfect, so you have to find the balance: polish enough, but not so much that you stop making progress.
      What I mean by this are things like "explaining the basics". Many times people will try to explain things that, for the material, the viewer should already know or be able to deal with on their own. This just takes time away, because M of those viewers will not need it explained, and those that do can deal with it on their own.
      It is an optimization problem in some sense, where one has to find the right combination of "details" to get the biggest result, and it isn't easy.
      Again, these are minor things, but if no one mentioned them you'd never think about them.
      If it were me, I would have just put a little text block explaining that detail, so it could be picked up visually very quickly while also listening to the vocals (since this happens in parallel, it consumes almost no extra time). You can use this idea in the future: if you want to explain extra things you can have "footnotes", and the * is pretty universal. Just keep it non-intrusive and say enough but not too much, so that most people (say 80%) have no problem understanding it, and the rest can put in whatever effort they need to follow what is going on.

    • @Debrugger
      @Debrugger 7 months ago +1

      @@ron-math First of all, I loved the video and I'm going to work through it again more slowly :)
      I also agree with this person's comments. A person watching this definitely has some linear algebra background, otherwise they wouldn't understand anything, and you rightly assume that they've heard of things like PCA and k-means. But anyone with that background can probably handle indices moving from subscripts to (parentheses), so there's no need to get caught up on it.
      Another note on pacing, because it feels a bit uneven. The first part is very slow and basic, spending precious minutes explaining node degrees and star/line/cycle graphs, which I'm going to guess everyone who knows what an eigenvalue is can conceptualize. But then, for the proofs about multiplicity and the number of eigenvectors, things suddenly go really fast by comparison. On their own both of these are OK, but in the same video it feels very weird and will suit nobody (either you need 20 seconds of explanation that an adjacency matrix is symmetric, or you can understand in 5 seconds why the last eigenvector must be linearly dependent, but not both).
      Looking forward to more videos!

    • @ron-math
      @ron-math  7 months ago

      @@Debrugger Your profiling of the audience is amazing. Thank you so much! Will do better in the future.

  • @grayjphys
    @grayjphys 7 months ago

    Nice video :) I would say you need to put a de-esser on your sound though.

    • @ron-math
      @ron-math  7 months ago

      Good point! Thank you!

  • @math__man
    @math__man 7 months ago +1

    Liked the video :)

  • @howwitty
    @howwitty 7 months ago

    Thanks.

  • @MrAman47
    @MrAman47 3 months ago

    Man, it's impossible to hear you without cranking the volume up to max.

  • @vkessel
    @vkessel 5 months ago

    Lots of great knowledge here. Thank you. But please work on the flow and explanations. Some things are wrong, some information doesn't stay on the screen long enough, some important things are assumed rather than explained, and some step-by-step equations would benefit from not using that distracting deformation transition animation.
    Let me see the before and after equation of each transformation step instead of hiding the last step. Constantly pausing and rewinding to fully understand 100% of the content turned this from 30 minutes into an hour.
    Thanks again.

  • @juandavidrodriguezcastillo9190
    @juandavidrodriguezcastillo9190 7 months ago

    awesome

  • @ominollo
    @ominollo 7 months ago +1

    Nice 😊
    I didn't understand everything but it was interesting nevertheless 👍

    • @ron-math
      @ron-math  7 months ago +1

      Happy to help you understand buddy.

  • @dimastus
    @dimastus 4 months ago

    peak math content

  • @SoufianeIdrissi123
    @SoufianeIdrissi123 7 months ago +1

    Hhh mines maths 2 2024

  • @siegfriedbarfuss9379
    @siegfriedbarfuss9379 7 months ago

    Great content but not for real dummies. Need to process a lot of info.

  • @fayadkhairallah2760
    @fayadkhairallah2760 7 months ago

    Is it true that when you write a book for dummies you actually mean that they're so. A Dieu 😮

  • @ivanmarkov3879
    @ivanmarkov3879 1 day ago

    Animations were strong, but you were going waaaay too fast through some parts and quite slow through others. Overall quite a mid video I would say.

  • @cv462-l4x
    @cv462-l4x 7 months ago +2

    It was too quick an explanation for me.... I wish I had more time to be able to calculate in my mind what is happening on the screen... it looks like an explanation for somebody who already knows 95% of the material...

    • @ron-math
      @ron-math  7 months ago +2

      Hey, slow it down on your side and pause for as long as you can and think about it.
      And let me know any confusions you have ;)

    • @cv462-l4x
      @cv462-l4x 7 months ago

      @@ron-math It's the wrong attitude. Why don't you slow down yourself?

    • @ron-math
      @ron-math  7 months ago +2

      I will try next time 👍.
      If you check a few other comments, some people complain it is too "easy". It is always hard to balance, but I will try my best.

    • @herrk.2339
      @herrk.2339 6 months ago +2

      @@cv462-l4x It's a video, you can pause and slow it down and work things out yourself. That's the whole point

    • @adiaphoros6842
      @adiaphoros6842 1 month ago

      @@cv462-l4x Your attitude is wrong. Why demand that others make videos slower when you can slow them down yourself? Your mindset promotes laziness in learning.

  • @isiisorisiaint
    @isiisorisiaint 7 months ago

    this is ridiculous! if you'd write all this down as a course lecture then maybe it would have some value (i say "maybe" because i can't even follow your vid, let alone appreciate it in any meaningful way), but a video with the density of information you have here is just insane. who exactly is your supposed target audience with this vid??? at t=11:49 i just gave up.

    • @ron-math
      @ron-math  7 months ago +2

      Hey buddy, you may not know how much I appreciate this comment.
      If you check a few other comments, you may see that people are complaining I spent too much time on "easy" stuff. The comments will collectively influence how I create content in the future. Your comment gave me an important data point: I also have an audience who think this video's information density is too high.
      What's your suggestion to make this video valuable to you?

  • @eiseks3410
    @eiseks3410 7 months ago +25

    Too easy a subject, presented in a very complicated and time-consuming way. The video should have been shorter in my opinion.

    • @ron-math
      @ron-math  7 months ago +21

      Hey, you are a theoretical physicist.

    • @ron-math
      @ron-math  7 months ago +36

      And I somewhat agree with you about the "shorter" argument. The original draft for this video was very concise and short:
      1. Remove the review part, jump to the L matrix and the Fiedler vector directly.
      2. Talk about spectral clustering directly.
      But I decided to make it far more accessible for folks who have no knowledge of graph definitions and have likely forgotten things like the fact that 0 can be an eigenvalue, etc.
      There is a lot of material that an audience like you may enjoy more, like this one: ua-cam.com/video/uxsDKhZHDcc/v-deo.html

    • @eiseks3410
      @eiseks3410 7 months ago +30

      @@ron-math Hey, thank you very much for your reply... and sorry for the harsh criticism :P You're doing great work with your channel :)

    • @4thpdespanolo
      @4thpdespanolo 7 months ago +11

      LOL not a simple subject if you dig deep enough

    • @umbraemilitos
      @umbraemilitos 7 months ago +5

      ​@ron-math Negative people are everywhere, don't worry about them. Opinions are cheap.