Didn't Graduate Guide to: Kolmogorov-Arnold Networks

  • Published 29 Jan 2025

COMMENTS • 70

  • @deepfriedpancake  8 months ago +20

    Update: wow, this video blew up. Thanks, I love you all!
    So, *feel free to give suggestions here on what topic I should do a Didn't Graduate Guide on next!*
    I'll see if I can figure out how to explain research topics in AI/ML or computational science at a 1st~2nd year undergraduate level.
    *Mistake correction*: at 1:31 - 1:42, it should be p = 1 ~ n, and q = either 0 ~ 2n or 1 ~ 2n+1.
    Sorry, I mistyped the LaTeX!
    So yes, I said I'd make a lecture video on this, and now y'all shall suffer with me through this cardboard+glue lecture.
    Also, learning 3b1b's Manim is killing me, and I don't even really use the geometric/topological animations anyway. Hellllppppp...
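    Written out with the corrected indices, the Kolmogorov-Arnold representation (the form the KAN paper builds on) reads:

    ```latex
    f(x_1,\dots,x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right)
    ```

    i.e. n inner terms per outer function, and 2n+1 outer functions in total (equivalently indexed q = 1 ~ 2n+1).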

    • @LittleWhole  8 months ago

      Can attest to Manim 💀

    • @michaeletzkorn  8 months ago +1

      I'm learning Manim too! As 3b1b demonstrates, pedagogical content truly excels when the visuals are of high quality. Thanks for the brief walkthrough of the paper!

    • @seraphimwang  8 months ago

      Manim is powerful but "harsh", with a steep learning curve. Maybe PowerPoint or Blender would be better, or even a combo of these tools?

    • @sfilkin  8 months ago

      The recent xLSTM, and how RNNs and LSTMs could still be useful in production.

    • @xxlvulkann6743  8 months ago

      Have you tried using Blender instead? @BobbyBroccoli, a popular YouTuber who creates documentaries (with some animation), has a video titled "How to make a BobbyBroccoli video"; it may be an easier start than Manim. Great video btw, this is the most concise explanation of KANs I've seen so far.

  • @DB-nl9xw  8 months ago +11

    The way you are able to explain something this complex so simply is amazing! Keep us updated!

  • @pflintaryan8336  2 months ago

    This is what I call an explanation: examples, visualization, and theory, all in beautiful harmony.

  • @Naitry  8 months ago +11

    Love the little banter you add in; not many people make subjects this deep into a field this fun.
    +1 sub

  • @MarcoD-k1p  8 months ago +1

    Hi, has the Jupyter Python code shown at 5:40 been published on GitHub? 😁

  • @AC-go1tp  8 months ago +2

    I feel that giving a real-world potential application and/or intuition would have been a great addition to the video. Very cool presentation. Thank you!

    • @intertextualdialectic1769  8 months ago +1

      Finding analytic solutions to differential equations has a lot of applications. Maybe you just don't have the background.

    • @AC-go1tp  8 months ago +3

      @@intertextualdialectic1769 That's precisely the point. Because I do not have the background, I asked for concrete details, like for the potential chemistry application mentioned briefly in the video. I am sorry, but how is your comment useful?

    • @w花b  8 months ago +1

      @@AC-go1tp They're just being mean; that's comments for ya.

  • @sagarramchandani3139  8 months ago +2

    As someone working to be a computational physicist without a formal background in computer science,
    I would love more of these videos!
    These networks could be pretty good in experimental physics as well.

  • @keenshibe7529  8 months ago

    Really enjoyed your style of presentation on new ML technologies! Subscribed and look forward to more updates!

  • @wankachalawea  8 months ago +3

    Really fun video, and the concept seems very interesting. As with everything, it'll eventually be optimized.

  • @draggador  8 months ago

    When I learnt about KANs and wanted to look into them more on YouTube, this upload seemed like the only one that wasn't full of buzzwords and forced hype; thank you for your hard work.

  • @Kram1032  8 months ago +4

    This is a bit crazy, but you could take each function to be its own NN, as those are also, effectively, a bunch of functions with learnable coefficients. So using this you could have matrices of entire NNs.
    I think from that perspective this actually looks quite a bit like meta-learning: you could have many such layers and use various basic NN architectures as your possible input functions. The output would then be "the best-fitting NN architecture", I think?

    • @xxlvulkann6743  8 months ago

      Exactly what I thought! It is essentially like a mixture-of-experts setup, but with greater interpretability and expressivity because the activation functions are not fixed. If KANs ever find themselves in LLMs, it'll likely be in the form of an MoE.

  • @valentinfontanger4962  3 months ago

    This is the first time I've ever seen a matrix of operators; confusing, but I somehow like it.

  • @AsmageddonPrince  8 months ago +9

    For a "didn't graduate guide", it's pretty much 100% heavy-duty math right away instead of intuitive explanations.

    • @deepfriedpancake  8 months ago +3

      Hmm, perhaps you are correct.
      Since even 1st~2nd year undergrad courses will hit you with the math right away, what I can do is just make the hit softer, but I can also try to focus more on the qualitative side of the explanation in coming videos.
      Thanks for the feedback.

    • @AsmageddonPrince  8 months ago +4

      @@deepfriedpancake I mean, it's valuable; I'm sure people appreciate a gentler introduction to the actual theory underlying it. But as a person who doesn't really understand much math, I've gotten nothing out of this video, and I feel like the title promised that. Actual educational value aside, intuitive explanations are usually a better way to internalize why/how something works than the equations describing it.

  • @ps3301  8 months ago

    Is this architecture compatible with liquid neural networks? Can they be integrated?

  • @youtubepooppismo5284  8 months ago +27

    You need to use \sin(x), not sin(x), in TeX notation.

  • @Ryuuuuuk  8 months ago +3

    This reminds me of the whole field of symbolic regression.

  • @sharptrickster  8 months ago

    I wonder if it's gonna be a big thing for audio DSP; more precisely, analog modelling, where you usually have circuit solvers and sometimes LSTMs for nonlinearities (like vacuum tubes, transistors, transformers, etc.). With the added bonus of giving you a mathematical representation of the signal-processing function, if I understand it correctly? Sounds like a great application.

  • @huxleymarvit3832  8 months ago

    this was a lot of fun! really enjoyed the video

  • @seraphimwang  8 months ago

    I am looking forward to the upcoming next paper and your explanation on this topic, which KAN be useful in both chemistry and the physics of materials. Thanks 🙏🏻

  • @cv462-l4x  8 months ago +1

    I saw a project which just checks all possible functions to fit data with minimal error. As far as I remember, they did it without neural networks at all: just mathematical operations and functions, brute-forcing their combinations.
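    The idea in this comment can be illustrated with a toy brute-force fitter (my own hypothetical sketch, not the project the comment refers to): enumerate every composition f(g(x)) of a few unary primitives and keep the one with the lowest squared error.

```python
import itertools
import math

# Toy brute-force "symbolic regression": try every composition f(g(x))
# of a small set of unary primitives and keep the lowest squared error.
# (Hypothetical illustration only; the primitive set is made up.)
PRIMITIVES = {
    "sin": math.sin,
    "exp": math.exp,
    "sqr": lambda x: x * x,
    "id": lambda x: x,
}

def best_composition(xs, ys):
    best_name, best_err = None, float("inf")
    for (n1, f), (n2, g) in itertools.product(PRIMITIVES.items(), repeat=2):
        err = sum((f(g(x)) - y) ** 2 for x, y in zip(xs, ys))
        if err < best_err:
            best_name, best_err = f"{n1}({n2}(x))", err
    return best_name, best_err

xs = [0.1 * i for i in range(20)]
ys = [math.sin(x * x) for x in xs]      # hidden target: sin(x^2)
name, err = best_composition(xs, ys)
print(name, err)                        # finds "sin(sqr(x))" with zero error
```

    Real systems such as PySR (mentioned elsewhere in these comments) search far larger expression spaces with evolutionary methods rather than exhaustive enumeration.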

  • @iliyanborisov4252  8 months ago

    Very cool video; it got me excited to continue learning math!

  • @alphatra4593  5 months ago

    Make a video about benchmarking and analysing geometric, graph, and neurosymbolic neural networks (some new hyped approaches) on e.g. MNIST. That would be very interesting.

  • @minecraftermad  8 months ago

    So this would be great for finding whatever PySR is trying to find, if I got that correctly?
    Basically, you could possibly use this to optimize the runtime cost of an already-trained neural network by replacing middle parts with single functions.

    • @deepfriedpancake  8 months ago

      Yes! One selling point mentioned in the original paper is that KANs are a better version of symbolic regression.

  • @mertkalkanci  8 months ago

    incredible video, so simple and understandable

  • @thiagomata-coding-and-math  8 months ago

    5 seconds of video and already hit the like button

  •  8 months ago

    Thank you very much. Nice and interesting video. Greetings from a bioeng student from universidad del Cauca, Colombia.

  • @matthewrberning  8 months ago

    keep it up, this was an excellent review

  • @ultrasound1459  8 months ago +1

    Any code on the MNIST dataset?

    • @deepfriedpancake  8 months ago +2

      That is one criticism of KANs I saw on Reddit: the team publishing it hasn't even tested it on MNIST, which has 784 input dimensions.
      Admittedly, they have focused heavily on solving function-finding problems and PDEs.
      I could be one of the first to try validating KANs on vision/image problems tho, as a follow-up vid to this one!

    • @manavsingh6145  8 months ago

      @@deepfriedpancake do it

    • @DB-nl9xw  8 months ago

      @@deepfriedpancake Let us know about any updates!
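      As a rough illustration of what a single KAN-style layer on MNIST-sized (784-dimensional) inputs could look like, here is a from-scratch toy sketch: my own construction using Gaussian radial basis functions, not the paper's B-spline implementation or the pykan library.

```python
import numpy as np

rng = np.random.default_rng(0)

class ToyKANLayer:
    """Toy KAN-style layer: every edge (input p -> output q) carries its
    own learnable 1-D function, parameterised here as a sum of Gaussian
    radial basis functions (the KAN paper uses B-splines instead)."""

    def __init__(self, n_in, n_out, n_basis=8):
        self.centers = np.linspace(-1.0, 1.0, n_basis)   # fixed basis grid
        self.width = 2.0 / n_basis
        # one coefficient per (output, input, basis function)
        self.coef = rng.normal(0.0, 0.1, (n_out, n_in, n_basis))

    def forward(self, x):                                # x: (batch, n_in)
        # evaluate every basis function at every input coordinate
        d = x[:, :, None] - self.centers                 # (batch, n_in, n_basis)
        basis = np.exp(-((d / self.width) ** 2))
        # phi_{q,p}(x_p) = sum_k coef[q,p,k] * basis_k(x_p), then sum over p
        return np.einsum("bik,oik->bo", basis, self.coef)

layer = ToyKANLayer(n_in=784, n_out=10)                  # MNIST-sized input
out = layer.forward(rng.uniform(-1.0, 1.0, size=(32, 784)))
print(out.shape)                                         # (32, 10)
```

      Stacking such layers and learning `coef` by gradient descent would give a (very naive) KAN; whether that actually beats an MLP on MNIST is exactly the open question in this thread.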

  • @Umesh-lz2ro  8 months ago

    How does one know whether the next paper has been released or not?

    • @deepfriedpancake  8 months ago +1

      I guess we will have to stay tuned to the citations section on arXiv!

  • @loweffortdev  8 months ago

    Very cool topic + you are good at explaining + subbed

  • @merbst  8 months ago +1

    There is a reason why everything in physics is an operator: functional analysis is universal!

    • @merbst  8 months ago

      but also, thank you! I didn't graduate because I was chasing the girls, but Kolmogorov will always be my love!

  • @BjornHeijligers  8 months ago +1

    Cool! Yes please make more. Using search keywords like #Some2, #Some3 or #4Percent might help you connect with your tribe even better!

    • @deepfriedpancake  8 months ago +1

      I'd love to join the SoME by 3b1b this summer, tho since I use Manim for less than 30% of the video, and only for writing math equations, I feel quite out of place 😅
      I can definitely learn it more, and also use Blender and 3D models as other commenters suggested.

    • @BjornHeijligers  8 months ago

      Don't worry. You are in the same league as them already. Take it from a stranger.

  • @__-de6he  8 months ago

    Good explanation. Thanks :)

  • @amortalbeing  8 months ago

    Thanks, that was neat

  • @Speak4Yourself2  8 months ago

    Thanks a lot!

  • @ДанїїлІльченко-я7э  8 months ago

    Do you have a discord channel?

  • @augustocsc  8 months ago

    You would love symbolic regression

  • @andrewpolar1685  8 months ago

    Anyone interested in a C# implementation? I have fully functioning KAN code: 500 lines, no 3rd-party libraries.

  • @sabriath  7 months ago

    cool, but still not the right way to bring the singularity.....ug, must i do everything? lol

  • @abdelfata7_m3nnoun  8 months ago

    thank you

  • @sergiomanuel2206  8 months ago

    In your expansion of the case n=3, the last function should be PHI6( ... ), not PHI7(). For 3 vars, you should have 6 functions, 1 to 6.

    • @deepfriedpancake  8 months ago +1

      Arrgh, I see that mistake.
      For the correct KAR theorem, p goes from 1 to n, but q goes from either 0 to 2n or 1 to 2n+1,
      so for n=3, p runs up to 3 and q up to 7.
      Thanks for spotting it!
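      Spelled out for n=3, the inner sum runs over p = 1..3 and the outer sum over q = 1..7 (that is, 2n+1 = 7 outer functions):

      ```latex
      f(x_1,x_2,x_3) \;=\; \sum_{q=1}^{7} \Phi_q\!\left( \sum_{p=1}^{3} \phi_{q,p}(x_p) \right)
      ```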

    • @sergiomanuel2206  8 months ago

      @deepfriedpancake You are right!!! Thanks for the answer!!

  • @simpleguy2557  8 months ago

    Bro, your explanation is so good. I didn't understand a few points, but don't worry, it's due to my low IQ; I'm rewatching it multiple times till I get it.

  • @shadeofsound23  8 months ago

    As a college dropout, fukken sweet.

  • @Electronics4Guitar  8 months ago

    Nice 👍🏻

  • @XAheli  8 months ago

    Damn