far1din
Cubic Formula. Wait what? #SoMEpi
In this video, we'll explore how to solve third-degree polynomial equations using Cardano's method. We'll start with depressed cubic equations and learn how to transform any cubic equation into its depressed form. Then we'll use the cubic formula to find a solution and, finally, translate that solution back to the original equation, allowing us to solve any cubic equation.
Manim code: github.com/far1din/manim?tab=readme-ov-file#the-cubic-formula-wait-what
► SUPPORT THE CHANNEL
➡ Paypal: www.paypal.com/donate/?hosted_button_id=K6ER3TW5J32LN
These videos can take several weeks to make. Any donations towards the channel will be highly appreciated! 😄
► SOCIALS
X: x.com/far1din_
Github: github.com/far1din
----------- Content -----------
0:00 - Introduction
00:25 - Visualizing Depressed Cubic Equations
01:37 - Completing the Cube: Prelude to the Cubic Formula
02:47 - Deriving the Cubic Formula (Algebra)
05:18 - Converting a General Cubic Equation into a Depressed Form
07:02 - Solving Cubic Equations: The Entire Process
07:27 - A Simple Example
► Contributions
Intro background music: pixabay.com/music/pulses-intro-180695/
Background music by BlackByBeats: pixabay.com/music/solo-piano-ambient-piano-114-109400/
Pyramid Illustration by Storyset: www.freepik.com/free-vector/pyramid-concept-illustration_168303227.htm
Greek mythology Illustration by Freepik: www.freepik.com/free-vector/hand-drawn-greek-mythology-illustration_25793984.htm
Fibonacci sequence and other manim examples: slama.dev/manim/camera-and-graphs/
#cubicequation #cubicformula #cardano #cardanosmethod #polynomialequations #math #maths #algebra #visualization #animation #proof #manim #SoME4 #SoMEPI #SoMEπ
Views: 849

Videos

The Quadratic Formula from Scratch.
Views: 695 · 11 months ago
In this video, we are going to look at how we can solve second-degree polynomial equations without the quadratic formula. We will instead use a technique called completing the square. Finally, we will proceed to prove the quadratic formula. ► SUPPORT THE CHANNEL ➡ Paypal: www.paypal.com/donate/?hosted_button_id=K6ER3TW5J32LN These videos can take several weeks to make. Any donations towards the...
Training a Convolutional Neural Network (CNN)
Views: 6K · 1 year ago
Visualizing a convolutional neural network through the training process. Witness the Evolution of a Cutting-Edge Model, From Untrained to Trained, with comparisons at zero, one, five, and 15 Epochs. ► SUPPORT THE CHANNEL ➡ Paypal: www.paypal.com/donate/?hosted_button_id=K6ER3TW5J32LN These videos can take several weeks to make. Any donations towards the channel will be highly appreciated! 😄 ► S...
Convolutional Neural Networks from Scratch | In Depth
Views: 71K · 1 year ago
Visualizing and understanding the mathematics behind convolutional neural networks, layer by layer. We are using a model pretrained on the mnist dataset. ► SUPPORT THE CHANNEL ➡ Paypal: www.paypal.com/donate/?hosted_button_id=K6ER3TW5J32LN These videos can take several weeks to make. Any donations towards the channel will be highly appreciated! 😄 ► SOCIALS X: x.com/far1din_ Github: github.com/f...
Backpropagation in Convolutional Neural Networks (CNNs)
Views: 43K · 1 year ago
In this video we are looking at the backpropagation in a convolutional neural network (CNN). We use a simple CNN with zero padding (padding = 0) and a stride of two (stride = 2). ► SUPPORT THE CHANNEL ➡ Paypal: www.paypal.com/donate/?hosted_button_id=K6ER3TW5J32LN These videos can take several weeks to make. Any donations towards the channel will be highly appreciated! 😄 ► SOCIALS X: x.com/far1...
Visualizing Convolutional Neural Networks | Layer by Layer
Views: 81K · 2 years ago
Visualizing convolutional neural networks layer by layer. We are using a model pretrained on the mnist dataset. ► SUPPORT THE CHANNEL ➡ Paypal: www.paypal.com/donate/?hosted_button_id=K6ER3TW5J32LN These videos can take several weeks to make. Any donations towards the channel will be highly appreciated! 😄 ► SOCIALS X: x.com/far1din_ Github: github.com/far1din Manim code: github.com/far1din/mani...

COMMENTS

  • @hewramanwaran6444
    @hewramanwaran6444 4 days ago

    Great Explanation. Thank you very much.

  • @saultube44
    @saultube44 4 days ago

    Nope, there are 3 results: 2, 3.17, and 17.1 approx., calculated from the last formula. The ± on the √ gives 4 variations; the 1st and 4th are the same, the 2nd and 3rd give the other 2 results, and there are 2 cube roots, so that's 3 results each

    • @far1din
      @far1din 4 days ago

      Very much a valid concern! From 5:04, you will see that we first solve for "t". Then, since u = m/t, we substitute in that solution for "t". However, since "u" is a function of t, we must pick the same t; hence, two variations. In other words, x = t - u/3. Since u = m/t, we substitute u into the formula for x and get x = t - m/(3t). In practice this means that in the written-out formula for x, which you see at 5:14, the cube-root terms have to be the same. This is why 5.46 and -1.46 (presumably the other two variations) are not valid solutions at 7:47. I don't fully get how you got 3.17 and 17.1 (?) That said, I fully understand the confusion that can occur when simply looking at the formula. To avoid it, you could write x = t - m/(3t) and have the definition of "t" written right beside it. Let me know if anything! 😃

    • @saultube44
      @saultube44 4 days ago

      @@far1din Yes, I'm confused; hubris, the pretext humans make "when they have figured out the Universe", ha! 😊 People in Math/Science like to discard results they don't want, as simple as that, even when clearly there should be more results; but people don't search for the truth, they search for some convenient truth
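The pairing constraint described in the exchange above can be checked numerically: once u = m/t is substituted, both choices of ± under the square root yield the same root. A minimal sketch, assuming the classic depressed cubic x^3 + 6x = 20 (root x = 2) as an example, which may not be the one used in the video:

```python
# Checking that both ± branches of Cardano's formula give the same root
# once u = m/t is substituted consistently (x = t - m/(3t)).
# The cubic x^3 + 6x = 20 is an assumed example with real root x = 2.

m, n = 6.0, 20.0  # depressed cubic: x^3 + m*x = n

def cube_root(v):
    """Real cube root, valid for negative v as well."""
    return v ** (1 / 3) if v >= 0 else -((-v) ** (1 / 3))

disc = (n / 2) ** 2 + (m / 3) ** 3          # term under the square root
for sign in (+1, -1):
    t = cube_root(n / 2 + sign * disc ** 0.5)  # one choice of the ± branch
    x = t - m / (3 * t)                        # u = m/t forces this pairing
    print(sign, round(x, 10))                  # both branches give 2.0
```

Mixing the two branches (one t from +, the other cube root from -) is what produces the spurious "extra" values the formula seems to offer.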

  • @manfredbogner9799
    @manfredbogner9799 6 days ago

    Very good [German: "Sehr gut"]

  • @bug8628
    @bug8628 7 days ago

    Amazing video!! :D

    • @far1din
      @far1din 5 days ago

      Thanks! 😄

  • @manfredbogner9799
    @manfredbogner9799 7 days ago

    Very good [German: "Sehr gut"] 😊😊

  • @franciscobrizuela766
    @franciscobrizuela766 8 days ago

    Thank you! Now I'm one step closer to finishing a model for hw :)

    • @far1din
      @far1din 5 days ago

      You can do it!

  • @PiyushBasera-dy9cz
    @PiyushBasera-dy9cz 9 days ago

    ❤❤

  • @AlbertoOrtiz-we2jc
    @AlbertoOrtiz-we2jc 13 days ago

    Excellent explanation, thanks

    • @far1din
      @far1din 5 days ago

      Glad it was helpful!

  • @chatgpt-nv5ck
    @chatgpt-nv5ck 15 days ago

    Beautiful🙌

    • @far1din
      @far1din 5 days ago

      Thank you 🙌

  • @DarrabEducation
    @DarrabEducation 23 days ago

    More amazing content like this will be appreciated.

  • @JamieTx23
    @JamieTx23 26 days ago

    Excellent video! Thanks for taking the time and breaking it down so clearly.

    • @far1din
      @far1din 5 days ago

      Very welcome!

  • @tejan8427
    @tejan8427 26 days ago

    How do we know how many layers or filters we need at each layer? I mean, how can we construct our architecture?

  • @ForbiddenPrime
    @ForbiddenPrime 26 days ago

    Thank you for the source code. This will help me create some content for my syllabus. With love <3

    • @far1din
      @far1din 5 days ago

      Glad it was helpful although not the most ideal code 😂

  • @vishvadoshi976
    @vishvadoshi976 28 days ago

    “Beautiful, isn’t it?”

  • @Param3021
    @Param3021 1 month ago

    Amazing CNN series, super intuitive and easy to understand!❤

  • @SterileNeutrino
    @SterileNeutrino 1 month ago

    Nice. I remember working on digit recognition using handcoded analysis of pixel runs a long time ago. It never worked properly 😂 And it was computationally intensive.

  • @mahmoudhassayoun9475
    @mahmoudhassayoun9475 1 month ago

    Good job, the explanation is super. I hope you do not stop making videos of this calibre. Did you use manim to make this video, or another video editor?

  • @dhudach
    @dhudach 1 month ago

    I'm new to machine learning and neural networks. Your video is very helpful. I have built a small python script just using numpy and I can train numerous samples. So this is a big picture question. Let's say I've trained my program on thousands of inputs and I'm satisfied. Now I want to see if it can recognize a new input, one not used in training. What weight and bias values do I use? After I'm finished with training, how do I modify the script to 'guess?' It would seem to me that back propagation isn't used because I don't actually have a 'desired' value so I'm not going to calculate loss. What weight and bias values do I use from the training sessions? There are dozens of videos and tutorials on training but I think the missing piece is what to do with the training program to make it become the 'trained' program, the one that guesses new inputs without back propagation.
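The question above comes up often: after training, you keep the final weights and biases and run only the forward pass; there is no loss and no backpropagation at inference time. A minimal numpy sketch of that "guess" step, where all names, shapes, and the random stand-in weights are illustrative assumptions, not the commenter's actual script:

```python
import numpy as np

# Inference with a trained two-layer numpy network: load the final
# weights/biases from training and run the forward pass only.
# No loss, no backpropagation -- just compute the output and pick a class.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(x, W1, b1, W2, b2):
    """Forward pass only: returns the index of the most likely class."""
    h = sigmoid(W1 @ x + b1)       # hidden activations
    out = sigmoid(W2 @ h + b2)     # output activations
    return int(np.argmax(out))     # the "guess" is the largest output

# Stand-ins for the arrays your training loop produced; in practice you
# would save them with np.savez(...) and reload them here.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(16, 784)), np.zeros(16)
W2, b2 = rng.normal(size=(10, 16)), np.zeros(10)

x_new = rng.random(784)            # an input not seen during training
print(predict(x_new, W1, b1, W2, b2))  # a class index 0-9
```

The trained script and the guessing script share the same forward-pass code; the only difference is that the guessing script never updates the weights.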

  • @suthanraja1657
    @suthanraja1657 1 month ago

    Thank you so much!

  • @rubytejackson
    @rubytejackson 1 month ago

    This is an exceptional explanation, and I can't thank u enough... u have to keep going, u enlighten many students on the planet! That's the best thing a human can do!

    • @far1din
      @far1din 1 month ago

      Thank you brother, very much appreciate it! 🔥

  • @Brandonator24
    @Brandonator24 1 month ago

    I'm curious, why is the first convolution using ReLU and then later convolutions using sigmoid? Edit: Also, when convolving over the previous convolution-max pooling output, we have two 2 images, how are the convolutions from these two separate images combined? Is it just adding them together?

    • @far1din
      @far1din 1 month ago

      Hey Brandon! 1. The ReLU and Sigmoid are just serving as examples to showcase different activation functions. This video is just a «quick» visualization of the longer in-depth version. 2. Not sure if I understood, but if you're referring to the filters, they are added. I go through the math behind this with visualizations in the in-depth video. I believe it should clarify your doubts! 😄

    • @Brandonator24
      @Brandonator24 1 month ago

      @@far1din Will be checking that out, thanks!
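The reply above says the per-channel results are added, not stacked. A minimal numpy sketch of that combination, with illustrative shapes (a 2-channel 6×6 input and one 3×3-per-channel filter are assumptions, not the video's exact setup):

```python
import numpy as np

# When a conv layer sees a 2-channel input ("two images"), each channel
# is convolved with its own slice of the filter and the results are
# summed into a single feature map.

def conv2d_single(img, k):
    """Valid 2-D cross-correlation of one channel with one kernel slice."""
    H, W = img.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i+kh, j:j+kw] * k)
    return out

rng = np.random.default_rng(1)
x = rng.random((2, 6, 6))   # 2 input channels
k = rng.random((2, 3, 3))   # one filter: a kernel slice per channel

# Convolve each channel with its slice, then add -- one combined map.
fmap = sum(conv2d_single(x[c], k[c]) for c in range(2))
print(fmap.shape)           # (4, 4): a single feature map, not two
```

Each additional filter in the layer repeats this and contributes one more output channel, which is why the channel count doesn't multiply from layer to layer.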

  • @jaberfarahani6645
    @jaberfarahani6645 1 month ago

    the best channel❤

  • @shirmithNirmal-
    @shirmithNirmal- 1 month ago

    That was an awesome explanation

  • @SamuelMoyoNdovie
    @SamuelMoyoNdovie 1 month ago

    What an explanation man 🫡

  • @chinmaythummalapalli8655
    @chinmaythummalapalli8655 1 month ago

    I racked my brain for hours and couldn't figure out why the feature maps aren't multiplying after each layer. This video just helped me realize they become channels of images. It helped me relax, and I think I can go downstairs for dinner now.

    • @far1din
      @far1din 1 month ago

      Glad it helped! 😄

  • @rubytejackson
    @rubytejackson 1 month ago

    Exceptional explanation u did! I have several questions, but first I'd like to ask: is it ok to support u from the thanks button, since I don't have a PayPal account? Thanks, warmest regards, Ruby

    • @far1din
      @far1din 1 month ago

      Ofc my friend! Feel free to shoot me a DM on X if you have any questions as well 💯

  • @RUDRARAKESHKUMARGOHIL
    @RUDRARAKESHKUMARGOHIL 1 month ago

    Sorry if this sounds silly, but what actually is an inflection point? Is it where f'' = 0, or is there another geometric intuition?

    • @far1din
      @far1din 1 month ago

      You’re correct. It is the point where the double derivative is equal to 0. There are many geometrical intuitions. For cubic equations, the inflection point serves as the point of rotational symmetry. This means that you can rotate a cubic function 180 degrees around the inflection point and still have the same plot. I actually cut this part out as it felt like a digression and I didn't want to prolong the video any more than necessary. Maybe I should have kept it in 😭😂

    • @RUDRARAKESHKUMARGOHIL
      @RUDRARAKESHKUMARGOHIL 1 month ago

      If you already made it, the better idea would have been to keep it... but btw ty ❤ @@far1din

  • @RUDRARAKESHKUMARGOHIL
    @RUDRARAKESHKUMARGOHIL 1 month ago

    At 2:03 I have a doubt: you took the m×x cube, divided it into 3 parts and then placed them on the other cube, but you only covered 3 sides and not all 6... so the volume will be 1/2 of t^3, no?

    • @RUDRARAKESHKUMARGOHIL
      @RUDRARAKESHKUMARGOHIL 1 month ago

      Sorry, now I got it 😅 It was still a somewhat subtle confusion..

    • @far1din
      @far1din 1 month ago

      Haha nice! You’ll see that it is a «cube» with side lengths = t once it starts spinning. 😄

  • @Minayazdany
    @Minayazdany 1 month ago

    great videos

  • @RSLT
    @RSLT 1 month ago

    ❤❤❤ Liked and subscribed.

  • @far1din
    @far1din 2 months ago

    Watch the full video: ua-cam.com/video/JboZfxUjLSk/v-deo.html

  • @averagemilffan
    @averagemilffan 2 months ago

    Great video!! Just one question, why does the inflection point of a depressed cubic fall on x=0?

    • @far1din
      @far1din 2 months ago

      It is explained at around 06:09 in the video, but maybe not well enough, so I’ll try again haha. The «x» value of the inflection point is found by setting f’’(x) = 0. For a general cubic function f(x) = ax^3 + bx^2 + cx + d, the double derivative is f’’(x) = 6ax + 2b. If we want the inflection point, we set f’’(x) = 0, which gives 0 = 6ax + 2b and, in turn, x = -b/(3a). This means we can find the inflection point of any cubic function at -b/(3a). Now, if we want the inflection point at x = 0, we get 0 = -b/(3a), which equates to b = 0. If we go back to our initial equation f(x) and set b = 0, we get f(x) = ax^3 + 0*x^2 + cx + d. This eliminates the x^2 term, as we multiply by zero, and leaves us with f(x) = ax^3 + cx + d, which essentially is a depressed cubic function. I hope this cleared any doubt. Please let me know if there is anything else! 😄

    • @averagemilffan
      @averagemilffan 2 months ago

      @@far1din Ah I see, thank you, that explains it pretty well. Cheers on your future videos!
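The derivation in the thread above (shifting by the inflection point x = -b/(3a) kills the x^2 term) can be sketched in a few lines. The coefficients below come from expanding f(y - b/(3a))/a; the example cubic x^3 - 6x^2 + 11x - 6 is an arbitrary illustration, not the video's:

```python
# Depressing a general cubic a*x^3 + b*x^2 + c*x + d by substituting
# x = y - b/(3a): the x^2 term vanishes, leaving y^3 + p*y + q.

def depress(a, b, c, d):
    """Return (p, q) such that f(y - b/(3a)) / a == y^3 + p*y + q."""
    p = c / a - b * b / (3 * a * a)
    q = 2 * b**3 / (27 * a**3) - b * c / (3 * a * a) + d / a
    return p, q

a, b, c, d = 1.0, -6.0, 11.0, -6.0    # f(x) = x^3 - 6x^2 + 11x - 6
p, q = depress(a, b, c, d)

# Sanity check: both forms agree at a few shifted points.
f = lambda x: a * x**3 + b * x**2 + c * x + d
g = lambda y: y**3 + p * y + q
shift = -b / (3 * a)                  # the inflection point (x = 2 here)
for y in (-1.0, 0.0, 0.5, 3.0):
    assert abs(f(y + shift) - a * g(y)) < 1e-9
print(p, q)                           # -1.0 0.0 for this example
```

Solving the depressed form y^3 + p*y + q = 0 and then shifting back by x = y - b/(3a) recovers the roots of the original cubic, which is exactly the process the video walks through.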

  • @SelfBuiltWealth
    @SelfBuiltWealth 2 months ago

    beautiful explanation❤

  • @SelfBuiltWealth
    @SelfBuiltWealth 2 months ago

    This is a very unique and underrated explanation! Beautiful work, thank you so much ❤

  • @rainbow-cl4rk
    @rainbow-cl4rk 2 months ago

    Nice! Would you do the same for degree 4?

    • @far1din
      @far1din 2 months ago

      Great suggestion! I’ll give it a try if I can find a compelling way to visualize both the problem and the solution. However, the extra dimension might make it challenging :/

    • @rainbow-cl4rk
      @rainbow-cl4rk 2 months ago

      @@far1din you can add colour to visualise it, for example, or draw the projection in R³; there are many ways to represent a tesseract

  • @윤기좔좔엉덩이
    @윤기좔좔엉덩이 2 months ago

    What are the criteria for setting filters?

  • @MarshallBrunerRF
    @MarshallBrunerRF 2 months ago

    I think this is a great explanation! My only thing would be that some of the equation transformations are hard to follow, since they're so rapid fire. Keep up the good work!

    • @far1din
      @far1din 2 months ago

      Thank you for the feedback. Rewatching the video now, I understand that the transformations might have been a bit too rapid. Will take that into consideration for the next videos! 😄

  • @MarshallBrunerRF
    @MarshallBrunerRF 2 months ago

    That intro was so well done! Still watching but just wanted to say that before I forget

  • @emmanuelsheshi961
    @emmanuelsheshi961 2 months ago

    nice work sir

  • @srinathchembolu7691
    @srinathchembolu7691 2 months ago

    This is gold. Watching this after reading Michael Nielsen makes the concept crystal clear

  • @r0cketRacoon
    @r0cketRacoon 3 months ago

    tks u very much for this video, but it would probably be more helpful if you also added a max pooling layer.

  • @r0cketRacoon
    @r0cketRacoon 3 months ago

    what happens if I specify conv layer 2 to have only 2 dimensions? Will the same kernel be applied to both images, then added?

  • @r0cketRacoon
    @r0cketRacoon 3 months ago

    WWOWW! I need a video visualizing CNN from layer to layer like that, and I encountered ur channel. The best CNN visualization for me now. Tks u!

  • @boramin3077
    @boramin3077 3 months ago

    Keep it up, and I really encourage you to make more content! The data science community needs more such high-quality content!

  • @boramin3077
    @boramin3077 3 months ago

    Best video to understand what is going on under the hood of a CNN.

  • @boramin3077
    @boramin3077 3 months ago

    Great explanation!

  • @boramin3077
    @boramin3077 3 months ago

    Amazing work! Please continue to make more videos on other ML topics. I find your videos are really helpful to understand the concepts.

  • @mateokladaric
    @mateokladaric 3 months ago

    Finally someone who doesn't just say "it convoluted the image and poof, one magic layer later, it works"

  • @kyugelblitz
    @kyugelblitz 3 months ago

    Can't express my gratitude enough, but here I am. Everything is shown in great detail, explained accurately and understandably. Keep up the good work.

  • @AsilKhalifa
    @AsilKhalifa 3 months ago

    Thanks a lot!