Create a Simple Neural Network in Python from Scratch

  • Published 19 Dec 2024

COMMENTS • 713

  • @JonasBostoen
    @JonasBostoen  6 years ago +173

    In the next video we’re going to be making a blockchain in JavaScript, so subscribe if you’re interested in that stuff!

    • @SoumilShah
      @SoumilShah 6 years ago

      great video, you made everything so easy

    • @CrypticConsole
      @CrypticConsole 5 years ago +1

      Damn, stupid schools blocked pip and zip archives so I can't install numpy

    • @itstatanka
      @itstatanka 5 years ago

      Which compiler did you use?

    • @siddhant5697
      @siddhant5697 5 years ago

      Which software are you coding in??

    • @frankynakamoto2308
      @frankynakamoto2308 4 years ago

      Polycode
      Can the neurons and inputs be placed together, like neurons with a lot of built-in data??
      Also, I need a very powerful neural network for several different purposes (speech, face ID and solving math problems); do you have something open source that you made that you can share with me??

  • @johnc3403
    @johnc3403 5 years ago +188

    "stay with me, it's gonna be ok"... dude, that's such a lovely sentiment. You were born to teach, I think, with that ability to keep pupils on board. Very good video, my man, thank you so much.

  • @mohamedsuhailirfankhazi6628
    @mohamedsuhailirfankhazi6628 4 years ago +49

    My friend, your explanation in 15 minutes gave more clarity to me than hours of crash course tutorials online. So simple and well explained. Awesome stuff my man!

  • @morphman86
    @morphman86 5 years ago +177

    After watching hyper-advanced tensorflow/keras stock market prediction tutorials for a while, being completely lost, I stumbled on this.
    I finally, after weeks of trying to learn NN and decades of practical programming experience, understand it.
    The iterative backpropagation was what confused me in all of those other videos, but taken down to its most simple form, like in this video, I can now see that it's merely looking at what it got versus what it was trying to get, making adjustments to the appropriate synapses based on that, then trying again.
    It's not the maths that confused me, it's how the machine actually learned. And that was perfectly demonstrated in this video. Thank you!

    • @RandageJr
      @RandageJr 5 years ago

      Do you know where I can find these tutorials? It would be very helpful for me, thanks!

    • @jacobokomo1880
      @jacobokomo1880 4 years ago

      Kindly feel free to share with us who the teacher was who took you through the previous tutorials. However, this teacher is doing well. Credits 💪

    • @Govind_Sisma
      @Govind_Sisma 4 years ago

      B

    • @morphman86
      @morphman86 4 years ago +1

      @Isaiah _ Neural Network

    • @KennTollens
      @KennTollens 4 years ago

      I agree too. So many videos complicate and dance around simple mechanics. Knowing the flow of the engine and the simple concept of what is happening, the other videos might make more sense now that I can put it into context.

  • @hfe1833
    @hfe1833 5 years ago +731

    What the?... this is it, finally I found a good tutorial

    • @Pancake3000
      @Pancake3000 4 years ago +9

      same lol, I finally can actually flippin understand, thanks much
      +1 sub
      i can english.

    • @scottpatterson9136
      @scottpatterson9136 4 years ago +1

      I agree

    • @koksem
      @koksem 4 years ago +2

      ye someone finally explains what it is XD

    • @mariomuysensual
      @mariomuysensual 3 years ago +2

      same!

  • @djjjo6130
    @djjjo6130 4 years ago +50

    “Stay with me, it’s gonna be okay” that makes me feel like I’m actually learning something and not just being told something

    • @MC_MrOreo
      @MC_MrOreo 3 years ago +1

      (I know I’m late but) Literally came to the comment section about this 😂

  • @mattisaderp8929
    @mattisaderp8929 5 years ago +190

    "stay with me it's gonna be okay"

  • @arifmeighan3162
    @arifmeighan3162 3 years ago +23

    This tutorial is a perfect blend of talking/programming and slides. It's also quick and to the point 8)

  • @hdluktv3593
    @hdluktv3593 4 years ago +3

    I watched a lot of videos about Machine Learning because I wanted to understand how it works. None of those videos explained as well as yours how a neuron and the adjustment actually work. Good work, now I've finally understood it.

  • @paulschmidt8742
    @paulschmidt8742 5 years ago +42

    Bro, it was much easier than I thought. Thx for explaining.

  • @Awesomer5696
    @Awesomer5696 5 years ago +3

    What a fantastic way of explaining it. Whilst this is obviously not immediately useful, it's a sort of toy approach that gives you a building block to understand the greater scope.

  • @nocopyrightgameplaystockvi231
    @nocopyrightgameplaystockvi231 3 years ago +7

    Line no 16: synaptic_weights = 2 * np.random.random((3,1)) - 1
    This line makes a 3x1 array (a matrix of size 3x1). I did not understand it before I tried the line separately.
    That makes the random initialization easy to grasp, but as I learned in Soft Computing in my B.Tech, you can also directly initialize the weights as 1, and they will then get adjusted during training.
    You can replace the line with: synaptic_weights = np.array([[1,1,1]]).T (see the sketch after this thread)
    THANKS TO YOU for making this short and easy tutorial!

    • @Retriiiii
      @Retriiiii 10 months ago

      Hey, can you tell me why we are multiplying by 2 and subtracting 1?

    • @nocopyrightgameplaystockvi231
      @nocopyrightgameplaystockvi231 10 months ago

      @@Retriiiii where??

    • @Retriiiii
      @Retriiiii 10 months ago

      @@nocopyrightgameplaystockvi231
      The "2 *" and the "- 1" in: 2 * np.random.random((3,1)) - 1
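
A quick aside on the thread above: a minimal sketch (assuming only NumPy; not the video's exact code) contrasting the video's random initialization in [-1, 1) with the all-ones initialization suggested in the comment:

    import numpy as np

    np.random.seed(1)  # fixed seed, so everyone gets the same "random" weights

    # np.random.random((3, 1)) is uniform on [0, 1); multiplying by 2 and
    # subtracting 1 maps it onto [-1, 1), giving a 3x1 column of weights.
    random_weights = 2 * np.random.random((3, 1)) - 1

    # The commenter's alternative: start every weight at 1. Training will
    # still adjust these; only the starting point differs.
    ones_weights = np.array([[1, 1, 1]]).T

    print(random_weights)  # three values somewhere in [-1, 1)
    print(ones_weights)    # [[1], [1], [1]]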

  • @shimuk8
    @shimuk8 6 years ago +9

    I joined my university 2 months late and absolutely had no idea how to catch up on the neural network project topic I had missed, and then I saw your video!!! Thanks a lot dude!!! For saving my semester HAHAHA

    • @JonasBostoen
      @JonasBostoen  6 years ago +1

      meaaaww hahaha nice, share it with any of your buddies if you think they need it ;-)

    • @shimuk8
      @shimuk8 6 years ago +1

      @@JonasBostoen Oh yes, already did that... right now you have the blessings of many helpless students LOL

  • @brehontechologies
    @brehontechologies 5 years ago +12

    Finally, a clear, straightforward tutorial to code along. GREAT JOB!

  • @robertdraxel7175
    @robertdraxel7175 5 years ago +17

    Most useful video on the internet for a total beginner, for anyone new to AI. Thanks.

  • @EricCanton
    @EricCanton 5 years ago +1

    Just a note on sigmoid_derivative, for myself as much as anyone else. Since you're feeding the output of sigmoid into sigmoid_derivative, he's using the fact that sigmoid satisfies the differential equation
    y'(x) = y * (1 - y)
    so we can compute the derivative sigmoid'(x) by plugging sigmoid(x) into [y --> y(1-y)]. That's very clever! (See the sketch after this thread.)

    • @victoryfirst06
      @victoryfirst06 1 year ago

      But you should run the outputs through the sigmoid derivative, right? And the outputs are already sigmoided by default, so shouldn't you be using the sigmoid twice?
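
To make the trick in this thread concrete (and to answer the "sigmoid twice" question: the video's sigmoid_derivative expects a value that has already been through sigmoid), a minimal numeric check, assuming only NumPy:

    import numpy as np

    def sigmoid(x):
        return 1 / (1 + np.exp(-x))

    def analytic_derivative(x):
        # sigmoid'(x) written out directly
        return np.exp(-x) / (1 + np.exp(-x)) ** 2

    x = 0.7
    y = sigmoid(x)       # the network's output, already sigmoided
    trick = y * (1 - y)  # the video's sigmoid_derivative applied to the OUTPUT

    print(trick)                   # ~0.2217
    print(analytic_derivative(x))  # ~0.2217, the same value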

  • @calmo15
    @calmo15 5 years ago +116

    Amazing video, too few sources do the absolute basics. However, can you please crank your volume up!

  • @joesminis
    @joesminis 5 years ago +3

    I'm at the 10 minute mark and I just wanted to say that your explanations are clicking left and right with me, thank you!!!!

  • @abdechafineji8782
    @abdechafineji8782 5 years ago +3

    The best explanation you will find of creating a neural network from scratch.

  • @stevesajeev6477
    @stevesajeev6477 3 years ago

    Wow... the perfect tutorial.. I have been searching the internet for a tutorial on how to make neural networks from scratch.
    Now I've got it.. this is so cool...
    Very detailed explanation...

  • @notyourtypicalanime7475
    @notyourtypicalanime7475 3 years ago +1

    This is what I'm looking for, on how to train your datasets by adjusting weights. Thank you so much!

  • @ankitds1369
    @ankitds1369 5 years ago +8

    In the output after training: you can use this to round off the decimals - print(np.round(outputs, 1))

  • @JonasBostoen
    @JonasBostoen  6 years ago +119

    Coding starts at 2:30

    • @ChillGuyYoutube
      @ChillGuyYoutube 4 years ago +1

      Polycode ping your comment so others will see it!

    • @du42bz
      @du42bz 4 years ago +3

      @@ChillGuyYoutube maybe his firewall blocks icmp packets

    • @rr.studios
      @rr.studios 4 years ago +1

      @@du42bz I read that as "pimp packets"

  • @povmaster235
    @povmaster235 3 years ago +2

    At last... the video that doesn't just explain stuff, but actually tells you what to do too!

  • @Oleg-kk6xv
    @Oleg-kk6xv 5 years ago +2

    Thank you very much. I constantly see these videos about the theory of Machine Learning and AI, but I had never found an in-depth start-from-scratch tutorial with no libraries that explains everything along the way. Thank you!

  • @karim741
    @karim741 5 years ago +4

    Thanks for the video.
    I tried to follow this, but I see that the solution can go another way in binary logic:
    the first column multiplied by the (Boolean) sum of the two other columns.
    It is not only the first column that decides the output; the others do too, as below (see the sketch after this comment).
    If we take the table at 0:20:
    Example 1: 0x(0+1)=0
    Example 2: 1x(1+1)=1
    Example 3: 1x(0+1)=1
    Example 4: 0x(1+1)=0
    New situation: 1x(0+0)=0
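
Both this rule and the video's rule (output = first input) reproduce the four training rows but disagree on the unseen input, which is exactly why a tiny training set underdetermines the pattern. A minimal check in plain Python (Boolean + read as OR; not code from the video):

    # The four training rows from the video: (a, b, c) -> output
    rows = [((0, 0, 1), 0), ((1, 1, 1), 1), ((1, 0, 1), 1), ((0, 1, 1), 0)]

    video_rule = lambda a, b, c: a            # output = first input
    karim_rule = lambda a, b, c: a & (b | c)  # first column AND (second OR third)

    # Both rules fit every training example...
    for (a, b, c), out in rows:
        assert video_rule(a, b, c) == out
        assert karim_rule(a, b, c) == out

    # ...but they disagree on the new situation [1, 0, 0].
    print(video_rule(1, 0, 0))  # 1 (what the trained network predicts)
    print(karim_rule(1, 0, 0))  # 0 (what this comment's rule predicts)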

  • @rahulaga
    @rahulaga 1 year ago

    This is by far the best explanation. I guess by keeping the complexity level of the chosen example pretty low, you landed the message perfectly, thanks!!

  • @novi0
    @novi0 1 year ago

    2 minutes in and I already have a better understanding than 2 semesters worth of lectures

  • @MsRAJDIP
    @MsRAJDIP 5 years ago +3

    So far the simplest and most practical tutorial I've found. You cleared all my doubts, and a little background in Python helped me a lot.

  • @k.chriscaldwell4141
    @k.chriscaldwell4141 5 years ago +4

    Superb! Using the seeded weights so that you and the viewer get the same results was a brilliant touch. Helps the viewer know if he miscoded or not. Thanks.

  • @botancitil92
    @botancitil92 2 years ago

    I have been looking for a toy example of Neural Networks, thanks to your video I get to see one. Your video is very concise. Thank you. Also, thank you for sharing your Python code.

  • @traeht
    @traeht 2 years ago

    Thank you for the very useful insight into what is behind the neural network. At 10:00: (the derivative of the sigmoid function) = (sigmoid function) * (1 - sigmoid function), and not x(1-x)

  • @sreedeepsreedeep2260
    @sreedeepsreedeep2260 5 years ago +2

    Best tutorial on neural networks I have seen till now... thanks buddy 😘

  • @timothec.8216
    @timothec.8216 5 years ago +2

    Thanks a lot. This is much more comprehensible than everything else I have watched and read.

  • @akmaleache4735
    @akmaleache4735 6 years ago +2

    I watched a lot of ANN videos on YouTube, and all of them were missing something that I wasn't getting.
    But thanks to you I got what I needed, especially the explanation of how it works. Thank you again.

  • @samayvarjangbhay8987
    @samayvarjangbhay8987 5 years ago +6

    finally a properly structured tutorial

    • @0siiris
      @0siiris 5 years ago

      Nice profile pic 😂

  • @aizej9896
    @aizej9896 4 years ago +1

    thx for the tutorial, gave the neural network my own training data and it worked great!

  • @trianglesupreme
    @trianglesupreme 5 years ago

    At 0:40,
    the output depends on both the first and last inputs, not only on the first. If I label the inputs a, b, c from left to right, then according to the 4-row truth function, the output is
    O = abc + ab'c
      = ac(b + b')
      = ac.
    So the NN output for the input 100 should be 0.

  • @jeffwads
    @jeffwads 4 years ago +5

    It helps to have someone who actually knows how to break a "problem" down to its bare essentials. Excellent work.

  • @coleboothman1158
    @coleboothman1158 5 years ago

    Hey dude just saw this video from your post on /r/programming - This video is awesome! You're great at explaining everything. Neural nets can sometimes be confusing but this makes a lot of sense to me. Thanks so much!!

  • @chessprogramming591
    @chessprogramming591 4 years ago +2

    Man, this was so to the point! Thanks for your efforts. Best NN basics tutorial I've found so far! Very very useful!

  • @REVscape95
    @REVscape95 6 years ago +20

    waiting for the next video, this type of explanation really helps

  • @blubaylon
    @blubaylon 2 months ago

    This is such a good tutorial!!! I finally understand how these things are actually coded!

  • @BeSharpInCSharp
    @BeSharpInCSharp 4 years ago

    Lots of people can code; only a few can teach.. well done

  • @mwont
    @mwont 5 years ago +11

    Just a note: sigmoid_derivative is based on the exact analytical formula for the sigmoid derivative.

    • @sonic597s
      @sonic597s 4 years ago

      thanks so much for this, I was really confused during that bit!

    • @pluronic123
      @pluronic123 4 years ago

      @@sonic597s don't get it. He still uses x(1-x), which has nothing to do with sigmoid; it is just an approximation to the shape of the curve (the signs are opposite)

    • @sonic597s
      @sonic597s 4 years ago +1

      @@pluronic123 a derivative finds the slope of a curve at some given point. The sigmoid derivative being the formula x(1-x) (where x is the sigmoid fn.) means that if you plug the sigmoid function evaluated at some value z in as x, you get the slope of the sigmoid fn at that value z

    • @pluronic123
      @pluronic123 4 years ago

      @@sonic597s thanks precious internet dude

  • @ogregolabo
    @ogregolabo 5 years ago +2

    Thanks for the great video!
    Possible code to find the output for [1,0,0]:
    p_in = np.array([1, 0, 0])
    p_out = sigmoid(np.dot(p_in, synaptic_weights))
    print("Predicted Output After Training:")
    print(np.round(p_out))
    =>
    Predicted Output After Training:
    [1.]

  • @computerguy7451
    @computerguy7451 3 months ago

    Before I slightly understood how neural networks work, now I understand how they work slightly better than before.

  • @progmaster15
    @progmaster15 5 years ago +1

    Dude this video was really helpful! Thank you for explaining the basics of neural networks! :D

  • @ciencialmente9969
    @ciencialmente9969 4 years ago +203

    1:39
    "so we need a little meth"

  • @volador2828
    @volador2828 4 years ago

    Nice work! Finally found someone that can teach in a way I can understand.
    I subscribed and look forward to watching all your videos!

  • @title601a
    @title601a 5 years ago

    NICE!!!!! Finally, I can understand what NN and backpropagation are. Simple and easy to understand. Thanks a lot to Polycode :)

  • @deanresin4804
    @deanresin4804 5 years ago +3

    This was such a great tutorial. Very clear, concise and well paced.

  • @aaronisinjapan
    @aaronisinjapan 5 years ago +1

    Wow, I’ve been looking for a tutorial just like this for a long time! Subscribed! Please keep making videos!!

  • @industrialdonut7681
    @industrialdonut7681 5 years ago +32

    15 minute video... takes me 2 hours to get through XD

  • @harlongbitimung4108
    @harlongbitimung4108 5 years ago

    This video has taught me more than anything about ANN.

  • @StreetArtist360
    @StreetArtist360 3 years ago

    Simple, Clear and straight to the point. Great Job!!!

  • @jefersonferri
    @jefersonferri 2 years ago

    You did a great job, you should make more videos. Maybe explaining how to make a more complex neural network.

  • @marcusaureliusregulus2833
    @marcusaureliusregulus2833 4 years ago +1

    Output = array[1[1]].value
    Lol just kidding. This was a great video and I understood a ton

  • @landaravi
    @landaravi 4 years ago

    This is the tutorial I was actually searching for to understand neural networks... Thanks a lot...

  • @chandlerlabs2478
    @chandlerlabs2478 3 years ago

    Completely new to this and you made it very easy to understand. Thank you and good job!

  • @critterpower
    @critterpower 5 years ago +2

    Great tutorial, better than the usual,"Just use this library...."

  • @drakemeyers8746
    @drakemeyers8746 5 years ago +1

    So I tweaked the training outputs to 1,1,1,0 with iterations in range(100,000), and the computer gave me a perfect answer of 1 for the third output. The other outputs were close to the true answers, but I didn't think the computer could give a 100% true answer. I guess I'm confused that it didn't take that many training loops to give that answer.
    Btw great video, finally got me to get the computer out and start!

  • @elephant1989811
    @elephant1989811 5 years ago +1

    What an excellent explanation of a complex subject! Please keep up the videos.

  • @SuryaPrakashVIT
    @SuryaPrakashVIT 4 years ago

    Wonderful video, this will definitely turn my project around. Thank you so much!!! :)

  • @KonradGebura
    @KonradGebura 4 years ago +1

    Thanks, this was so helpful. It really cleared up a lot of my questions about the topics other videos said "let's not talk about that yet"... Thanks again, these videos are super helpful, keep up the amazing work.

  • @fiveoneecho
    @fiveoneecho 5 years ago +2

    Great tutorial, but I might have used a different approximation for d-sigmoid. I'm not sure where you got x(1-x) from as an approximation: it does not share a derivative with d-sigmoid and the vertex is off in space. I'm not sure if it is a standard to use and I'm just misunderstanding (I'm watching this tutorial to learn, after all), but I did a quick Taylor polynomial approximation and got the function:
    d-sigmoid ~= (2 - x^2) / 8 ------- this won't work very well for things not centered at x = 0
    This is about the same in terms of typing effort and computer processing, but a little more accurate. It is also based around x = 0 so it won't be biased towards one outcome (unless you built a weight into your function, in which case it makes a lot of sense).
    You can continue on to the 4th derivative in the series and add a third term, which doesn't factor as nicely but is extremely accurate (+/- 0.001) on the domain -1

  • @thegoonist
    @thegoonist 5 years ago +10

    0:38 the rule could also be that the first and third inputs have to be 1, and not just the 1st input.

  • @SureshSingh-en5uj
    @SureshSingh-en5uj 4 years ago

    FINALLY!!.... I have been looking for such a tutorial, one which teaches from scratch... Very good of you to do so... Keep it up bro.. Make more videos like this... BTW I am new to your channel. Just subscribed

  • @madanvishal1
    @madanvishal1 5 years ago

    Excellent explanation, making things crisp and clear

  • @ShradhanandVerma
    @ShradhanandVerma 4 months ago

    THANKS FOR VERY SIMPLE WAY TO EXPLAIN... FINALLY UNDERSTOOD.

  • @KomputasiStatistik
    @KomputasiStatistik 4 years ago +1

    The best neural network hands-on

  • @seeking9145
    @seeking9145 4 years ago

    You are my hero! My prof is so bad, explaining exactly the same things over I guess 4 or 5 lessons of 3 hours each. And you just need some minutes... haha, I subscribed immediately. I need more of it!

  • @Rekefa
    @Rekefa 5 years ago +3

    What a great video! Keep up with the good work, thanks for sharing your knowledge

  • @warrenkuah4314
    @warrenkuah4314 3 years ago +3

    Incredible! I think this is the first video that has helped me understand the formulas behind a neural network! However, I was wondering how you would implement biases in the actual code and in the backpropagation steps and formula? (See the sketch after this comment.)
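
Nobody answered in the thread, so here is a minimal hedged sketch of one way a bias could be added to the video's single-neuron training loop. The data and weight initialization follow the video; the scalar bias and its update rule are illustrative assumptions, not the tutorial's code:

    import numpy as np

    def sigmoid(x):
        return 1 / (1 + np.exp(-x))

    training_inputs = np.array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
    training_outputs = np.array([[0, 1, 1, 0]]).T

    np.random.seed(1)
    synaptic_weights = 2 * np.random.random((3, 1)) - 1
    bias = 0.0  # illustrative: one scalar bias for the single neuron

    for _ in range(20000):
        # Forward pass now includes the bias term.
        outputs = sigmoid(np.dot(training_inputs, synaptic_weights) + bias)
        error = training_outputs - outputs
        adjustments = error * (outputs * (1 - outputs))  # same trick as the video
        synaptic_weights += np.dot(training_inputs.T, adjustments)
        # The bias "input" is effectively a constant 1, so its gradient is
        # just the adjustments summed over all training examples.
        bias += adjustments.sum()

    print(sigmoid(np.dot(np.array([1, 0, 0]), synaptic_weights) + bias))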

  • @CroissantAC01
    @CroissantAC01 5 years ago +2

    I changed the sigmoid derivative function to this and got better results in fewer tries, and this is the actual derivative of a sigmoid function:
    def sigmoid_derivative(x):
        return np.exp(-x) / (1 + np.exp(-x)) ** 2

    • @JonasBostoen
      @JonasBostoen  5 years ago +1

      This is indeed a better derivative, good job! For the purposes of simplicity though I have kept the less complicated function since it's almost the same shape. Yours is better though.

    • @CroissantAC01
      @CroissantAC01 5 years ago

      @@JonasBostoen thanks, but I didn't understand the use of 2 * np.random.random((3,1)) - 1 at the beginning of the class initialisation

  • @prasanjitrath281
    @prasanjitrath281 5 years ago +1

    Your video is a life saver, thanks! Hope you make more such videos!

  • @dridihamza7157
    @dridihamza7157 4 years ago

    this is one of the best yet simplest explanations. keep it up

  • @Democracy_Manifest
    @Democracy_Manifest 1 year ago

    This video deserves an award

  • @Pancake3000
    @Pancake3000 4 years ago

    This is the thing that finally helped me understand! Never stop making the great vids!

  • @blackdedo93
    @blackdedo93 5 years ago +1

    wow, couldn't be explained better, keep up the good job.
    there are not many sources for newbie machine learners, especially with no libraries!!

  • @gabrilrh
    @gabrilrh 5 years ago +1

    i need more, that's awesome

  • @diljithpk1615
    @diljithpk1615 2 years ago

    Nice presentation. Made it feel very simple

  • @Adam-ze3pr
    @Adam-ze3pr 3 years ago

    Hi, thank you, this is very easy to catch for a newbie like me. Simple and clear. Keep going 👍

  • @utkarshankit
    @utkarshankit 5 years ago +1

    First time I understood backpropagation, thanks to your video.

  • @xddddddize
    @xddddddize 4 years ago

    For this simple problem backpropagation is not needed. The gradient formula can be computed analytically and would reduce the training iterations a lot. (I achieved high confidence with only 500 iterations.)

  • @alidakhil3554
    @alidakhil3554 4 years ago

    That is the best empirical lesson on basic NN

  • @OMAAKAAKORJOHN
    @OMAAKAAKORJOHN 8 months ago

    You are so wonderful, I understand quite well thanks to your basic and easy-to-learn method, thanks

  • @MrFrostsonic
    @MrFrostsonic 5 years ago +15

    In line 16, why have you multiplied the random weights by 2 and then subtracted 1? Great video.. very helpful.. Thank you very much. (See the sketch after this thread.)

    • @JonasBostoen
      @JonasBostoen  5 years ago +29

      np.random.random returns floating point values between 0 and 1, but since we need values between -1 and 1, this is the way to do it.

    • @nurhaida1983
      @nurhaida1983 5 years ago +6

      @@JonasBostoen thank you for this clarification. I was lost at this line but luckily stumbled on this comment. Thank you very much! Cheers!

    • @BiCool03
      @BiCool03 4 years ago +1

      @@JonasBostoen I'm very late to the party, but since we need a random number between -1 and 1, wouldn't it be better to add two random numbers, then subtract 1, or does it matter?
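
An empirical answer to that last question, as a small sketch assuming only NumPy: both constructions land in [-1, 1], but 2*u - 1 stays uniform, while the sum of two uniforms minus 1 is triangular and piles probability near 0, so it does matter for the spread of the initial weights:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000

    scaled = 2 * rng.random(n) - 1               # the video's approach
    summed = rng.random(n) + rng.random(n) - 1   # the "add two, subtract 1" idea

    # Both cover the same range...
    print(scaled.min(), scaled.max())  # ~ -1.0 .. 1.0
    print(summed.min(), summed.max())  # ~ -1.0 .. 1.0

    # ...but the spreads differ: the uniform's std is 1/sqrt(3) ~ 0.577,
    # while the triangular sum's std is 1/sqrt(6) ~ 0.408.
    print(scaled.std())  # ~0.577
    print(summed.std())  # ~0.408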

  • @mattheoswho1010
    @mattheoswho1010 4 years ago

    The derivative of the sigmoid is: σ'(x) = σ(x)(1 - σ(x)). It took me a while to understand what you meant at 9:33, maybe you should consider adding a comment.

  • @sorooshnazem
    @sorooshnazem 6 years ago +10

    The derivative of the sigmoid function φ is φ·(1 - φ); x·(1 - x) is wrong

    • @taravanova
      @taravanova 5 years ago

      Lol, spent like 10 min trying to get his result and then eventually googled it to find out I had the correct result the whole time. At least the correct version was used in the code.

    • @VoidFame
      @VoidFame 4 years ago

      yet somehow it gives the incorrect result when using the correct derivative. Something else is missing here.

  • @Alislaboratory
    @Alislaboratory 4 years ago

    Thanks so much! After days of looking, found a great tutorial and can expand my knowledge!!!

  • @MikhailBortsov
    @MikhailBortsov 3 years ago

    Thanks for seeding the random weights so they come out the same for all of us.

  • @nukzzz5652
    @nukzzz5652 4 years ago +12

    There is something I'm not understanding: when it's time to change the weights, you're supposed to multiply the input with the adjustment and add it to the weights, right? Doesn't that mean that if the input is 0 then the weights won't change at all? I noticed this when I tried different inputs and outputs. Your example works fine, but when I tried {0,0,0},{0,1,0},{0,1,1},{0,0,1} as inputs and {0,0,0,0} for outputs it was a mess, and no matter how many tests I did it couldn't figure out the correct answer

    • @sonic597s
      @sonic597s 4 years ago +1

      it does, this is a mistake in the code and can be fixed if you add a learning rate variable to multiply by the adjustments, rather than using the training inputs (see the sketch after this thread)

    • @sonic597s
      @sonic597s 4 years ago

      @@havoc3135 instead of dot-producting the (transposed) training inputs with the adjustments, multiply the adjustments by some scalar, so you can scale your adjustments manually. hope this helps
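
A hedged sketch of the learning-rate idea from this thread (the learning_rate name and value are illustrative, not the video's code; the standard form keeps the input term, since it is the gradient, and only scales the update). Note that for the all-zero input row the deeper issue is the missing bias: with no bias, the neuron outputs sigmoid(0) = 0.5 for input (0,0,0) no matter what the weights are, so that row is dropped here:

    import numpy as np

    def sigmoid(x):
        return 1 / (1 + np.exp(-x))

    # The commenter's data, minus the hopeless (0,0,0) row.
    training_inputs = np.array([[0, 1, 0], [0, 1, 1], [0, 0, 1]])
    training_outputs = np.array([[0, 0, 0]]).T

    np.random.seed(1)
    synaptic_weights = 2 * np.random.random((3, 1)) - 1
    learning_rate = 0.1  # illustrative scalar; smaller means gentler updates

    for _ in range(20000):
        outputs = sigmoid(np.dot(training_inputs, synaptic_weights))
        error = training_outputs - outputs
        adjustments = error * (outputs * (1 - outputs))
        # Keep the input term (it is the gradient), but scale the step size.
        synaptic_weights += learning_rate * np.dot(training_inputs.T, adjustments)

    print(sigmoid(np.dot(training_inputs, synaptic_weights)))  # all drift toward 0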

  • @siddharthsinghchauhan8664
    @siddharthsinghchauhan8664 5 years ago +1

    the derivative of φ(x) is φ(x)·(1 - φ(x)) and not x(1 - x), at 10:00

  • @twittuks
    @twittuks 5 years ago +1

    Amazing tutorial, keep up the good work

  • @travisjol
    @travisjol 1 year ago

    Finally a video I can understand! Thank you

  • @MCLooyverse
    @MCLooyverse 5 years ago +8

    If Φ(x) = 1 / (1 + e^(-x)), then Φ'(x) = e^(-x) / (1 + e^(-x))^2, not x(1 - x).
    I'm curious about your Atom setup. Are the text overview on the side and the code suggestions hidden in Atom somewhere, or are they plugins?
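
For what it's worth, the exponential form in this comment and the Φ(x)(1 - Φ(x)) form quoted in the other derivative comments are the same function; a quick symbolic check, assuming SymPy is available:

    import sympy as sp

    x = sp.symbols('x')
    phi = 1 / (1 + sp.exp(-x))

    derivative = sp.diff(phi, x)    # e^(-x) / (1 + e^(-x))^2
    product_form = phi * (1 - phi)  # Φ(x) * (1 - Φ(x))

    # simplify(a - b) == 0 means the two expressions are identical.
    print(sp.simplify(derivative - product_form))  # 0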

  • @SoulEaterZika
    @SoulEaterZika 4 years ago

    Thank you so much. This tutorial is direct, clear and instructive. One more subscriber.

  • @reddinghiphop1
    @reddinghiphop1 4 years ago

    This video is 100% gold, thank you !

  • @ccuuttww
    @ccuuttww 5 years ago

    This is a kind of logistic regression; if you go deep you may realize that it is lowering the KL divergence in each iteration.
    However, you can only classify 2 classes in this example; you may also try softmax (see the sketch below).
    And you can save and run it on Google Colab now, no need to install Python by yourself.
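
Picking up the softmax suggestion: a minimal sketch (NumPy only; the 3-inputs-to-3-classes shape is illustrative, not from the video) of the softmax function that generalizes the single sigmoid output to more than two classes:

    import numpy as np

    def softmax(z):
        # Subtract the row max for numerical stability, exponentiate, and
        # normalize so each row sums to 1 and reads as class probabilities.
        e = np.exp(z - z.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)

    np.random.seed(1)
    weights = 2 * np.random.random((3, 3)) - 1  # 3 inputs -> 3 classes

    x = np.array([[1, 0, 0]])
    probs = softmax(np.dot(x, weights))
    print(probs)        # one probability per class
    print(probs.sum())  # 1.0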