00:06 Kylie Ying teaches machine learning for absolute beginners 03:29 Importing and labeling data with pandas in machine learning. 10:47 Unsupervised learning uses unlabeled data to find patterns. 14:19 One hot encoding is used to represent categorical data as binary values. 21:04 Data set structure and model training process. 24:22 Training and validation sets are used to assess and tune machine learning models 31:10 Plotting histograms for different features based on class 34:24 Creating train, validation, and test data sets 41:23 Using random oversampling to rebalance data sets 44:57 Introduction to creating a scatter plot to represent data visually. 52:39 Implementing K Nearest Neighbors with SK learn package 55:42 Understanding precision, recall, f1 score and accuracy in machine learning 1:02:36 Bayes rule helps calculate probability based on given conditions. 1:05:51 Explaining Bayes' theorem and applying it to classification 1:13:06 Using probability to make classification predictions 1:16:39 Using MAP and Naive Bayes for classification. 1:23:49 Introduction to the sigmoid function and logistic regression 1:27:53 Logistic regression with L2 penalty 1:35:13 Introduction to Kernel Trick in SVM 1:38:37 Neural networks use neurons to process input and determine output. 1:45:12 Back propagation adjusts weights based on loss. 1:48:19 TensorFlow simplifies defining and training ML models 1:54:54 Training and evaluating a machine learning model 1:58:27 Setting up and compiling the neural network model for training. 2:06:19 Analyzing the accuracy and validation of the machine learning model. 2:09:24 Neural nets and SVM performed similarly, demonstrating different model options. 2:16:19 Simple and multiple linear regression explained 2:20:24 Residual plot shows distribution around line of best fit. 
2:27:40 Understanding root mean squared error and coefficient of determination 2:31:32 Understanding total sum of squares and its significance 2:38:07 Creating and manipulating a data frame in Python for data analysis 2:41:29 Plotting and analyzing the relationship between various weather parameters and bike count. 2:48:48 Training and evaluation of a linear regression model 2:52:58 Using TensorFlow for regression with neural networks 3:00:14 Using a neural net with more than one node for prediction 3:04:00 Using neural net model for non-linear prediction. 3:12:12 Discussing the concept of unsupervised learning and introducing k means clustering 3:15:41 Assigning points to the closest centroid 3:23:32 Unsupervised learning techniques: clustering and dimensionality reduction 3:27:10 Principal Component Analysis (PCA) maps points onto one-dimensional space 3:34:09 Predicting wheat varieties based on wheat kernel features. 3:37:41 Unsupervised learning uses no class information for analysis. 3:45:20 K-means clusters and PCA explained 3:48:46 PCA reduces dimensions to 2 for better visualization
The way she explained Bayes' theorem... I generally don't find such a clear explanation anywhere. It was a practical, down-to-earth explanation. Truly appreciative.
I just finished watching the entire video, taking a few days off to rest. Meanwhile, I also typed out all the code on Colab, mimicking what was shown. Overall, this video gave me a preliminary understanding of machine learning and opened the door to this unknown field for me. Of course, there is still a lot I need to learn, and many concepts are not very clear to me yet. However, with the prevalence of ChatGPT now, I can ask questions whenever I don't understand something. Compared to the previous technical environment, ChatGPT has made my learning process faster and more efficient. Finally, I would like to thank the blogger again. Perhaps, in the not-too-distant future, I might also make videos to guide others in getting started, haha.
Kudos, bro. I am doing the same: whenever I don't understand something, I just head over to ChatGPT and ask it to explain what is going on. I haven't watched the whole thing yet, but I benefited a lot even from the beginning.
Kylie is such a great teacher and obviously not only understands but applies these topics in the real world. What a great combination, thanks for the course!
@@leagueofotters2774 Because it's obvious to anyone interested in machine learning that Python is the language you use. If you can't even recognise that the language she used in the video is Python, what are you doing watching a course on machine learning?
I recently completed the ML tutorial, and I want to express my gratitude for the outstanding content. The derivations and mathematical explanations were particularly impressive. I've been trying to grasp the fundamentals of machine learning for quite some time, and this is the first tutorial where I genuinely understood the derivations and gained valuable knowledge about the topics; the side-by-side implementation also helped a lot. Thank you for creating such an informative and well-structured tutorial!
Honestly, I think it is a very advanced video. It should mention that you need a solid foundation in statistics, coding, and data science to even understand what is going on!!! It's a medium-level video.
If only everyone recognised the value of courses like this as you have done. It's the first time I've seen that thanks/donation button, which is sad, but now I'll know how to contribute back! Thanks for letting me know that exists :D
It took me exactly 3 months and 6 days to finish this video, watching it in 10 separate sittings, and today I completed it. I'm very happy to have seen this commitment through and to have taken in all of this content. Excellent material, very didactic and insightful. Excellent content.
Yes, since "everyone" is the audience, let's start with an example that few can relate to and then just jump into code (because, again, everyone is well versed in code), without any explanation or overview.
What do you mean? She started off with code but quickly went on to explain what was going on. Besides, if you want to learn ML, you should be familiar with basic Python and linear algebra.
If you're getting an error about comparing a list to a float, changing the least_val_loss variable to a list of two infinite floats will fix it, like this: least_val_loss = [float('inf'), float('inf')]
@@sigmaohiofortniterizz That section had the correction edited out. Right after she says "now we wait" (2:05:50) the video skips her debugging and cuts straight to the correct code. Notice the [0] that gets added to the "val_loss = model.evaluate(X_valid, Y_valid)" line right after the "now we wait".
When compiling the model we asked for metrics: nn_model.compile(optimizer=tf.keras.optimizers.Adam(lr), loss='binary_crossentropy', metrics=['accuracy']). So val_loss = model.evaluate(X_valid, y_valid) returns a list with those two data points, [loss, accuracy]. We only want the loss for plotting, so val_loss[0] is needed.
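To make the shape of the fix concrete, here is a tiny pure-Python stand-in for that loop. The evaluate_stub below only mimics model.evaluate returning a [loss, accuracy] list; all numbers are made up for illustration and nothing here calls TensorFlow itself:

```python
# Stand-in for tf.keras model.evaluate when metrics=['accuracy'] is set:
# it returns [loss, accuracy] rather than a bare loss float.
def evaluate_stub(loss, accuracy):
    return [loss, accuracy]

# Hypothetical results from three candidate models (illustrative numbers)
results = [
    evaluate_stub(0.52, 0.78),
    evaluate_stub(0.47, 0.81),
    evaluate_stub(0.49, 0.80),
]

least_val_loss = float('inf')
best_index = None
for i, val_loss in enumerate(results):
    # val_loss is a list, so compare its first element (the loss),
    # not the list itself -- comparing list < float raises TypeError
    if val_loss[0] < least_val_loss:
        least_val_loss = val_loss[0]
        best_index = i

print(best_index, least_val_loss)  # → 1 0.47
```

Without the [0], the comparison of a list against a float raises a TypeError, which is exactly the error people are hitting.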
This is amazing. I'm a data analyst and had some formal training in machine learning, but my classes were really surface-level and "teach yourself" style. This is so much better. I also find it easier to listen to women, so that's a bonus lol
Spent 5 days on this video and it really taught me a lot about ML. It might be tougher for absolute beginners to grasp at first due to the NumPy and math knowledge needed, but beginner friendly or not, this was a good tutorial.
It would be very beneficial for beginners to build shallow, deep, and convolutional neural networks from scratch. By doing so, they can learn the various activations and their derivatives, forward propagation, and backward propagation, along with the dimensions of the matrices and what is actually happening at each layer. Later on they can shift to TensorFlow or scikit-learn for professional work, but I do advise implementing at least one neural network from scratch.
@@lakshman587 Yes, a shallow neural network. This week I will be making a vectorized deep neural network from scratch, and 2 weeks later I will also implement a CNN from scratch.
@@muhammaddarab7474 Can you share your code? I have tried to hard-code the backpropagation algorithm for 1 input and 1 output with only 1 hidden layer, so even I need to implement a full NN from scratch...
@@lakshman587 I can share my shallow neural network code with you for 1 layer, but for N hidden layers (a deep neural network) you will have to wait till I implement it.
@@muhammaddarab7474 Ok, I guess we are on the same track. I'll try to implement a NN with N hidden layers from scratch... Let's hope we both finish and gain knowledge of how NNs work 🥲 All the best!
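For anyone else following this thread, here is a minimal sketch of a shallow (one-hidden-layer) network trained from scratch on XOR. It assumes NumPy; the layer size, learning rate, and iteration count are arbitrary illustrative choices, not anything from the video:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# XOR: 4 samples, 2 features, 1 output
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer with 4 units; small random initial weights
W1 = rng.normal(0, 1, (2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(0, 1, (4, 1))
b2 = np.zeros((1, 1))

lr = 1.0
losses = []
for _ in range(2000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)       # hidden activations, shape (4, 4)
    y_hat = sigmoid(h @ W2 + b2)   # predictions, shape (4, 1)
    losses.append(float(np.mean((y_hat - y) ** 2)))  # mean squared (L2) loss

    # Backward pass: chain rule, using sigmoid' = s * (1 - s)
    d_out = 2 * (y_hat - y) * y_hat * (1 - y_hat) / len(X)
    d_hidden = (d_out @ W2.T) * h * (1 - h)

    # Gradient descent update
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_hidden)
    b1 -= lr * d_hidden.sum(axis=0, keepdims=True)

print(losses[0], losses[-1])  # loss should drop over training
```

Swapping the sigmoid for tanh or ReLU (with its own derivative) and stacking more W/b pairs in a loop is the natural next step toward the deep, vectorized version mentioned above.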
She is so good at this. I woke up yesterday and started writing an entirely custom vector-matrix learning algorithm from scratch after watching her explain the concepts from the simplest to the most complex. As a software programmer, it has been very interesting to see how simple these algorithms really are. You know someone is a good teacher when I'm not even paying full attention and somehow it still sticks 😁
It was a good watch. I think I still have a long way to go, but I was able to understand the course, and it raised lots of questions for me as well. A very good course for someone who knows the basic concepts and wants to improve them. I would recommend this to someone who already has an idea of what machine learning is, because it is not entirely beginner friendly. Learn the basic concepts, then come watch this to understand them better.
I cannot even begin to describe how useful this video was for me. Endless thanks for the priceless knowledge Kylie and freeCodeCamp provided. This video seriously gave me an edge when it came to machine learning. It organized all the abstract concepts I have been hearing about while also describing them perfectly. I have learned SO much. Thank you, thank you, thank you!
I had to learn ML and had no idea where or how to start, though I had heard certain terms and knew bits and pieces (and was confused). This has given me a good overview of the basics. The basic concepts of ML are sort of organized in my mind now: "what is ML?", "what are the basic methods?", and most importantly, "why these methods?" and "how do they work?", all clearly explained, with examples of how to use the methods in Python. It is now easy to take off from here. A relatively comprehensive yet brief overview is exactly what I need right now, as I cannot afford to spend weeks or months learning everything. I searched many sources, but this seems to be one of the best in clarity. Thank you!
Such a great video. I'm only 30 minutes in and already learning a lot. My major is centered around AI and machine learning, so I wanted a sneak peek at what's to come. Thank you!
It's 2:16 am and I just woke up to this on my screen and yes I wanted it so badly. So strange but it's now downloaded and will watch and learn when I wake up. Thank you
I just finished taking this course. It took a while, but she explains both the theory and the examples. I loved the math part of it. She goes on to explain supervised vs. unsupervised ML tools and methods. I learned a lot of Python as well. Thanks for your effort.
Thanks for the explanations! They are really detailed, her tone is comfortable, I can easily understand what she says, and she elaborates each step (which is so important, since every self-learner gets to know the rationale behind each step). I hope I can watch more lesson videos from this creator!
Hey, I invite you to try my AI learning tool, Coursnap, for this course! It has course outlines and course shorts, so you can get the gist of an hour-long lecture in just 5 minutes. Try it out and boost your learning efficiency!
18:10 As soon as you mentioned "Hot Dog or Not Hot Dog", it instantly reminded me of Jian Yang's classification model from the HBO comedy series Silicon Valley. 😂 But this course is very useful and easy to grasp for beginners like us. 👍👍
Hats off... I couldn't have imagined it was possible to explain these concepts in such a simple way. Thank you, and please keep posting such learning videos.
Hello, I really enjoyed the entire journey with you. This is where my journey in Machine Learning begins. I have gained enough confidence to take it to the next level. ❤❤❤❤❤❤
I was so excited to see this course because freeCodeCamp courses are usually for beginners, but this one is not. You need good knowledge of NumPy and other libraries to learn the most from this course, which is really disappointing; they should let users know the prerequisites. But ultimately this is a very good course, and it's free, so I really appreciate it, guys. Thanks.
I think most people understand intuitively that something as complex sounding as “machine learning” will require prior knowledge. Lol. It’s like walking into an “Intro to Quantum Physics and Relativity” class and expecting to get by without understanding Newtonian physics or calculus.
I would never complain about free education - do one of their Python/numpy courses and return here. Better to build on what you know already than every course starting from 0 :)
Amazing course; I coded along in VS Code, which taught me quite a lot about conda, environments, etc. With this kind of hands-on learning, I believe I can start doing data analysis, creating my own models, and understanding their predictions. Thank you all so much! 👏 Great work! (And well, it's not for Python noobies... but anyone past the basics with most of the libraries used can understand it.)
To anyone who is starting, I suggest you read Fundamentals of Data Science by Jugal K. Kalita, Swarup Roy, and Dhruba K. up to the machine learning unit. If you have any doubts, ask GPT; it will definitely help you understand this wonderful course more effectively.
The best introductory machine learning video! I have tremendous respect for Kylie. I think it is the clearest and easiest to understand explanation in the world. Thank you so much.
🎯Course outline for quick navigation: [00:00-06:57]1. Machine learning and data analysis -[00:00-00:28]Kylie ying, a physicist and engineer, will teach machine learning to beginners. -[01:05-02:12]Uci ml repository offers magic gamma telescope dataset for predicting particle type using camera patterns. -[05:04-05:38]Using 'fdist' and 'class' to label columns in a csv file to a pandas data frame. -[05:59-06:28]Converting gs to 0s and hs to 1s in data frame class.unique. [07:01-34:39]2. Machine learning fundamentals -[08:00-08:29]In supervised learning, 10 features are used to predict the class label g. -[09:53-10:27]Supervised learning uses labeled inputs to train models and learn outputs. -[13:14-13:45]Two categories: qualitative and categorical data. -[23:57-24:24]Data set is divided into training, validation, and testing sets, with distribution like 60%, 20%, and 20% or 80%, 10%, and 10% depending on statistics. [34:39-44:02]3. Data scaling, splitting, and oversampling for machine learning -[35:13-35:43]Splitting data into train, valid, and test sets, shuffling with 60% split -[37:38-38:10]Imported standard scaler from sklearn for data scaling. -[39:06-39:36]Using numpy to reshape 1d vector y into 2d object for stacking x and y. -[40:28-40:59]Around 7,000 gammas and 4,000 hadrons, oversampling needed. -[42:05-42:37]Oversample the smaller class to match the larger class in the dataset. -[43:43-44:12]Validation and test sets are not oversampled to assess model performance on unlabeled data. [44:02-57:30]4. K nearest neighbors model and its implementation -[44:35-45:03]Introducing knn model for predicting family size based on income. -[49:02-49:34]Using k-nearest neighbors algorithm with a k of 3 or 5 to determine labels for data points. -[52:37-53:10]Introduction to k-nearest neighbors and using sklearn for implementation. -[56:39-57:12]Precision score: 77-84%, recall: 68-89% [57:31-01:36:19]5. 
Probabilistic classification models -[57:31-58:09]Analyzing unbalanced test data, improving f1 score to 0.87, accuracy at 82% -[01:01:35-01:02:06]Probability of having covid, given a positive test, is 96.4%. -[01:06:00-01:06:32]Probability of positive test given disease is 0.99, probability of having disease is 0.1. -[01:12:22-01:12:52]Naive bayes assumes independence in calculating joint probability. -[01:16:45-01:17:15]Using the training set, we apply the map principle to maximize expression for hypothesis selection. -[01:26:27-01:27:43]Logistic regression fits data to sigmoid function to build a model with multiple features. -[01:28:56-01:29:33]Model achieves 65% precision, 71% recall, and 77% accuracy, outperforming naive bayes but not knn. -[01:33:30-01:33:58]Goal: maximize margins in svms for best class separation. [01:36:21-01:46:45]6. Svms and neural networks -[01:37:13-01:37:41]Exploring the definition and power of svms through the kernel trick. -[01:38:23-01:39:17]Achieved 87% accuracy with svm model, class one scores 0.9, covered four classification models: svm, logistic regression, naive bayes, and knn. -[01:39:44-01:40:12]Neural networks consist of input, hidden, and output layers, each containing neurons. -[01:42:50-01:43:22]Introduction of sigmoid, tanh, and relu activation functions to prevent collapse of the model into a linear one. -[01:43:42-01:44:16]During training, the model adjusts based on l2 loss function to reduce error. -[01:45:53-01:46:26]Updating weight w0 with a new value using a factor alpha. [01:46:45-02:03:23]7. Neural network training -[01:46:45-01:47:13]Explaining the use of learning rate and negative gradient in adjusting neural net convergence. -[01:47:39-01:48:13]Iteratively adjust weights in neural network for machine learning. -[01:55:58-01:56:24]Validation accuracy improved from 0.77 to around 0.81, and loss is decreasing, indicating positive progress. 
-[01:56:50-01:57:27]Discussing grid search for hyperparameters with variations like 64 nodes and 16 nodes, and adjusting learning rate and epochs. -[02:00:57-02:01:36]Analyzing model performance with loss and accuracy plots. [02:03:24-02:36:19]8. Model training, evaluation, and regression in ml -[02:07:02-02:07:37]Best performance achieved with 64 nodes, 0.2 dropout, 0.001 learning rate, and batch size of 64. -[02:16:09-02:16:46]Using linear regression to minimize error and make predictions for data points. -[02:17:19-02:17:51]Kylie discusses assumptions, including linearity in data analysis. -[02:28:04-02:28:32]The root mean squared error allows expressing error in dollars and same unit. -[02:33:09-02:33:32]High r squared indicates good prediction, adjusted r squared accounts for terms. [02:36:20-03:12:03]9. Data analysis and regression modeling -[02:42:59-02:43:25]Data frame modified: wind, visibility, and functional dropped, leaving 6 columns. -[02:48:10-02:48:42]Demonstrating simple linear regression using temperature data. -[02:54:17-02:55:15]Improved r squared from 0.4 to 0.52 indicates progress in regression analysis using tensorflow. -[03:11:18-03:11:45]Neural net underestimates for larger values, linear regressor also used. [03:12:04-03:22:13]10. Unsupervised learning: k-means clustering -[03:12:04-03:12:32]Linear regressor is limited in capturing non-linear relationships; suggesting the need for alternative models in certain cases. -[03:12:58-03:13:29]Unsupervised learning: using k-means clustering to compute k clusters from unlabeled data. -[03:17:35-03:18:02]Data points recalculated to update centroids for groups. -[03:21:54-03:22:33]Using expectation maximization, reaching stable clusters allows stopping iteration. [03:22:15-03:33:52]11. Expectation maximization and principal component analysis -[03:22:15-03:23:20]Using expectation maximization for unsupervised learning to find data patterns and structure. 
-[03:24:10-03:24:42]Pca reduces multi-dimensional data to one dimension to capture key information. -[03:25:49-03:26:19]Demonstrating distance using pca to find direction in space. -[03:30:55-03:31:23]Minimizing projection residual to find largest variance dimension in pca. [03:33:53-03:53:51]12. Unsupervised learning and dimensionality reduction -[03:33:53-03:34:22]Implementing unsupervised learning on the uci seeds dataset with three types of wheat kernels -[03:35:06-03:35:51]Importing pandas and seaborn for specific class. -[03:45:46-03:46:12]Using k-means, identified 3 classes based on compactness and asymmetry in scatter plot. -[03:47:54-03:48:18]Pca reduces multiple dimensions into a lower dimension number. -[03:51:32-03:53:49]Unsupervised learning algorithm predicts three categories fairly well without labels.
This seems geared toward people with Python programming experience as a base. It would help people without coding experience to explain the libraries and languages being used and how they work, or at least to state the bare minimum prerequisites.
@Freecodecamp you're always teaching us great skills with great mentors. Salute! 🤗 Kindly upload an electric vehicle course if you have a great mentor available.
Wow! I started learning this morning from your roadmap and the algorithm provided me this video. And this is the best one indeed. Thank you, Kylie Ying mam. Stoked to the neuron.
Thanks for the content. This isn't for everyone, though. She speeds through so much that beginners are just sitting here going "What the hell is going on...". It looks like that was pointed out early in the comments, and then there seems to be a flood of "this is so clear and fully fleshed out!" comments that have been pushed to the top. That's a bit sus.
Could you explain what the things you are labeling and importing are as you go? It helps me make connections in my brain if I can associate what you are doing with the reason behind it. Otherwise it is just an exercise in memorizing and replicating without any understanding. Thank you for sharing and for all the effort put into this.
It seems that nowadays TensorFlow's Model.evaluate returns both the loss and the accuracy in a list, so at around 2:05 in the video the comparison val_loss < least_val_loss no longer works. Maybe they changed what the method returns since this video was published. I fixed it by referring to the first element of the list (according to the docs, the first item is the loss and the second is the accuracy), like this: if val_loss[0] < least_val_loss: least_val_loss = val_loss[0]
At 2:40:12 you rerun the cell with the df['functional'] = (df['functional'] == 'Yes').astype(int) statement. After the first run of this cell there is no 'Yes' left in the column, so on the second run all values become 0. So you should reconsider your statement at 2:42:14 that "'functional' doesn't give us any utility".
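A quick way to see the effect described above, using toy data (assuming pandas; the column name just mirrors the video's df['functional']):

```python
import pandas as pd

df = pd.DataFrame({'functional': ['Yes', 'No', 'Yes']})

# First run: 'Yes' -> 1, everything else -> 0
df['functional'] = (df['functional'] == 'Yes').astype(int)
print(df['functional'].tolist())  # → [1, 0, 1]

# Second run: the column now holds ints, and 1 == 'Yes' is False
# elementwise, so every value collapses to 0; the column only
# *looks* useless after being zeroed out by the rerun
df['functional'] = (df['functional'] == 'Yes').astype(int)
print(df['functional'].tolist())  # → [0, 0, 0]
```

This is a good argument for keeping such cells idempotent, e.g. mapping via a dict that leaves already-converted values alone.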
This is a great course, thank you so much. I was able to understand and learn so many new things! Some topics I had struggled with before became quite clear through this course.
Did a refresher on ML as it's been a while... Kylie rocks! Delivered with the right amount of theory combined with examples and importantly, clarity! Top marks!!!!!
I've programmed in Python for a while now, but I understand nothing 😂. I think she lost me when she started doing all of the data science stuff, and there are multiple vocabulary words that could have been explained more. Overall, no shade to her; I'm probably just a noob.
Can someone confirm if this goes beyond the basics and actually helps you to build stuff? I’ve done plenty of these courses and most of them only contain basic information about a 100 different things and very little demonstration of actually building a significant project from raw data. So, my fundamentals should be fine after a bit of a brush-up but I need a course that goes beyond.
English is not my native language and I am new to machine learning, but it was easy for me to understand you. I've watched many videos about ML, but this is the best.
Hi, I'm sorry, I have a really stupid doubt about the Bayes' theorem question... Please correct my logic: I thought that P(having disease) = P(true positive) + P(false negative). But according to that, the answer to the question should be 0.1 - 0.01 = 0.09, which it isn't... Is this equation wrong, or can we not add probabilities like that, or... maybe I'm doing it all completely wrong. Please help. Thank you so much.
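In case it helps anyone with the same confusion: that identity does hold, but for joint probabilities, not conditionals. P(disease) = P(positive and disease) + P(negative and disease), where each joint term is the conditional times the prior P(disease). A quick check in Python; the 0.99 sensitivity and 0.1 prevalence match the numbers quoted from the video, while the 0.05 false positive rate is an assumed illustrative value:

```python
p_disease = 0.1            # prior P(D), from the video's example
p_pos_given_d = 0.99       # sensitivity P(+|D), from the video's example
p_pos_given_not_d = 0.05   # false positive rate P(+|not D) -- assumed for illustration

# Joint probabilities: conditional multiplied by the prior
p_true_positive = p_pos_given_d * p_disease          # P(+ and D)
p_false_negative = (1 - p_pos_given_d) * p_disease   # P(- and D)

# The identity in the question holds for these *joints*, not the conditionals:
assert abs((p_true_positive + p_false_negative) - p_disease) < 1e-12

# Bayes' rule for the posterior P(D|+), via the law of total probability
p_pos = p_true_positive + p_pos_given_not_d * (1 - p_disease)
posterior = p_true_positive / p_pos
print(round(posterior, 4))  # → 0.6875
```

The 0.1 - 0.01 arithmetic mixes a prior with a conditional, which is why it doesn't come out right.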
Thank you so much for your brilliant tutorials and courses Kylie (please do more!!!)! Could you please recommend some books on the mathematics of machine learning (and books that you found useful when you dived into the subject).
Yesterday I clicked on a video called 'Learning Python for Beginners'. Today YouTube's algorithm sent me this video. I was confused, but I somehow listened to it, and whenever I felt I understood something from this explanation, it made me excited. A genius can make someone understand complicated things. I am very grateful.
For anyone getting an error about converting a list to a float: model.evaluate is actually returning a list. She has the correction in the code at around 2:05:51, but she doesn't explicitly mention it. You just grab the first value in the list (which is why she puts [0]). So change the line where you obtain val_loss to:
val_loss = model.evaluate(X_valid, y_valid)[0]
literal life saver man. Thank you
Thank you so much! You are truly a life saver.
i dont want to change the likes on here bc you killed it! 187!
Happy to help, everyone. It caught me a little off guard at first, but from the way she was using it I knew it had to be a collection of some sort, and it turned out to be a little change she made and forgot to mention. No biggie, but it can definitely cause a headache, I know :-)
You just saved me!! I wrote literally the exact same question before reading your input, then saw your reply and... voilà!!! Thanks.
Her voice and way of teaching is so soothing. I fell asleep listening to her and I am gonna watch this every night.
Semester video
Us 🙃
I've been trying to learn ML for quite a while but could never really grasp the algorithms. She explains how the formulas come about and why they are used in classification or regression so well. My god. Thumbs up for sensei Kylie and freeCodeCamp!!!
Does it include stats too, or just ML algorithms?
@@networkserpent5155 I'm not quite sure what you mean; ML is mostly stats. I guess both would be the right answer.
Yeah, same. I'm always interested in learning where the algorithms and concepts we use in machine learning come from, and she explained them the most clearly to me. Thank you!
What programming language did she use in this course?
@@yehiaelhariry9356 Python
I wonder how many of those millions of views are from the people who fell asleep and woke up to this
😂😂😂 real
Same! It's pissing me off.
Same here. Bizarre how this happened
I have to agree with those calling this tutorial too hard. I am a professional developer studying cybersecurity at the Master's level, and I found the first hour of the tutorial so intimidating that I had to go and learn Python again, just to boost my confidence. I followed that with tutorials on Pandas and NumPy; those helped.
I came back and realized that, while this is a really good tutorial, it isn't beginner-friendly at all. The kind of thing Kylie accomplishes in a single line takes me multiple lines and many more minutes to understand.
As advice to all the newbies: don't be intimidated. Take Python basics, Pandas, and NumPy courses before attempting this tutorial; perhaps watch the first hour to see what's required, then come back.
Wow, thanks a lot. I will learn Python until I'm a pro!
skill issue
@@devafterdark don't go hard on him! I personally have zero experience with NumPy or Pandas, but the logic she explains is very simple. Remember you're here to learn ML concepts; you can revise Python syntax later.
@@kklol07 I cannot understand her. Any direction where to take this course?
⌨ (0:00:00) Intro
⌨ (0:00:58) Data/Colab Intro
⌨ (0:08:45) Intro to Machine Learning
⌨ (0:12:26) Features
⌨ (0:17:23) Classification/Regression
⌨ (0:19:57) Training Model
⌨ (0:30:57) Preparing Data
⌨ (0:44:43) K-Nearest Neighbors
⌨ (0:52:42) KNN Implementation
⌨ (1:08:43) Naive Bayes
⌨ (1:17:30) Naive Bayes Implementation
⌨ (1:19:22) Logistic Regression
⌨ (1:27:56) Log Regression Implementation
⌨ (1:29:13) Support Vector Machine
⌨ (1:37:54) SVM Implementation
⌨ (1:39:44) Neural Networks
⌨ (1:47:57) Tensorflow
⌨ (1:49:50) Classification NN using Tensorflow
⌨ (2:10:12) Linear Regression
⌨ (2:34:54) Lin Regression Implementation
⌨ (2:57:44) Lin Regression using a Neuron
⌨ (3:00:15) Regression NN using Tensorflow
⌨ (3:13:13) K-Means Clustering
⌨ (3:23:46) Principal Component Analysis
⌨ (3:33:54) K-Means and PCA Implementations
How are these things useful? I don't see how.
just watch the whole video
@@cubeheadii9284 how is your comment useful?
day 1 (8/8) -> 58:35
day 2 (9/8) -> 2:34:27
day 3 (18/8) -> The end
Thanks for this great course! I think that this course is not for absolute beginners despite the description, maybe if you've used libraries like Numpy and Pandas, but not for "absolute beginners". Nevertheless, she is an amazing tutor and this course is amazing for an overview. I suggest searching for a great visual of classification of ML and watch the course later on.
I think the main concepts and ideas behind this can be grasped by complete beginners too; maybe the coding part would go over their heads, but at least they'd understand the core concepts.
Can you please explain how she copied the dataset information into Google Colab?
I have no idea how my UA-cam algorithm brought me here while I was sleeping but it made for some strange dreams
I fell asleep once and ended up on this channel with a video playing about how to make a neural network.
There was also a time when I fell asleep watching a video about nihilism on repeat and a character in my dream said what the video was saying word for word. It freaked me out.
@Keys for Wealth like what? I’m not curious
Same here haha. I woke up with this vid.
I once fell asleep watching a space and time TV show; in the dream I could understand what the guy was explaining in the documentary. It's crazy how a person's real speech gets translated into a dream.
@@fullmetaltheorist Same, and I'm not joking.
00:06 Kylie Ying teaches machine learning for absolute beginners
03:29 Importing and labeling data with pandas in machine learning.
10:47 Unsupervised learning uses unlabeled data to find patterns.
14:19 One hot encoding is used to represent categorical data as binary values.
21:04 Data set structure and model training process.
24:22 Training and validation sets are used to assess and tune machine learning models
31:10 Plotting histograms for different features based on class
34:24 Creating train, validation, and test data sets
41:23 Using random oversampling to rebalance data sets
44:57 Introduction to creating a scatter plot to represent data visually.
52:39 Implementing K Nearest Neighbors with SK learn package
55:42 Understanding precision, recall, f1 score and accuracy in machine learning
1:02:36 Bayes rule helps calculate probability based on given conditions.
1:05:51 Explaining Bayes' theorem and applying it to classification
1:13:06 Using probability to make classification predictions
1:16:39 Using MAP and Naive Bayes for classification.
1:23:49 Introduction to the sigmoid function and logistic regression
1:27:53 Logistic regression with L2 penalty
1:35:13 Introduction to Kernel Trick in SVM
1:38:37 Neural networks use neurons to process input and determine output.
1:45:12 Back propagation adjusts weights based on loss.
1:48:19 TensorFlow simplifies defining and training ML models
1:54:54 Training and evaluating a machine learning model
1:58:27 Setting up and compiling the neural network model for training.
2:06:19 Analyzing the accuracy and validation of the machine learning model.
2:09:24 Neural nets and SVM performed similarly, demonstrating different model options.
2:16:19 Simple and multiple linear regression explained
2:20:24 Residual plot shows distribution around line of best fit.
2:27:40 Understanding root mean squared error and coefficient of determination
2:31:32 Understanding total sum of squares and its significance
2:38:07 Creating and manipulating a data frame in Python for data analysis
2:41:29 Plotting and analyzing the relationship between various weather parameters and bike count.
2:48:48 Training and evaluation of a linear regression model
2:52:58 Using TensorFlow for regression with neural networks
3:00:14 Using a neural net with more than one node for prediction
3:04:00 Using neural net model for non-linear prediction.
3:12:12 Discussing the concept of unsupervised learning and introducing k means clustering
3:15:41 Assigning points to the closest centroid
3:23:32 Unsupervised learning techniques: clustering and dimensionality reduction
3:27:10 Principal Component Analysis (PCA) maps points onto one-dimensional space
3:34:09 Predicting wheat varieties based on wheat kernel features.
3:37:41 Unsupervised learning uses no class information for analysis.
3:45:20 K-means clusters and PCA explained
3:48:46 PCA reduces dimensions to 2 for better visualization
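For the one-hot encoding mentioned at 14:19, here's a minimal sketch with pandas; the 'color' column is a made-up example, not from the course's dataset:

```python
import pandas as pd

# One-hot encoding turns each category into its own binary indicator column,
# which is how categorical data gets represented as binary values.
df = pd.DataFrame({'color': ['red', 'green', 'red']})
onehot = pd.get_dummies(df['color'])  # columns: 'green', 'red'
```

Each row of `onehot` has exactly one active indicator, so no artificial ordering is imposed on the categories.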
nice :))
10q 🙏
18:08 🌭 Not 🌭
Very useful, thank you. Would be nice if this were pinned too
Kylie Ying is a gift to humanity
The way she explained Bayes' theorem! I generally do not find such clear explanations anywhere. It was a practical, real-world explanation. Truly appreciative.
I just finished watching the entire video, taking a few days off to rest. Meanwhile, I also typed out all the code on Colab, mimicking what was shown. Overall, this video gave me a preliminary understanding of machine learning and opened the door to this unknown field for me. Of course, there is still a lot I need to learn, and many concepts are not very clear to me yet. However, with the prevalence of ChatGPT now, I can ask questions whenever I don't understand something. Compared to the previous technical environment, ChatGPT has made my learning process faster and more efficient. Finally, I would like to thank the blogger again. Perhaps, in the not-too-distant future, I might also make videos to guide others in getting started, haha.
Kudos, bro. I am doing the same: whenever there's something I don't understand, I just head over to ChatGPT and ask it to explain what is going on. I haven't watched it fully, but I benefited a lot even from the beginning.
Kylie is such a great teacher and obviously not only understands but applies these topics in the real world. What a great combination, thanks for the course!
What programming language did she use in this course?
@yehiaelhariry9356 I'm pretty sure that she used Python.
@@yehiaelhariry9356 She is such a great teacher that she didn't mention that or why it is used.
@@leagueofotters2774 if you need an explanation on python then you are probably watching the wrong video lol
@@leagueofotters2774 Because it's obvious to anyone interested in machine learning that Python is the language you use. If you can't even recognise that the language she used in the video is Python, what are you doing watching a course on machine learning?
Who else woke up to this?
Me too
Nah, this comment is legit so trippy. I woke up around 5 am, and this video was playing, and it was halfway through😂
Bruh, I just woke up 3:39:58
Now I understand why I was dreaming about being at high school (I'm in college 💀)
Same here. Bizarre
I woke up and apparently I watched the whole thing
my 7th day - still not finished. Just so nice to see someone do ML work live! Thank you
Were you a programmer before?
@@liudavid9792 Just don't have the time to see the whole 3+ hours in one shot. I had to watch a few minutes at a time. :)
watch at 2x speed
I recently completed the ML tutorial, and I wanted to express my gratitude for the outstanding content. The derivation and mathematical explanations were particularly impressive. I've been trying to grasp the fundamentals of machine learning for quite some time now, and this is the first tutorial where I genuinely understood the derivations and gained valuable knowledge about the topics plus the side by side implementation also helped alot. Thank you for creating such an informative and well-structured tutorial!
Impressive comment. Did you encounter a problem with the data links 🔗? I am; how can I get around it, please?
This is my first Course which I've completed from FCC, got a good understanding on ML now, Thank you !!
I think it is a very advanced video, honestly. It should mention that you need a very solid foundation in statistics, coding, and data science to even understand what is going on!!! It's a medium-level video.
Thanks for an amazingly simplified approach to ML 👍
If only everyone recognised the value of courses like this as you have done. It's the first time I've seen that thanks/donation button, which is sad, but now I'll know how to contribute back! Thanks for letting me know that exists :D
give me some
@@oktjona 😭😭🤣🤣
@@TheGoat62607 I'm serious 😂 😂
It took me exactly 3 months and 6 days to finish this video, watching it in 10 different sittings, and today I completed it.
I am very happy to have seen this commitment through and to have consumed all of this content.
Excellent material; this content is very didactic and insightful (I don't know the best expression for that in pt-BR). Excellent.
Yes, since everyone is the audience, let's start with an example that few can relate to and then just jump into code; again, everyone is well versed in code, so no explanation or overview needed.
What do you mean, she started off with code, but quickly went on to explain what was going on. Besides, if you want to learn ML, you should be familiar with basic python and linear algebra.
that's a losing mindset. I'd save the video and come back to it once I understood the basics used
I mean, at this point would you also complain that she's speaking English, or what?
UA-cam somehow made me watch this video while I was sleeping and caused me to exceed my data plan
If you're getting an error about comparing a list to a float, changing the "least_val_loss" variable to a list with two infinite floats will fix it, like this: least_val_loss = [float('inf'), float('inf')]
thank you so much, that really drove me crazy. can you explain why it fixed it, and why she didn't need to do it in the video?
@@sigmaohiofortniterizz That section had the correction edited out. Right after she says "now we wait" (2:05:50) the video skips her debugging and cuts straight to the correct code. Notice the [0] that gets added to the "val_loss = model.evaluate(X_valid, Y_valid)" line right after the "now we wait".
When compiling the model we asked for a metric in addition to the loss: nn_model.compile(optimizer=tf.keras.optimizers.Adam(lr), loss='binary_crossentropy', metrics=['accuracy']). So val_loss = model.evaluate(X_valid, y_valid) returns a list with those two data points, [loss, accuracy]. We are only interested in the loss for plotting, so val_loss[0] is needed.
@@michaelwindon681 Can we add only one float to the list, least_val_loss = [float('inf')]? Would that work the same as above?
That's the error I encountered. Thank you so much!
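For anyone still confused by this thread, here is a minimal, self-contained sketch of the hyperparameter-search loop in question. fake_evaluate stands in for model.evaluate(X_valid, y_valid), which returns [loss, accuracy] when the model is compiled with metrics=['accuracy']; the trial losses and names here are illustrative, not from the video.

```python
# fake_evaluate mimics model.evaluate(X_valid, y_valid) returning a list
# of [loss, accuracy] rather than a single float.
def fake_evaluate(trial):
    trial_losses = [0.60, 0.42, 0.55]  # made-up losses for three trials
    return [trial_losses[trial], 0.80]

least_val_loss = float('inf')
best_trial = None

for trial in range(3):                   # stands in for the nested hyperparameter loops
    val_loss = fake_evaluate(trial)[0]   # grab the loss, not the whole list
    if val_loss < least_val_loss:        # now a float < float comparison
        least_val_loss = val_loss
        best_trial = trial
```

Without the `[0]`, `val_loss` would be a list and the `<` comparison against a float raises a TypeError, which is exactly the error people hit.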
This is amazing. I'm a data analyst and had some formal training in machine learning, but my classes were really surface and "teach yourself" style. This is so much better. I also find it easier to listen to women, so that's a bonus lol
Repost from deep in the comments:
⌨ (0:00:00) Intro
⌨ (0:00:58) Data/Colab Intro
⌨ (0:08:45) Intro to Machine Learning
⌨ (0:12:26) Features
⌨ (0:17:23) Classification/Regression
⌨ (0:19:57) Training Model
⌨ (0:30:57) Preparing Data
⌨ (0:44:43) K-Nearest Neighbors
⌨ (0:52:42) KNN Implementation
⌨ (1:08:43) Naive Bayes
⌨ (1:17:30) Naive Bayes Implementation
⌨ (1:19:22) Logistic Regression
⌨ (1:27:56) Log Regression Implementation
⌨ (1:29:13) Support Vector Machine
⌨ (1:37:54) SVM Implementation
⌨ (1:39:44) Neural Networks
⌨ (1:47:57) Tensorflow
⌨ (1:49:50) Classification NN using Tensorflow
⌨ (2:10:12) Linear Regression
⌨ (2:34:54) Lin Regression Implementation
⌨ (2:57:44) Lin Regression using a Neuron
⌨ (3:00:15) Regression NN using Tensorflow
⌨ (3:13:13) K-Means Clustering
⌨ (3:23:46) Principal Component Analysis
⌨ (3:33:54) K-Means and PCA Implementations
Spent 5 days on this video and it really taught me a lot about ML. It might be tougher for absolute beginners to grasp at first due to the numpy and math knowledge needed, but beginner-friendly or not, this was a good tutorial.
Enjoying these thorough, clear, visual explanations. She makes what we do accessible to beginners, and it's a perfect refresher for seasoned users.
Absolutely brilliant. As mentioned in the intro Kylie is a true genius. god bless her
No idea why I woke up to my UA-cam algorithm playing this, but her voice is so soothing and I think I’m in love
It would be very beneficial for beginners to build shallow, deep, and convolutional neural networks from scratch, because by doing so they can learn many activation functions and their derivatives, forward propagation, and backward propagation, along with the dimensions of the matrices and what is actually happening at each layer. Later on they can shift to TensorFlow or scikit-learn for professional work, but I do advise implementing at least one neural network from scratch.
Have you Built NN from scratch?
@@lakshman587 Yes, a shallow neural network. This week I will be implementing a deep neural network (vectorized approach) from scratch, and 2 weeks later I will also implement a CNN from scratch.
@@muhammaddarab7474 Can you share your code?
I have tried to hardcode the backpropagation algorithm for 1 input and 1 output with only 1 hidden layer
So even I need to implement full NN from scratch...
@@lakshman587 I can share my Shallow Neural Network code with you for 1 layer but for N Hidden layer (Deep Neural Network), you will have to wait till I implement it
@@muhammaddarab7474 Ok
I guess we are on the same track.
I'll try to implement an NN with N hidden layers from scratch...
Let's hope we both finish and gain knowledge of how NNs work 🥲
All the best!
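As a taste of the from-scratch approach this thread advocates, here is a hypothetical shallow network (one hidden layer) trained on XOR with plain numpy. All names, sizes, and hyperparameters are my own illustrative choices, not code from the video or the thread:

```python
import numpy as np

# Forward propagation, L2 (mean squared) loss, backpropagation through
# sigmoid (s' = s * (1 - s)), and a gradient-descent update.
# Shapes are commented so you can see what happens at each layer.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # (4, 2)
y = np.array([[0], [1], [1], [0]], dtype=float)              # (4, 1)

W1 = rng.normal(size=(2, 8)); b1 = np.zeros((1, 8))  # input -> hidden
W2 = rng.normal(size=(8, 1)); b2 = np.zeros((1, 1))  # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
losses = []
for _ in range(2000):
    # forward pass
    h = sigmoid(X @ W1 + b1)   # hidden activations, (4, 8)
    p = sigmoid(h @ W2 + b2)   # predictions, (4, 1)
    losses.append(float(np.mean((p - y) ** 2)))

    # backward pass (chain rule)
    dp = 2 * (p - y) / len(X) * p * (1 - p)   # gradient at the output pre-activation
    dW2 = h.T @ dp
    db2 = dp.sum(axis=0, keepdims=True)
    dh = (dp @ W2.T) * h * (1 - h)            # gradient at the hidden pre-activation
    dW1 = X.T @ dh
    db1 = dh.sum(axis=0, keepdims=True)

    # gradient descent: step against the gradient, scaled by the learning rate
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

Tracing the shapes through `dW1`, `dW2` is exactly the exercise the comment above recommends before moving to TensorFlow.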
Thank you very much for having a tutorial like this.
Gratitude from Thailand; thank you very much for this tutorial.
Thank you from Thailand.
She is so good at this. I woke up yesterday and started writing an entirely custom vector/matrix AI learning algorithm from scratch after watching her explain the concepts from the simplest to the most complex. As a software programmer, it has been very interesting to see how simple these algorithms really are. That is how you know someone is a good teacher: even when I'm not paying attention, it somehow sticks 😁
This is basically Machine learning for everyone who is a Python programmer. But seems to be a nice video. Will get back to this after I finish Python
You are literally the best, I've been looking for a tutorial for three days and yours works
It was a good watch, I think I still have a long way to go
But I was able to understand the course, though I came away with lots of doubts as well. A very good course for someone who knows the basic concepts and wants to improve on them.
I would recommend this to someone who already has an idea of what machine learning is, because it is not entirely beginner-friendly.
Learn the basic concepts, then come watch this to understand it better.
This is like the tenth time UA-cam algorithm auto plays this video. Guess I gotta learn machine learning now.
A great introduction to machine learning; even someone who is not very familiar with it can learn a lot.
I cannot even begin to describe how useful this video was for me. Endless thanks for the priceless knowledge Kylie and freeCodeCamp provided. This video seriously gave me an edge when it came to machine learning. It organized all the abstract concepts I have been hearing about while also describing them perfectly. I have learned SO much. Thank you, thank you, thank you!
Had to learn ML and had no idea where to start, how to start, though I had heard of certain terms and knew bits and pieces (and was confused). This has given me a good overview of the basics and the basic concepts and contents of ML are sort of organized in my mind now with the "what is ML?", "what are the basic methods" and most importantly, "why the methods" and "How they work" information clearly explained and examples of how to use the methods on python, so that it is now easy to take off from here - A relatively comprehensive yet brief overview - a need of the hour for me right now as I cannot afford to spend weeks or months learning everything, but needed a good overview for my tasks at hand. Searched many sources but this seems to be one of the best in clarity. Thank you!
Such a great video. I’m already 30min and learning a lot. My major is centered around AI and machine learning so I wanted a sneak peek at what’s to come. Thank you!
I'm here for the same reason! About to go into junior year and leap headfirst into restricted electives
do you have any recommendation of youtube tutorial to dive into ai and machine learning major?
41:52
1:02:49
Just reminding myself where I stopped the video. Absolutely fantastic walkthrough!
I'm very excited about this. Her neural network video was amazing !
It's 2:16 am and I just woke up to this on my screen and yes I wanted it so badly. So strange but it's now downloaded and will watch and learn when I wake up. Thank you
Ironically, she is so humble as well. She is great. Thanks for making this.
I just finished taking this course. It took a while, but she explains the theory with examples. I loved the math part of it. She explains the supervised vs. unsupervised ML tools and methods, and I learned a lot of Python as well. Thanks for your effort.
You guys are awesome... just search for any tech skill on youtube and bang, your great video lectures or tutorials show up in the results...
Thanks for the explanations! They are really detailed, her tone is comfortable, I can easily understand what she says, and she elaborates each step (which is so important, as every self-learner should know the rationale behind each step).
I hope I can watch more lesson videos from this creator!
18:10 As soon as you mentioned "Hot Dog or Not Hot Dog", it instantly reminded me of Jian Yang's classification model from the HBO comedy series Silicon Valley. 😂
But this course is very useful and easy to grasp for beginners like us. 👍👍
Same here bro. Jian Yang is a legend
love this XDD
Not Hotdog
falling asleep lands me in odd places
I also fell asleep 😂😅
Bruh 4th time now
hahaha man same here
I thought I was the only one!
This is perfect! By far the best I've found out there; such a clear and complete explanation. Great teacher.
Hats off... I cannot imagine that it is possible to explain these concepts in such a simple way. Thank you, and please keep on posting such learning videos.
Kylie explains better than my professors LOL. Great job FCC and Kylie, thank you very much
I'd like to see what your prof thinks when they find out about this lol
Hello, I really enjoyed the entire journey with you. This is where my journey in Machine Learning begins. I have gained enough confidence to take it to the next level. ❤❤❤❤❤❤
Thanks. I am an experienced Java lead, and I always wanted to learn machine learning.
The best teacher I have ever seen in this subject...!
I was so excited to see this course because usually freeCodeCamp courses are for beginners, but this one is not. You need a good knowledge of numpy and other libraries to get the most from this course, which is really disappointing; they should let users know the prerequisites. But ultimately this is a very good course, and it's free, so I really appreciate it, guys. Thanks.
I think most people understand intuitively that something as complex sounding as “machine learning” will require prior knowledge. Lol. It’s like walking into an “Intro to Quantum Physics and Relativity” class and expecting to get by without understanding Newtonian physics or calculus.
I would never complain about free education - do one of their Python/numpy courses and return here. Better to build on what you know already than every course starting from 0 :)
Amazing course; I coded along in VSCode. This made me learn quite a lot about conda, environments, etc. With this kind of hands-on learning, I believe I can not only start doing data analysis but also create my own models and understand their predictions. Thank you all so much! 👏 Great work! (And well, it's not for Python noobies... anyone past the basics with most of the libraries used can understand it.)
It seems half of us are here after falling asleep
Ahahaha, that's right! I fell asleep in front of the computer and when I woke up, this video was playing. 🤣
U r not alone lol
lol yeah, but have you noticed the time difference? This happened to no one back when the video was uploaded; that's weird...
What model did you used to predict this?
Absolutely correct😮
Same 🤣 I reckon UA-cam sent it for the ad revenue on a 4h video
To anyone who is starting, I suggest reading this book (Fundamentals of Data Science by Jugal K. Kalita, Swarup Roy, Dhruba K.) up to the machine learning unit, and if you have any doubt, ask GPT; it will definitely help you understand this wonderful course more effectively.
The best introductory machine learning video!
I have tremendous respect for Kylie.
I think it is the clearest and easiest to understand explanation in the world.
Thank you so much.
Simple and straightforward content, helped me make the decision to buy
🎯Course outline for quick navigation:
[00:00-06:57]1. Machine learning and data analysis
-[00:00-00:28]Kylie ying, a physicist and engineer, will teach machine learning to beginners.
-[01:05-02:12]Uci ml repository offers magic gamma telescope dataset for predicting particle type using camera patterns.
-[05:04-05:38]Using 'fdist' and 'class' to label columns in a csv file to a pandas data frame.
-[05:59-06:28]Converting gs to 0s and hs to 1s in data frame class.unique.
[07:01-34:39]2. Machine learning fundamentals
-[08:00-08:29]In supervised learning, 10 features are used to predict the class label g.
-[09:53-10:27]Supervised learning uses labeled inputs to train models and learn outputs.
-[13:14-13:45]Two categories: qualitative and categorical data.
-[23:57-24:24]Data set is divided into training, validation, and testing sets, with distribution like 60%, 20%, and 20% or 80%, 10%, and 10% depending on statistics.
[34:39-44:02]3. Data scaling, splitting, and oversampling for machine learning
-[35:13-35:43]Splitting data into train, valid, and test sets, shuffling with 60% split
-[37:38-38:10]Imported standard scaler from sklearn for data scaling.
-[39:06-39:36]Using numpy to reshape 1d vector y into 2d object for stacking x and y.
-[40:28-40:59]Around 7,000 gammas and 4,000 hadrons, oversampling needed.
-[42:05-42:37]Oversample the smaller class to match the larger class in the dataset.
-[43:43-44:12]Validation and test sets are not oversampled to assess model performance on unlabeled data.
[44:02-57:30]4. K nearest neighbors model and its implementation
-[44:35-45:03]Introducing knn model for predicting family size based on income.
-[49:02-49:34]Using k-nearest neighbors algorithm with a k of 3 or 5 to determine labels for data points.
-[52:37-53:10]Introduction to k-nearest neighbors and using sklearn for implementation.
-[56:39-57:12]Precision score: 77-84%, recall: 68-89%
[57:31-01:36:19]5. Probabilistic classification models
-[57:31-58:09]Analyzing unbalanced test data, improving f1 score to 0.87, accuracy at 82%
-[01:01:35-01:02:06]Probability of having covid, given a positive test, is 96.4%.
-[01:06:00-01:06:32]Probability of positive test given disease is 0.99, probability of having disease is 0.1.
-[01:12:22-01:12:52]Naive bayes assumes independence in calculating joint probability.
-[01:16:45-01:17:15]Using the training set, we apply the map principle to maximize expression for hypothesis selection.
-[01:26:27-01:27:43]Logistic regression fits data to sigmoid function to build a model with multiple features.
-[01:28:56-01:29:33]Model achieves 65% precision, 71% recall, and 77% accuracy, outperforming naive bayes but not knn.
-[01:33:30-01:33:58]Goal: maximize margins in svms for best class separation.
[01:36:21-01:46:45]6. Svms and neural networks
-[01:37:13-01:37:41]Exploring the definition and power of svms through the kernel trick.
-[01:38:23-01:39:17]Achieved 87% accuracy with svm model, class one scores 0.9, covered four classification models: svm, logistic regression, naive bayes, and knn.
-[01:39:44-01:40:12]Neural networks consist of input, hidden, and output layers, each containing neurons.
-[01:42:50-01:43:22]Introduction of sigmoid, tanh, and relu activation functions to prevent collapse of the model into a linear one.
-[01:43:42-01:44:16]During training, the model adjusts based on l2 loss function to reduce error.
-[01:45:53-01:46:26]Updating weight w0 with a new value using a factor alpha.
[01:46:45-02:03:23]7. Neural network training
-[01:46:45-01:47:13]Explaining the use of learning rate and negative gradient in adjusting neural net convergence.
-[01:47:39-01:48:13]Iteratively adjust weights in neural network for machine learning.
-[01:55:58-01:56:24]Validation accuracy improved from 0.77 to around 0.81, and loss is decreasing, indicating positive progress.
-[01:56:50-01:57:27]Discussing grid search for hyperparameters with variations like 64 nodes and 16 nodes, and adjusting learning rate and epochs.
-[02:00:57-02:01:36]Analyzing model performance with loss and accuracy plots.
[02:03:24-02:36:19]8. Model training, evaluation, and regression in ml
-[02:07:02-02:07:37]Best performance achieved with 64 nodes, 0.2 dropout, 0.001 learning rate, and batch size of 64.
-[02:16:09-02:16:46]Using linear regression to minimize error and make predictions for data points.
-[02:17:19-02:17:51]Kylie discusses assumptions, including linearity in data analysis.
-[02:28:04-02:28:32]The root mean squared error allows expressing error in dollars and same unit.
-[02:33:09-02:33:32]High r squared indicates good prediction, adjusted r squared accounts for terms.
[02:36:20-03:12:03]9. Data analysis and regression modeling
-[02:42:59-02:43:25]Data frame modified: wind, visibility, and functional dropped, leaving 6 columns.
-[02:48:10-02:48:42]Demonstrating simple linear regression using temperature data.
-[02:54:17-02:55:15]Improved r squared from 0.4 to 0.52 indicates progress in regression analysis using tensorflow.
-[03:11:18-03:11:45]Neural net underestimates for larger values, linear regressor also used.
[03:12:04-03:22:13]10. Unsupervised learning: k-means clustering
-[03:12:04-03:12:32]Linear regressor is limited in capturing non-linear relationships; suggesting the need for alternative models in certain cases.
-[03:12:58-03:13:29]Unsupervised learning: using k-means clustering to compute k clusters from unlabeled data.
-[03:17:35-03:18:02]Data points recalculated to update centroids for groups.
-[03:21:54-03:22:33]Using expectation maximization, reaching stable clusters allows stopping iteration.
[03:22:15-03:33:52]11. Expectation maximization and principal component analysis
-[03:22:15-03:23:20]Using expectation maximization for unsupervised learning to find data patterns and structure.
-[03:24:10-03:24:42]Pca reduces multi-dimensional data to one dimension to capture key information.
-[03:25:49-03:26:19]Demonstrating distance using pca to find direction in space.
-[03:30:55-03:31:23]Minimizing projection residual to find largest variance dimension in pca.
[03:33:53-03:53:51]12. Unsupervised learning and dimensionality reduction
-[03:33:53-03:34:22]Implementing unsupervised learning on the uci seeds dataset with three types of wheat kernels
-[03:35:06-03:35:51]Importing pandas and seaborn for the specific class.
-[03:45:46-03:46:12]Using k-means, identified 3 classes based on compactness and asymmetry in scatter plot.
-[03:47:54-03:48:18]Pca reduces multiple dimensions into a lower dimension number.
-[03:51:32-03:53:49]Unsupervised learning algorithm predicts three categories fairly well without labels.
-[03:53:11-03:53:37]Algorithm predicts three categories fairly well without labels. demonstrates unsupervised learning.
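To make the 60/20/20 train/valid/test split described around 23:57 and 35:13 concrete, here is a small numpy-only sketch on random stand-in data (the video splits a shuffled dataframe with np.split; the mechanics are the same):

```python
import numpy as np

# Shuffle the rows first so the three partitions are random samples,
# then cut at the 60% and 80% marks.
rng = np.random.default_rng(42)
data = rng.normal(size=(100, 3))  # 100 rows of stand-in data
rng.shuffle(data)                 # shuffles rows in place

n = len(data)
train, valid, test = np.split(data, [int(0.6 * n), int(0.8 * n)])
```

The two indices passed to np.split mark the boundaries, so the three pieces come out as 60%, 20%, and 20% of the rows.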
She is so smart and makes things simple to understand👍
It would be helpful for people without coding experience to explain the libraries being used, the language, and how it all works, or at least the bare minimum required. This seems geared toward people with Python programming experience as a base.
Hello, Kylie! You can't even imagine how much you helped me with this course. I wish you happiness your whole life.
@Freecodecamp you're always teaching us great skills with great mentors. Salute sir 🤗. Kindly upload the electric vehicle course if you have a great mentor available.
Wow! I started learning this morning from your roadmap and the algorithm provided me this video. And this is the best one indeed.
Thank you, Kylie Ying mam.
Stoked to the neuron.
Thanks for the content. This isn't for everyone, though... she speeds through so much, and beginners are just sitting here going "What the hell is going on...". It looks like that was being pointed out early in the comments, and then there seems to be a flood of "this is so clear and fully fleshed out!" comments that have been pushed to the top... that's a bit sus.
No, you actually just need to know programming and some basic math. This is much better than that 9-hour one by that Indian kid.
day 1 (8/7) -> 44:43
day 2 (9/7) -> 58:35
day 3 (10/7) -> 1:39:44
day 4 (11/7) -> 2:34:28
day 5 (12/7) -> 3:53:52 done
Congrats
I love it! I must say this is one of the most comprehensive and well structured videos I've watched lately! Big kudos to Kylie!
Kylie and this course are f**ing AWESOME!!! So clearly explained!! Thank you so much for sharing!
Could you explain what the things are that you are labeling and importing as you are doing them?
It helps me to make connections in my brain if I can associate what you are doing with the reason behind it. Otherwise it is just an exercise in memorizing and just replicating what you are doing without any understanding.
Thank you for sharing and all the effort put into it.
I love how it says "for everybody" while, skipping through the video, all I can see is code and equations.
If you're trying to learn machine learning, you obviously need to know code and some math.
Tysm for covering sooo much so quickly and it was all clear and to the point and I cant appreciate it enoughhhhhh!!!!!
I've watched a lot of ML courses and I believe this is the best one. I'm not blaming other teachers; my learning style just fits this course.
It seems that nowadays TensorFlow's Model.evaluate returns both the loss and the accuracy in a list, so at around 2:05 in the video the comparison val_loss < least_val_loss no longer works. Maybe they changed what the method returns since this video was published. I fixed it by referring to the first element of the list (according to the docs, the first item is the loss and the second the accuracy), like this:
if val_loss[0] < least_val_loss:
least_val_loss = val_loss[0]
At 2:40:12 you rerun the cell with the df['functional'] = (df['functional'] == 'Yes').astype(int) statement. Because after the first run of this cell there is no 'Yes' left in the column, all values end up 0. So you should reconsider your statement at 2:42:14 that "'functional' doesn't give us any utility".
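A tiny reproduction of that pitfall, with a made-up one-column frame, for anyone who wants to see it:

```python
import pandas as pd

# Rerunning df['functional'] = (df['functional'] == 'Yes').astype(int) is not
# idempotent: after the first run the column holds 0/1 ints, so the second
# run compares ints to the string 'Yes' and zeroes everything out.
df = pd.DataFrame({'functional': ['Yes', 'No', 'Yes']})

df['functional'] = (df['functional'] == 'Yes').astype(int)
first = df['functional'].tolist()

df['functional'] = (df['functional'] == 'Yes').astype(int)
second = df['functional'].tolist()
```

This is a general Colab/Jupyter gotcha: any in-place transformation cell silently corrupts the data if executed twice.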
Good morning guys!
Kylie, thanks so much for your time. It was a pleasure learning from you.
Didn't feel like it was for beginners.
This is the best course ever on youtube about machine learning. She knows the fundamentals and knows what to teach. ❤
This is a great course and thank you so much. I was able to understand and learn so many new things! Some topics I had struggled a bit were quite clear through this course.
What programming language did she use in this course?
@@yehiaelhariry9356 Python.
@@yehiaelhariry9356 bruh😂😂😂🤣🤣🤣
Her explanations were extremely helpful for my diabetes machine learning research.
Thanks for making the excellent course!
Did a refresher on ML as it's been a while... Kylie rocks! Delivered with the right amount of theory combined with examples and importantly, clarity! Top marks!!!!!
I’ve programmed in Python for a while now but I understand nothing 😂. I think she lost me when she started doing all of the Data Science stuff and there are multiple vocabulary words that could have been explained more. Overall, no shade to her but I’m probably just a noob
Perfect course for newbies. The teacher explains very well. 🎉
This is so great!! Best machine learning video I have seen by far!!
Good job on this :D
She's excellent. I wish she'd do deep learning tutorials.
Can someone confirm if this goes beyond the basics and actually helps you to build stuff? I’ve done plenty of these courses and most of them only contain basic information about a 100 different things and very little demonstration of actually building a significant project from raw data. So, my fundamentals should be fine after a bit of a brush-up but I need a course that goes beyond.
read the title once again
It's just the basics, not in-depth knowledge.
You are in what we call tutorial hell
Try to build something on your own it will help
The best way to learn is to practice. You can find free datasets online and mess with them to practice skills.
English is not my native language and I am new to machine learning, but it was easy for me to understand you. I've watched many videos about ML, but this is the best.
Hi, I'm sorry, I have a really stupid doubt related to the Bayes' theorem question... Please correct my logic: I thought that P(having disease) = P(true positive) + P(false negative). But according to that, the answer to that question should be 0.1 − 0.01 = 0.09, which it isn't... Is this equation wrong, or can we not add probabilities like that, or... maybe I'm doing it all completely wrong. Please help. Thank you so much.
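One likely source of the confusion above: "true positive rate" and "false negative rate" are conditional probabilities, P(test+ | disease) and P(test− | disease), which partition what happens given disease and so sum to 1, not to P(disease). The joint probabilities (rate × prevalence) are what sum to the prevalence. A quick numeric check in Python, with all numbers hypothetical rather than taken from the video's question:

```python
# Hypothetical numbers, not the exact ones from the video's question.
p_disease = 0.01                      # prevalence, P(disease)
sensitivity = 0.9                     # P(test+ | disease): the true-positive RATE
p_fn_given_disease = 1 - sensitivity  # P(test- | disease): the false-negative RATE

# Conditional rates partition the "given disease" world, so they sum to 1,
# not to P(disease):
print(sensitivity + p_fn_given_disease)  # ~1.0

# The JOINT probabilities are rate * prevalence, and those DO sum back
# to the prevalence:
p_tp_joint = sensitivity * p_disease         # P(test+ AND disease)
p_fn_joint = p_fn_given_disease * p_disease  # P(test- AND disease)
print(p_tp_joint + p_fn_joint)               # ~0.01, i.e. P(disease)
```

So "P(true positive) + P(false negative) = P(disease)" is correct only when both terms are joint probabilities; adding the conditional rates mixes two different sample spaces.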
Great work! Amazing job, just suited for those who want to know how we solve deep learning problems, like the tools and the thought process.
A great introduction to machine learning; even as someone who is in the tech industry, I am not too familiar with machine learning.
THANK YOU for making her course available on this channel !!!!!!! 👍😃👍
Thank you so much for your brilliant tutorials and courses Kylie (please do more!!!)! Could you please recommend some books on the mathematics of machine learning (and books that you found useful when you dived into the subject).
I don't know why, but this helped me sooooo much to finally fall asleep 👍
Thank you a lot for this video. This is very interesting and informative. Keep posting like those amazing videos, this is awesome.
Thanks for sharing such an amazing content!!