Easy ML
Introduction to Iris Data
6,249 views

Videos

Data Normalization using Z-Score technique
20K views · 5 years ago
Data Visualization
2.8K views · 5 years ago
Introduction to R Studio - Part 1
1.2K views · 5 years ago
Introduction to R Studio - Part 2
1K views · 5 years ago
Introduction to Supervised Learning
2.3K views · 5 years ago
Introduction to Correlation Coefficients
2.6K views · 5 years ago
Introduction to Correlation Matrix
52K views · 5 years ago
Evaluating accuracy of Regression Models
17K views · 5 years ago
Evaluating efficiency of Regression Models
1.6K views · 5 years ago
Building Regression Models
6K views · 5 years ago
Regression Models - Step 2 : Splitting Data
1.6K views · 5 years ago
Regression Models - Introduction to Correlation
1.6K views · 5 years ago
Regression Models - Step 1 : Variable Selection (Part 1)
2.5K views · 5 years ago
Regression Models - Step 1 : Variable Selection (Part 2)
1.8K views · 5 years ago
Spurious Correlations - Why we need Regression Models ?
4.3K views · 5 years ago
Introduction to types of Correlation
1.9K views · 5 years ago
Building Random Forest Models
1.4K views · 5 years ago
Evaluating Random Forest Models
1.8K views · 5 years ago
Introduction to Random Forest Models - Understanding Decision Trees (Part 1)
1.3K views · 5 years ago
Random Forest Model - Iris Data
2K views · 5 years ago
Understanding Decision Trees (Part 2)
727 views · 5 years ago
Introduction to Unsupervised Learning
2.1K views · 5 years ago
Evaluating K-Means Cluster Analysis
8K views · 5 years ago
Introduction to Cluster Analysis
2K views · 5 years ago
Introduction to K-means - Choosing number of clusters
16K views · 5 years ago
K-Means Clustering - Iterations
4.9K views · 5 years ago
Evaluating Principal Component Analysis (PCA) - Part 1
1.2K views · 5 years ago
Evaluating Principal Component Analysis (PCA) - Part 2
1.1K views · 5 years ago
Introduction to Principal Component Analysis (PCA)
1.8K views · 5 years ago

COMMENTS

  • @MohitSalvankar-r6z · 26 days ago

    Thanks a lot for these videos. The way you have explained things is commendable. I wonder why you have so few followers.

  • @aaravinthan001 · 1 month ago

    Do we follow the same process for every data set, or does anything change?

  • @ellenessien9481 · 1 month ago

    Very insightful tutorial.

  • @AkshayaS-p9u · 1 month ago

    Link to the previous video, please?

  • @TheUmaragu · 2 months ago

    Nice explanation; I loved the way you used the two extreme cases for clustering.

  • @ldsharma6546 · 2 months ago

    Sir, I have analysed 850 soil samples for different forms of soil acidity and 7 other soil properties such as pH, EC, OC, etc. I would like to predict exchangeable acidity (numeric) from the 7 soil parameters. How should I build the random forest?
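
    A hedged sketch of what a regression random forest for a numeric target such as exchangeable acidity could look like; randomForest() fits a regression forest automatically when the response is numeric. The data frame, column names and simulated values below are illustrative placeholders, not the commenter's data or the video's code.

      library(randomForest)

      # simulated stand-in for the 850 soil samples; the real column names are unknown
      set.seed(123)
      n    <- 850
      soil <- data.frame(pH = runif(n, 4, 8), EC = runif(n, 0.1, 2), OC = runif(n, 0.2, 3),
                         N = rnorm(n), P = rnorm(n), K = rnorm(n), S = rnorm(n))
      soil$ExchAcidity <- 2 - 0.3 * soil$pH + 0.5 * soil$OC + rnorm(n, sd = 0.2)

      # numeric response, so randomForest() runs in regression mode
      rf_fit <- randomForest(ExchAcidity ~ ., data = soil, ntree = 500, importance = TRUE)
      print(rf_fit)            # reports mean squared residuals and % variance explained
      importance(rf_fit)       # ranks which soil properties drive the predictions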

  • @tmitra001 · 2 months ago

    What is nc in the WSS function?

    • @tmitra001 · 2 months ago

      OK, I got what nc is!
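
    For context: nc in the widely circulated wssplot() helper (which the video appears to use a variant of) is the maximum number of clusters to evaluate. A sketch of one common version, assuming numeric, scaled input data:

      # one common form of wssplot(); nc = maximum number of clusters to try
      wssplot <- function(data, nc = 15, seed = 1234) {
        wss <- (nrow(data) - 1) * sum(apply(data, 2, var))   # total WSS when k = 1
        for (k in 2:nc) {
          set.seed(seed)
          wss[k] <- sum(kmeans(data, centers = k)$withinss)
        }
        plot(1:nc, wss, type = "b",
             xlab = "Number of clusters", ylab = "Within-groups sum of squares")
      }

      wssplot(scale(iris[, 1:4]), nc = 10)   # look for the elbow in the curve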

  • @Kavya-0789 · 2 months ago

    Thank you, your video gave me hope that R is easy.

  • @Vesna_Covic · 3 months ago

    why the f do you scream this much...

  • @Swormy097 · 4 months ago

    Thank you so much❤

  • @aneeshkumarkv7792 · 4 months ago

    Made it so simple and illustrative... thanks a lot

  • @kevinshao9148 · 5 months ago

    Hi, what is this "predict(...)" function? Is it from 'randomForest', or is it a built-in R function? Thanks
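
    For readers wondering the same thing: predict() is a generic from base R's stats package, and the randomForest package supplies the predict.randomForest method that is dispatched when you call it on a fitted forest. A minimal sketch on the built-in iris data (assumed setup, not the video's exact code):

      library(randomForest)

      set.seed(42)
      idx   <- sample(nrow(iris), 0.7 * nrow(iris))
      train <- iris[idx, ]
      test  <- iris[-idx, ]

      rf <- randomForest(Species ~ ., data = train)

      # the base-R generic predict() dispatches to predict.randomForest here
      pred <- predict(rf, newdata = test)
      table(Predicted = pred, Actual = test$Species)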

  • @mehrananjum5501 · 6 months ago

    Hello, can you help me, please?

  • @expeditadjovi9927 · 6 months ago

    Good explanation.

  • @heznadastudios · 6 months ago

    What if I'm trying to load my own data in?

  • @gauravdeshmukh9451 · 7 months ago

    Can you please make this same video but in Python...

  • @emirhandemir3872 · 8 months ago

    My autoplot(KM, mydata, frame = TRUE) doesn't work: I run it and nothing happens. I should point out that I didn't run the wssplot function, though. Is it because of that? Edit: I fixed it; it was my lack of R syntax knowledge. I am running R in VS Code, and VS Code doesn't print the variables unless they are wrapped in a print() call.
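
    The behaviour described above is R's auto-printing rule: a top-level expression prints its result in an interactive console, but inside a sourced script, a loop or some editor setups the ggplot object returned by autoplot() is silently discarded unless you print() it explicitly. A minimal sketch, assuming the ggfortify package is what provides the autoplot() method for k-means objects here:

      library(ggfortify)   # autoplot() methods for kmeans results (assumed package)

      mydata <- scale(iris[, 1:4])            # stand-in for the commenter's data
      KM     <- kmeans(mydata, centers = 3)

      p <- autoplot(KM, data = mydata, frame = TRUE)
      print(p)   # explicit print() renders the plot even when auto-printing is suppressed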

  • @ibrahimnahushal9353 · 8 months ago

    Good job, short and straight to the point. Thank you.

  • @taisepinheiro8747 · 9 months ago

    Hi, thank you very much for sharing this video :) It is the only tutorial that I was able to follow. One question: my predicted variable is not categorical, it is an area of deforestation. Can I still use the code you shared in this video?

  • @noble7578 · 9 months ago

    Excellent explanation! Thank you!

  • @prachimehta8634 · 9 months ago

    So glad to see 85k views!

  • @estrellitalinda7658 · 10 months ago

    How can I incorporate upsampling or downsampling before running the random forest model? Need help, please.
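
    One common way to do this (not shown in the video) is the upSample()/downSample() helpers from the caret package, applied to the training set before fitting the forest. A sketch on an artificially imbalanced slice of iris:

      library(caret)            # upSample() / downSample()
      library(randomForest)

      # fake an imbalance: all 50 setosa rows, but only 10 rows each of the other species
      imbal <- iris[c(1:50, 51:60, 101:110), ]

      balanced <- upSample(x = imbal[, 1:4], y = imbal$Species, yname = "Species")
      table(balanced$Species)   # minority classes are resampled up to the majority size

      rf <- randomForest(Species ~ ., data = balanced, ntree = 500)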

  • @marshmellowmash · 10 months ago

    This whole series was so helpful - thank you so much!

  • @michaelogunmakin9143 · 11 months ago

    Super helpful, thanks!

  • @Aloneincrowd3 · 11 months ago

    The mean of my cor(data) is 0.45. Is it not eligible for PCA? What should I use for variable selection then?

  • @mikemartinez5970 · 11 months ago

    Excellent video!! Thank you so much

  • @prachimehta8634 · 11 months ago

    29k views! 👏

  • @jakubosobka4414 · 1 year ago

    Worst educational video I've seen in a while. And I've watched thousands.

  • @FrOsTyBeArKiD · 1 year ago

    This was very, very clear. I enjoyed the examples showing the extreme scenarios to make the optimum case hit home.

  • @halilzelenka5813 · 1 year ago

    Hansel is a thicc boy

  • @nehachaudhuri8475 · 1 year ago

    You have some great videos. Why have you stopped posting videos?

  • @BhavaniSc · 1 year ago

    This was so helpful, thank you.

  • @codecrafts5263 · 1 year ago

    After 2 hours of surfing the internet on the subject, I found this video and it clarified the concept in 3.11 minutes. Really, thank you.

  • @oshadeegunawardhana388 · 1 year ago

    Nice explanation ❤

  • @halagundegowdagr2877 · 1 year ago

    Sir, you are a good TEACHER. Short and sweet explanation, superb.

  • @Think0Like0Cheese · 1 year ago

    us data analysts learning more than necessary about plant terminology..

  • @ofirshorshy8281 · 1 year ago

    Useful, thank you. I will try this.

  • @elifceyhan78 · 1 year ago

    Hello, thanks for the great video! It is helpful. What do you suggest for a stability test in R? Which function can I use?

  • @muh.anugrahpratama1752 · 1 year ago

    Bro, can this also be used with image data or raster data?

  • @alinaastakhova8412 · 1 year ago

    Thank you very, very much! It is certainly one of the best explanations, very helpful! But I have some questions. 1) I cannot understand what we do next with the PCs: should I use them for multiple regression along with other variables, or for clustering? 2) Can I re-evaluate the impact of variables using the loadings? For instance, I use 5 variables to build the PCs; PC1 and PC2 describe about 85% of the variability, but neither connects to, let's say, the 3rd variable. Maybe I should delete the 3rd variable and rerun the analysis with only the other 4: would this improve the outcome? 3) Why are some spaces blank in the loadings (minute 6:28 in the video), like Sepal.Width vs Comp.1? 4) Finally, body mass index is given as an example of a PCA outcome. Does that mean we can retrieve PC1 and name it as a sort of new stable variable? Or is BMI just an example of dimension reduction that does not correspond to PCA directly? That's a lot of questions, but I really wonder...

    • @easyml1234 · 1 year ago

      Hey, thank you for such a detailed comment. At times I wish YouTube allowed voice notes, because typing is rather ineffective here. BMI is indeed just an example. The main application of PCA is to capture the essence of a large list of variables in fewer, newly generated variables. Assume you have bank data: there is a list of 120 variables and you need to predict loan delinquency. Inputting 120 variables and then tuning the model would be cumbersome; in such instances you can deploy PCA to reduce the list from 120 to, say, 12, and rest assured that if you have executed the PCA well, these 12 variables will have correctly captured the essence of the original 120. The model you then build with this new set of PCs will be lighter and faster. You can deploy PCA before unsupervised learning as well! (A code sketch of this workflow follows the thread below.)

    • @alinaastakhova8412 · 1 year ago

      @@easyml1234 Thanks! My point is: maybe I don't need all 120. Maybe I should extract 40 for one PCA and another 50 for a second PCA, discard the remaining 30, and get a better sketch of my "Eiffel tower" (that was a great example). At the least, I might get better values for my PC1 and PC2 equations. Do we do that, reduce the data for several PCAs? And once I have the results of several PCAs, how do I interpret them: as variables for regression or as a basis for clustering?

    • @sutanukadas9049 · 1 year ago

      @@easyml1234 Can this method be used for categorical data?

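    The sketch referenced in the channel's reply above: run PCA, keep the first few components by explained variance, and feed them into a downstream model. The data set, the choice of two components and the k-means follow-up are illustrative assumptions, not the video's exact code.

      # PCA-then-model workflow on stand-in data (the numeric iris columns)
      num_vars <- iris[, 1:4]
      pca      <- prcomp(num_vars, center = TRUE, scale. = TRUE)
      summary(pca)                            # proportion of variance explained per component

      scores <- as.data.frame(pca$x[, 1:2])   # keep the first 2 PCs (chosen by explained variance)

      # the compressed PCs can feed a supervised model or, as here, a clustering step
      km <- kmeans(scores, centers = 3)
      table(Cluster = km$cluster, Actual = iris$Species)
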
  • @manifestationmaster1111 · 1 year ago

    Thank you man, you finally made it make sense!

    • @easyml1234 · 1 year ago

      Thanks a lot for your comment. Like, share and subscribe 🙏

  • @umermuhammad826 · 1 year ago

    Hello EasyML, awesome tutorial video. I wanted to know how we can determine the individual members of each cluster. What does each of the blue and orange dots represent on the autoplot? Please help. Thanks in advance.

  • @robtaylor2781 · 1 year ago

    Thanks very much. Well explained and very useful.

  • @AnkitKumar-rx7ky · 1 year ago

    Sir, I feel there is some problem with the explanation of the confusion matrix here. The axes need to be reversed.

    • @easyml1234 · 1 year ago

      1:14 - the number of predicted females is 15 and the number of actual females is also 15, so that is right. I guess the confusion stems from the fact that there are 3 values, or boxes, with 15; if I had filled them with different values it would have been easier to follow. The logic is correct, but because of the similar numbering there could be some confusion. Anyway, thanks for this!
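
      As the reply above notes, which dimension holds "predicted" and which holds "actual" is only a labelling convention. A hedged sketch of building a confusion matrix with base R's table(), naming both dimensions explicitly so the orientation is unambiguous; the vectors are made up for illustration:

        # made-up predictions and actuals, just to show the orientation of table()
        actual    <- factor(c("F", "F", "F", "M", "M", "M", "M", "F"))
        predicted <- factor(c("F", "F", "M", "M", "M", "F", "M", "F"))

        # rows are predicted classes, columns are actual classes; correct predictions sit on the diagonal
        cm <- table(Predicted = predicted, Actual = actual)
        cm

        accuracy <- sum(diag(cm)) / sum(cm)   # 6 of 8 on the diagonal, i.e. 0.75
        accuracy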

  • @kabberkartuj8924 · 1 year ago

    Very helpful playlist. Thanks a lot.

  • @zin6487 · 1 year ago

    Thanks, sir.

  • @salmaasghar1674 · 1 year ago

    EXCELLENT tutorial, I must say... you are born with extraordinary God-gifted abilities, my dear.

  • @salmaasghar1674 · 1 year ago

    I had to stop the video to appreciate you for such a wonderful tutorial.

  • @easyml1234 · 1 year ago

    Full video here: ua-cam.com/video/NfIM9pUH9DA/v-deo.html

  • @Dipsree · 1 year ago

    How do you determine what each cluster represents?