Machine Learning Tutorial Python 12 - K Fold Cross Validation

  • Published 29 Nov 2024

COMMENTS • 603

  • @codebasics
    @codebasics  2 years ago +3

    Check out our premium machine learning course with 2 Industry projects: codebasics.io/courses/machine-learning-for-data-science-beginners-to-advanced

  • @MrSparshtiwari
    @MrSparshtiwari 3 years ago +122

    After watching so many different ML tutorial videos, and literally so many, I have just one thing to say: the way you teach is the best among all of them.
    Take any famous one, like Andrew Ng or sentdex: you literally need prerequisites to understand their videos, while yours are a treat to the viewers, explained from the basics and slowly building up. And those exercises are the cherry on top.
    Never change your teaching style, sir; yours is the best one.👍🏻

  • @beansgoya
    @beansgoya 5 years ago +30

    I love that you go through the example the hard way and introduce cross validation afterwards.

  • @AltafAnsari-tf9nl
    @AltafAnsari-tf9nl 4 years ago +12

    Couldn't ask for a better teacher to teach machine learning. Truly exceptional!!!! Thank you so much for all your efforts.

  • @beerusreal6
    @beerusreal6 3 years ago +2

    I have never seen anyone who can explain Machine Learning and Data Science so easily.
    I used to be scared of Machine Learning and Data Science, but after seeing your videos, I am now confident that I can do it by myself. Thank you so much for all these videos....
    👏👏👏👏👏👏👏👏👏👏👏👏👏👏👏👏👏👏👏👏👏👏👏👏👏👏👏👏

  • @tatendaVIDZ90
    @tatendaVIDZ90 2 years ago +1

    That approach of doing manually what cross_val_score does in the background, and then introducing the method! Godsend! Brilliant. Brilliant, I say!

  • @pablu_7
    @pablu_7 4 years ago +6

    Thank you Sir for this awesome explanation. Iris Dataset Assignment Score:
    Logistic Regression [96.07%, 92.15%, 95.83%]
    SVM [100%, 96.07%, 97.91%] (kernel='linear')
    Decision Tree [98.03%, 92.15%, 100%]
    Random Forest [98.03%, 92.15%, 97.91%]
    Conclusion: SVM works best for me.

    • @pranjaysingh4161
      @pranjaysingh4161 11 months ago

      pretty ironic and yet amusing at the same time

  • @rajnichauhan1286
    @rajnichauhan1286 4 years ago +64

    What an amazing explanation. Finally! I understood the cross validation concept so clearly. Thank you so much.

  • @The_TusharMishra
    @The_TusharMishra 10 months ago +6

    He did folds = StratifiedKFold(), and said that he will use it because it is better than KFold,
    but at 14:20 he used kf.split, where kf is KFold.
    I think he forgot to use StratifiedKFold.
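
    For reference, a minimal sketch of what the corrected loop might look like, assuming scikit-learn and the digits dataset used in the video (variable names are illustrative):

    from sklearn.datasets import load_digits
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import StratifiedKFold

    digits = load_digits()
    folds = StratifiedKFold(n_splits=3)

    scores = []
    # StratifiedKFold.split needs both X and y so it can balance class proportions per fold
    for train_index, test_index in folds.split(digits.data, digits.target):
        X_train, X_test = digits.data[train_index], digits.data[test_index]
        y_train, y_test = digits.target[train_index], digits.target[test_index]
        model = LogisticRegression(max_iter=1000)
        model.fit(X_train, y_train)
        scores.append(model.score(X_test, y_test))
    print(scores)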

  • @nuraishahzainal1660
    @nuraishahzainal1660 2 years ago +6

    Hi, I'm from Malaysia. I came across your videos and I am glad I did. Super easy to understand, and I'm currently preparing to learn deep learning. I have already watched your Python and Pandas videos and am currently on the ML ones. Thank you for making all these videos; you are making our lives easier, Sir.
    Sincerely, your student from Malaysia.

  • @codebasics
    @codebasics  4 years ago +4

    Exercise solution: github.com/codebasics/py/blob/master/ML/12_KFold_Cross_Validation/Exercise/exercise_kfold_validation.ipynb
    Complete machine learning tutorial playlist: ua-cam.com/video/gmvvaobm7eQ/v-deo.html

    • @hemenboro4313
      @hemenboro4313 4 years ago

      We needed to use mean() with cross validation to get the average accuracy score. I'm guessing you forgot to add it. Anyway, the video is pretty good and in depth. Keep producing such videos.

  • @vishalrai2859
    @vishalrai2859 3 years ago +1

    The only channel with pure quality, no beating around the bush. Thanks, Dhaval sir, for your contribution.

  • @imposternaruto
    @imposternaruto 3 years ago +1

    My teacher is frustratingly bad. I am learning from your videos so that I can get a good grade in my class. Thank you for taking the time to demonstrate what is happening. When you showed the example at 10:47, I finally understood.

  • @pablu_7
    @pablu_7 4 years ago +22

    After parameter tuning using cross validation (cv=10) and taking the average:
    Logistic Regression = 95.34%
    SVM = 97.34%
    Decision Tree = 95.34%
    Random Forest Classifier = 96.67%
    Performance: SVM > Random Forest > Logistic ~ Decision

    • @manu-prakash-choudhary
      @manu-prakash-choudhary 3 years ago

      After taking cv=5 and C=6, SVM is 98.67%.

    • @sriram_cyber5696
      @sriram_cyber5696 1 year ago

      @@manu-prakash-choudhary After 50 splits 😎😎
      Score of Logistic Regression is 0.961111111111111
      Score of SVM is 0.9888888888888888
      Score of RandomForestClassifier is 0.973111111111111
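
      For anyone reproducing these numbers, a minimal sketch using scikit-learn's cross_val_score on the iris dataset and averaging the per-fold scores (the parameter values here are illustrative):

      import numpy as np
      from sklearn.datasets import load_iris
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVC

      iris = load_iris()
      # cv=10 -> ten folds; the result is one accuracy score per fold
      scores = cross_val_score(SVC(C=6), iris.data, iris.target, cv=10)
      print(scores.mean())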

  • @nicoleluo6692
    @nicoleluo6692 1 year ago +1

    🌹 You are way, way... way better than all of my machine learning professors at school!

  • @pappering
    @pappering 4 years ago +10

    Thank you very much. Very nice explanation. My scores, after taking averages, are as follows:
    LogisticRegression (max_iter=200) = 97.33%
    SVC (kernel = poly) = 98.00%
    DecisionTreeClassifier = 96%
    RandomForestClassifier (n_estimators=300) = 96.67%

  • @mastijjiv
    @mastijjiv 5 years ago +10

    Your videos are AMAZING man!!! I have already recommended these videos to my colleagues at my university who are taking a machine learning course. They are loving them too...!!! Keep it up champ!

    • @codebasics
      @codebasics  5 years ago +2

      Mast pelluri, I am glad you liked it and thanks for recommending it to your friends 🙏👍

  • @panagiotisgoulas8539
    @panagiotisgoulas8539 2 years ago +2

    For the parameter tuning this helps. Just play a bit with the indexes, since lists start from 0 and n_estimators from 1, to match them up.
    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.datasets import load_digits
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    digits = load_digits()
    scores = []
    avg_scores = []
    n_est = range(1, 5)  # example
    for i in n_est:
        model = RandomForestClassifier(n_estimators=i)
        score = cross_val_score(model, digits.data, digits.target, cv=10)
        scores.append(score)
        avg_scores.append(np.average(score))
        print('avg score:{}, n_estimator:{}'.format(avg_scores[i-1], i))
    avg_scores = np.asarray(avg_scores)  # convert the list to an array
    print('\nAverage accuracy score is {} for n_estimators={}, calculated from the following accuracy scores:\n{}'.format(
        np.amax(avg_scores), np.argmax(avg_scores) + 1, scores[np.argmax(avg_scores)]))
    plt.plot(n_est, avg_scores)
    plt.xlabel('number of estimators')
    plt.ylabel('average accuracy')
    44 was the best for me

  • @ricardogomes9528
    @ricardogomes9528 3 years ago +1

    Finally a video explaining the X_train, X_test, y_train, y_test. Thank you!

  • @christiansinger2497
    @christiansinger2497 5 years ago +4

    Thanks man! You're really helping me out finishing my university project in machine learning.

    • @codebasics
      @codebasics  5 years ago +3

      Christian I am glad to hear you are making progress on your university project 😊 I wish you all the best 👍

  • @rajadurai7336
    @rajadurai7336 1 year ago +1

    LogisticRegression was the best model on the Iris dataset.
    I got an accuracy of 97.3%, compared to other models such as SVM and RandomForestClassifier.

  • @jinks3669
    @jinks3669 2 years ago

    Thank you, Sir. May God always keep you healthy and happy.
    You are my god.

  • @piyushbarthwal1722
    @piyushbarthwal1722 1 year ago

    Don't have any words; your teaching style and knowledge are amazing ✨...

  • @mohammadpatel2569
    @mohammadpatel2569 5 years ago +1

    Your videos on machine learning are way better than any online paid videos, so keep growing..

  • @parikshitshahane6799
    @parikshitshahane6799 4 years ago +1

    Probably the best machine learning tutorials out there... Very good job
    Thanks!

  • @maruthiprasad8184
    @maruthiprasad8184 2 years ago

    Thank you very much for the excellent explanation. I got accuracy SVC = 98.04%, RandomForestClassifier(n_estimators=30) = 98.04%,
    LogisticRegression(max_iter=200) = 96.08%

  • @hpourmamih
    @hpourmamih 4 years ago +5

    This is one of the best explanations of KFold Cross Validation!!!
    Thank you so much for sharing this valuable video . :))

  • @synaestheticVI
    @synaestheticVI 4 years ago +1

    What an excellent video, thank you! I got lost in other written tutorials; this was finally a clear explanation!

    • @codebasics
      @codebasics  4 years ago

      Hey, thanks for the comment. Keep learning. 👍

  • @josephnnodim8244
    @josephnnodim8244 3 years ago

    This is the best video I have watched on Machine learning. Well done!

  • @oscarmuhammad4072
    @oscarmuhammad4072 3 years ago

    This is an EXCELLENT explanation. Straightforward and simplified.... Thank you.

  • @sarangabbasi2560
    @sarangabbasi2560 2 years ago +2

    Best explanation... I like the way you give examples using small data to explain how it actually works. 10:20
    No one explains like this... keep doing the great work.

    • @codebasics
      @codebasics  2 years ago

      Glad you liked it

    • @strongsyedaa7378
      @strongsyedaa7378 2 years ago

      @@codebasics
      I have applied K fold on a linear regression dataset.
      I used different activation functions & then I got mean & SE values.
      How do I pick the best model from the k folds?

  • @KK-rh6cd
    @KK-rh6cd 3 years ago

    I watched several videos on CV, but your video is well explained. Thank you, thank you very much sir; keep uploading videos sir.

  • @learnwithfunandenjoy3143
    @learnwithfunandenjoy3143 5 years ago

    AWESOME AWESOME..... Excellent video you have created. I've been learning ML for more than a year and have watched almost 400 videos. Your videos are AWESOME.... Please make a complete series on ML... Thanks.

    • @codebasics
      @codebasics  5 years ago

      Pankaj I am happy it has helped you. :) And yes, I am in the process of uploading many new tutorials on ML. Stay tuned!

  • @beerusreal6
    @beerusreal6 3 years ago

    I am close to finishing your videos, and then I'm going to hop into your Machine Learning and Data Science projects... 😊😊😊😊😊😊😊😊😊😊😊

  • @layanibandaranayake9406
    @layanibandaranayake9406 3 years ago

    The best and the simplest explanation of cross validation I could find after so much searching! Keep up the good work!

  • @zunairnoor2745
    @zunairnoor2745 1 year ago +2

    Thanks sir! Your tutorials are really helpful for me. I hope to watch all of them and make my transition from mechanical to AI successful 😊.

  • @Gamesational1
    @Gamesational1 3 years ago +1

    Useful for identifying many different types of categories.

  • @naveenkalhan95
    @naveenkalhan95 4 years ago +7

    @20:39 of the video, I noticed something interesting: by default the cross_val_score() method generated 3 folds... but the default has now changed from 3 to 5 :))

    • @gandharvsaxena8841
      @gandharvsaxena8841 3 years ago +2

      Thanks man, I was worried when mine was showing 5-fold results. I thought something was wrong with my code.

    • @khalidalghamdi6303
      @khalidalghamdi6303 2 years ago

      @@gandharvsaxena8841 Me too lol, which is why I am getting 5.

    • @aadilsstatus8895
      @aadilsstatus8895 2 years ago

      Thank you, man!!
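
      To make the fold count explicit so the default change doesn't surprise you, a minimal sketch assuming scikit-learn and the iris dataset (the default cv became 5 in scikit-learn 0.22):

      from sklearn.datasets import load_iris
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      iris = load_iris()
      # pass cv explicitly instead of relying on the library default
      scores = cross_val_score(LogisticRegression(max_iter=1000), iris.data, iris.target, cv=3)
      print(len(scores))  # 3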

  • @barackonduto5286
    @barackonduto5286 3 years ago

    You are a great instructor and explain concepts in a very understandable and relatable manner. Thank you

    • @codebasics
      @codebasics  3 years ago

      I am happy this was helpful to you.

  • @knbharath5947
    @knbharath5947 5 years ago +6

    Great stuff indeed. I'm learning machine learning from scratch and this was very helpful. Keep up the good work, kudos!

  • @yoyomovieclips8813
    @yoyomovieclips8813 4 years ago

    You solved one of my biggest confusions..... Thanks a lot sir.

  • @rahuljaiswal9379
    @rahuljaiswal9379 5 years ago +1

    Very simple and lovely teaching... you are simple and great... thank you so much sir.

    • @codebasics
      @codebasics  5 years ago +1

      Thanks Rahul for your kind words of appreciation.

  • @anirbanc88
    @anirbanc88 1 year ago

    Thank you so much, I am so grateful for a teacher like you.

  • @21_koustavbanerjee69
    @21_koustavbanerjee69 1 year ago +1

    In the exercise, the maximum score is achieved by SVM with gamma='auto' and kernel='linear', and the score is array([1., 1., 0.98]) 😀

  • @anujvyas9493
    @anujvyas9493 4 years ago +43

    14:15 - Here, instead of kf.split() we should use folds.split(). Am I correct??

    • @codebasics
      @codebasics  4 years ago +14

      Yes. My notebook has a correction. Check it on the GitHub link I have provided in the video description.

    • @Thedevineforce
      @Thedevineforce 4 years ago +12

      Yes, and just to add to it: StratifiedKFold requires both X and y labels in its split method. Stratification is done based on the y labels.
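
      A small illustration of that difference, assuming scikit-learn (the toy data is illustrative):

      import numpy as np
      from sklearn.model_selection import KFold, StratifiedKFold

      X = np.arange(20).reshape(10, 2)
      y = np.array([0] * 5 + [1] * 5)

      kf = KFold(n_splits=5)
      skf = StratifiedKFold(n_splits=5)

      next(kf.split(X))      # KFold ignores the labels; y is optional
      next(skf.split(X, y))  # StratifiedKFold needs y to keep class ratios per fold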

  • @kmchentw
    @kmchentw 3 years ago +2

    Thanks for the very useful and free tutorial series. Salute to you sir!

  • @programmingwithraahim
    @programmingwithraahim 3 years ago

    The best score in my case is of Logistic Regression 97.33%
    Excellent Machine Learning Tutorials.

    • @codebasics
      @codebasics  3 years ago

      Good job Raahim, that’s a pretty good score. Thanks for working on the exercise

    • @programmingwithraahim
      @programmingwithraahim 3 years ago

      @@codebasics Thanks sir

  • @ramandeepbains862
    @ramandeepbains862 2 years ago +2

    Sir, SVM performance is higher compared to other algorithms after changing the parameter gamma='scale' for the given digits dataset example.

  • @cindinishimoto9528
    @cindinishimoto9528 4 years ago +9

    My results (with final average):
    L. Regression --> 97.33%
    Decision Tree --> 96.66%
    SVM --> 98.00% [THE WINNER]
    Random Forest --> 96.66%

  • @MNCAMANI15
    @MNCAMANI15 3 years ago

    So simple. You're a good teacher

  • @neel_in_germany
    @neel_in_germany 5 years ago

    Excellent explanation of cross-validation and parameter tuning...

    • @codebasics
      @codebasics  5 years ago

      Thanks for feedback Subhronil.

  • @sidduhedaginal
    @sidduhedaginal 4 years ago

    Amazing explanation, sir. I performed cross_val_score; below is the final average result (10 folds):
    Logistic Regression - 95%
    SVM - 98% [performed best]
    Decision Tree - 95%
    Random Forest - 96%

    • @codebasics
      @codebasics  4 years ago

      Good job. Those scores are indeed excellent.

  • @Hiyori___
    @Hiyori___ 3 years ago +2

    your tutorial are saving my life

  • @arfazkhankhan74
    @arfazkhankhan74 1 year ago +1

    After watching the video for 25 mins, I realized that the last 5mins were the most important😄

  • @simaykazc1508
    @simaykazc1508 4 years ago

    Thanks for creating rather authentic content on this topic compared to others. It is much clearer!

  • @abdulazizalqallaf1704
    @abdulazizalqallaf1704 4 years ago

    Best Explanation I have ever seen. Outstanding job!

    • @codebasics
      @codebasics  4 years ago

      I am happy this was helpful to you

  • @hridayborah9750
    @hridayborah9750 4 years ago

    Nice and helpful. A video with a practice session is more helpful than just a lecture without practice.

  • @alerttrade2356
    @alerttrade2356 4 years ago +1

    Thank you. This video solved so many questions at once. Nicely done.

  • @kanyadharani6844
    @kanyadharani6844 3 years ago

    Super clear explanation. I have been searching for this one; watching this video made it click for me, thank you.

  • @helloonica8515
    @helloonica8515 3 years ago

    This is the most helpful video regarding this topic. Thank you so much!

  •  4 years ago

    OMG!! This is one of your best, sir :), may Lord Shiva bless you for your service.

  • @Kishor_D7
    @Kishor_D7 3 months ago

    Using the same datasets makes it less interesting, but your tutorials are awesome. Every tutorial on every topic has its pluses and minuses; yours are more structured, but the minus point is the reuse of the same dataset, which reduces the interest to keep going.

  • @jatinkumar4410
    @jatinkumar4410 4 years ago

    Thank you very much sir for this very nice explanation. My results are:
    Logistic Regression=95.33%
    SVM=97.33%
    Decision Tree=96.67%
    Random Forests(40 estimators)=96.67%

  • @Suigeneris44
    @Suigeneris44 4 years ago +1

    Your videos are really good! The explanation is crisp and succinct! Love your videos! Keep posting! By the way, you may not realize it, but you are changing people's lives by educating them! Jai Hind!

  • @ashutoshsrivastava914
    @ashutoshsrivastava914 3 years ago

    Good explanation. Gained some confidence to enhance my skills in this area.

  • @RAKESHKUMAR-rb8dv
    @RAKESHKUMAR-rb8dv 10 months ago

    00:02 K fold cross validation helps determine the best machine learning model for a given problem.
    02:20 K-fold cross validation provides a more robust evaluation of machine learning models.
    04:36 Classifying handwritten characters into ten categories using different algorithms and evaluating performance using k-fold cross validation.
    07:06 K-fold cross validation helps in more robust model evaluation.
    09:43 K-fold cross validation divides data into training and testing sets for iterative model evaluation.
    12:35 Stratified k-fold ensures uniform distribution of categories for better model training.
    15:42 Measuring the performance of models in each iteration
    18:29 Parameter tuning in random forest classifier improves scores.
    20:46 K Fold Cross Validation helps measure the performance of machine learning models.
    23:18 Cross-validation helps in comparing algorithms and finding the best parameters for a given problem.
    25:18 K Fold Cross Validation helps in assessing the model's performance.
    Crafted by Merlin AI.

  • @deepanshudutta8414
    @deepanshudutta8414 4 years ago

    Sir, really a very good explanation... finally I understood it very well.....

  • @himadrijoshi3745
    @himadrijoshi3745 4 years ago

    Following your tutorials is the best way to learn machine learning techniques. Please upload a video explanation of KNN as well.

  • @milanms4593
    @milanms4593 3 years ago

    Now I understand this concept. Thank you sir 😃

    • @codebasics
      @codebasics  3 years ago

      I am happy this was helpful to you.

  • @garggarg406
    @garggarg406 4 years ago

    Sir, you are doing an amazing job... I am becoming your fan now...👑

    • @codebasics
      @codebasics  4 years ago +1

      Thank you so much Ayushi 😀

  • @mvcutube
    @mvcutube 3 years ago

    Great. You made things look very easy and that boosts confidence. Thank you.

  • @ayushmuley1907
    @ayushmuley1907 1 month ago

    You are the best teacher 😊

  • @yashm1735
    @yashm1735 2 years ago

    Somehow when I tried this, SVM did better than all the other classifiers XD

  • @navjotsingh-hl1jg
    @navjotsingh-hl1jg 1 year ago

    love your teaching pattern sir

  • @williamwambua7710
    @williamwambua7710 4 years ago

    If I were rich I would have sent you a token of appreciation... Thank you for the content.

    • @codebasics
      @codebasics  4 years ago

      No worries! If you feel my videos have benefited you, you can spread the word: share the channel on your LinkedIn, Facebook, etc. That way the maximum number of people can benefit from it.

  • @cantanzim6215
    @cantanzim6215 4 years ago +1

    It is an amazing explanation, great job...

  • @WahranRai
    @WahranRai 5 years ago +2

    What is the score?
    Cross validation is about validation of ONE model.
    After validating the model and getting its parameters, you should choose a method to compare it with other models and select the appropriate model.
    - Training set: a set of examples used for learning, that is, to fit the parameters of the classifier.
    - Validation set: a set of examples used to tune the parameters of a classifier, for example to choose the number of hidden units in a neural network.
    - Test set: a set of examples used only to assess the performance of a fully specified classifier.

    • @codebasics
      @codebasics  5 years ago

      You can use cross validation to compare multiple models too. Basically, just run k-fold on multiple models, or on the same model with different parameters, and compare the scores.
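
      A minimal sketch of that second case, the same model with different parameters, assuming scikit-learn and the digits dataset from the video:

      import numpy as np
      from sklearn.datasets import load_digits
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVC

      digits = load_digits()
      # same data and same fold scheme, so the averages are directly comparable
      for kernel in ['linear', 'rbf', 'poly']:
          scores = cross_val_score(SVC(kernel=kernel), digits.data, digits.target, cv=5)
          print(kernel, np.mean(scores))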

  • @michaelcarlson2058
    @michaelcarlson2058 3 years ago

    thank you for this video. Excellent presentation of the material with clear explanations

    • @codebasics
      @codebasics  3 years ago

      Michael, I am happy you find it useful

  • @shamsiddinparpiev51
    @shamsiddinparpiev51 4 years ago +1

    Greatly explained man. Thank you

  • @adnanax
    @adnanax 4 years ago

    Building a DataFrame (X, y) first:
    np.mean(cross_val_score(LogisticRegression(max_iter=200), X, y))
    0.9733
    np.mean(cross_val_score(SVC(kernel='linear'), X, y))
    0.98
    np.mean(cross_val_score(RandomForestClassifier(n_estimators=40), X, y))
    0.96
    Using iris.data and iris.target directly:
    np.average(score_lr)
    0.95333
    np.average(score_svm)
    0.98000001
    np.average(score_rf)
    0.95333333

  • @AjayKumar-uy3tp
    @AjayKumar-uy3tp 3 years ago +10

    Sir,
    you used KFold (kf) instead of StratifiedKFold (folds) in the video.
    Will there be any difference in the scores if we use StratifiedKFold?

    • @Zencreate
      @Zencreate 1 year ago

      There is a slight difference in the scores.

  • @carpingnyland8518
    @carpingnyland8518 2 years ago +6

    Great video, as usual. Quick question: how were you able to get such low scores for SVM? I ran it a couple of times and was getting scores in the upper 90s. So I set up a for loop, ran 1000 different train_test_split iterations through SVM, and recorded the lowest score. It came back 97.2%!

  • @mrsonguku
    @mrsonguku 1 year ago

    One tip for avoiding writing the same code to test different models: store all the models in a list/dict and loop through it.
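
    A sketch of that tip, assuming scikit-learn and the iris dataset (the model choices are illustrative):

    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    iris = load_iris()
    models = {
        'logistic_regression': LogisticRegression(max_iter=200),
        'svm': SVC(kernel='linear'),
        'random_forest': RandomForestClassifier(n_estimators=40),
    }
    # one loop instead of one copy-pasted cell per model
    for name, model in models.items():
        scores = cross_val_score(model, iris.data, iris.target, cv=5)
        print('{}: {:.4f}'.format(name, scores.mean()))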

  • @12121sk
    @12121sk 3 years ago

    I learnt K Fold Cross Validation from here!!

  • @nilupulperera
    @nilupulperera 4 years ago +2

    Dear Sir,
    Another great explanation as always.
    Thank you very much for that.
    By adding the following code, SVM started showing very good scores!
    X_train = preprocessing.scale(X_train)
    X_test = preprocessing.scale(X_test)
    Have I done the correct thing?
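
    A note on that snippet: scaling train and test independently computes separate statistics for each set. The usual pattern (a sketch, assuming scikit-learn; the toy data is illustrative) fits the scaler on the training data only and reuses it on the test data:

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler

    X = np.random.rand(100, 4)        # toy feature matrix
    y = np.random.randint(0, 2, 100)  # toy labels
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

    scaler = StandardScaler()
    X_train = scaler.fit_transform(X_train)  # learn mean/std from the training data only
    X_test = scaler.transform(X_test)        # apply the same statistics to the test data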

  • @hansvasquezm
    @hansvasquezm 1 year ago +1

    Really good explanation. You are an expert. I have a question: is it possible to select the test_size in cross-validation? When I use, for example, KFold with 3 splits, it splits the whole data into three parts; is it possible to make three splits but with 2 data points for testing and 7 for training?
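
    For what it's worth, scikit-learn's ShuffleSplit allows exactly that: a chosen number of splits with an explicit test and train size (a sketch; the 9-sample data is illustrative):

    import numpy as np
    from sklearn.model_selection import ShuffleSplit

    X = np.arange(18).reshape(9, 2)  # 9 samples
    ss = ShuffleSplit(n_splits=3, test_size=2, train_size=7, random_state=0)
    for train_index, test_index in ss.split(X):
        print(train_index, test_index)  # 7 train indices, 2 test indices per split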

  • @tech-n-data
    @tech-n-data 2 years ago

    Thank you sooooo much. You simplified that beautifully.

  • @aadilgoyal9286
    @aadilgoyal9286 3 years ago

    def avg(nums):
        num_avg = 0
        for i in range(len(nums)):
            num_avg = num_avg + nums[i]
        num_avg = num_avg / len(nums)
        return num_avg

    # this is the code if you want to get the average of a list (equivalent to sum(nums) / len(nums)). To use it just call
    avg(scores_l)

  • @humayunnasir6261
    @humayunnasir6261 4 years ago

    Wonderful explanation. Great tutorial series.

  • @Augustus1003
    @Augustus1003 4 years ago

    Taking cv=3 for all cases:
    Logistic regression = 97.33%
    Random Forest = 96.66% (n_estimators=60)
    Decision tree ≈ SVC() = 96%

  • @ahmedhelal920
    @ahmedhelal920 3 years ago

    I understand now what k-fold CV is, thanks sir.

  • @vardhanvishnu618
    @vardhanvishnu618 5 years ago

    Thank you very much for your class. It's very useful for beginners.

    • @codebasics
      @codebasics  5 years ago

      I am happy you liked it Vishnu :)

  • @apeculiargentleman6925
    @apeculiargentleman6925 5 years ago +5

    You make exquisite content, I'd love to see more!

  • @shashankkkk
    @shashankkkk 3 years ago +23

    For me, SVM's score is almost 99 every time.

  • @wasirizvi2437
    @wasirizvi2437 4 years ago

    You explain things very clearly! Moreover, you keep code ready to save time, and your videos are an appropriate length. You follow the presentations you have made and seem to be speaking impromptu, not reading from somewhere. I have joined a course, but there things are not very clear; the videos are sometimes 2-3 hours long and I get bored to death. Most importantly, you skip the unnecessary detailed mathematics that is not essential for beginners, which helps me focus on machine learning (though I am good at mathematics).

    • @codebasics
      @codebasics  4 years ago

      Wasi thanks for leaving well-thought-out feedback. This helps me a lot. 😊👍

  • @ramezhabib320
    @ramezhabib320 1 year ago

    Using the KFold method, the data was split multiple times into X_train s and y_train s, but the splits remained the same for every model.
    Is it the same in the cross_val_score method? Isn't the splitting done differently for each model, so that the models are trained on different X_train s and y_train s?
    Thank you so much for the clear explanation.
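
    One way to guarantee that every model sees identical folds is to create a single splitter and pass it to each cross_val_score call (a sketch, assuming scikit-learn; with an integer cv and no shuffling the splits are deterministic anyway):

    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import StratifiedKFold, cross_val_score
    from sklearn.svm import SVC

    iris = load_iris()
    folds = StratifiedKFold(n_splits=3)  # one splitter shared by both models
    print(cross_val_score(LogisticRegression(max_iter=1000), iris.data, iris.target, cv=folds))
    print(cross_val_score(SVC(), iris.data, iris.target, cv=folds))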

    @60pluscrazy 3 years ago
    @60pluscrazy 3 роки тому

    Your explanations are awesome 👌

  • @karishmasewraj6437
    @karishmasewraj6437 2 years ago

    LogisticRegressionClassifier = 100%
    SVC(kernel='poly') = 97%
    DecisionTreeClassifier = 97%
    RandomForestClassifier(n_estimators=30) = 97%, for every increase in n_estimators

  • @sumayachoya2738
    @sumayachoya2738 8 months ago

    Thank you for this series. It is helping me a lot.

  • @akshyakumarshrestha5551
    @akshyakumarshrestha5551 5 years ago +1

    The way you teach is awesome! I request you to make tutorials on Neural Networks if you are in that field. Thank you!

    • @codebasics
      @codebasics  5 years ago

      Akshya I started making videos on neural nets. Check my channel, I have posted the first two already... once TF 2.0 is stable I will add more.