Cross Validation For Model Selection |K-Fold|Leave One Out CV | Data Science

  • Published 5 Sep 2024
  • Cross-validation is a useful way to evaluate models. The best-fitting model can be found by cross-validating and choosing the one with the lowest error. CV techniques are useful in all data science projects. We use K-fold and leave-one-out cross-validation to demonstrate the techniques.
    Contact : analyticsuniversity@gmail.com
    ANalytics Study Pack : analyticunivers...
    Analytics University on Twitter : / analyticsuniver
    Analytics University on Facebook : / analyticsuniversity
    Logistic Regression in R: goo.gl/S7DkRy
    Logistic Regression in SAS: goo.gl/S7DkRy
    Logistic Regression Theory: goo.gl/PbGv1h
    Time Series Theory : goo.gl/54vaDk
    Time ARIMA Model in R : goo.gl/UcPNWx
    Survival Model : goo.gl/nz5kgu
    Data Science Career : goo.gl/Ca9z6r
    Machine Learning : goo.gl/giqqmx
    Data Science Case Study : goo.gl/KzY5Iu
    Big Data & Hadoop & Spark: goo.gl/ZTmHOA
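
    The video's procedure can be sketched in plain Python. This is a minimal illustration, not the code used in the video: the data, the fold-splitting helper, and the toy "predict the training mean" model are all assumptions chosen to keep the example self-contained. K-fold CV averages the held-out error over k folds; setting k equal to the number of observations gives leave-one-out CV.

    ```python
    import random

    def k_fold_indices(n, k, seed=0):
        """Split indices 0..n-1 into k roughly equal, shuffled folds."""
        idx = list(range(n))
        random.Random(seed).shuffle(idx)
        return [idx[i::k] for i in range(k)]

    def cross_validate(xs, ys, fit, predict, k):
        """Mean squared error averaged over k held-out folds.

        fit(train_x, train_y) -> model; predict(model, x) -> prediction.
        """
        folds = k_fold_indices(len(xs), k)
        fold_errors = []
        for fold in folds:
            held_out = set(fold)
            train = [i for i in range(len(xs)) if i not in held_out]
            model = fit([xs[i] for i in train], [ys[i] for i in train])
            mse = sum((predict(model, xs[i]) - ys[i]) ** 2 for i in fold) / len(fold)
            fold_errors.append(mse)
        return sum(fold_errors) / len(fold_errors)

    # Toy model (assumption for illustration): always predict the mean
    # of the training targets, ignoring the inputs.
    fit_mean = lambda xs, ys: sum(ys) / len(ys)
    predict_mean = lambda model, x: model

    xs = [0, 1, 2, 3, 4]
    ys = [1.0, 2.0, 3.0, 4.0, 5.0]
    # k = len(xs) makes this leave-one-out cross-validation.
    loocv_error = cross_validate(xs, ys, fit_mean, predict_mean, k=len(xs))
    ```

    To compare candidate models, you would run `cross_validate` once per model with the same folds and pick the one with the smallest averaged error.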

COMMENTS • 4

  • @vishalaaa1
@vishalaaa1 4 years ago

Very good. These videos will be a hit. Please make more videos, cover all real-time practicals, and make them one set. It will be a hit.

  • @maheshmanchineni6236
@maheshmanchineni6236 7 years ago

Hi sir, thanks for the video. You explained it very clearly, but what would our approach be if we have more than one predictor variable? How do we write the final equation?

  • @PSILoveMathProfessorSampson
@PSILoveMathProfessorSampson 3 years ago

How do you do another iteration so you get different values? Like, what if I wanted 10 sets of data?