
193 - What is XGBoost and is it really better than Random Forest and Deep Learning?

  • Published 17 Aug 2024

COMMENTS • 60

  • @pvrnaaz
    @pvrnaaz 2 years ago +7

    Very organized and clear with excellent examples that make it so easy to understand. Thank you!

  • @riti_joshi
    @riti_joshi 4 months ago

    I never comment on YouTube videos, but I am compelled to do so here, because I learned most of the analyses for my dissertation by following your tutorials. You're such a great tutor. Thank you so much.

  • @ashift32
    @ashift32 2 years ago +2

    Very well explained, clear and concise. Thanks for taking the time.

  • @mhh5002
    @mhh5002 2 years ago +3

    Very well explained, sir. It was intuitive for beginners. The analogies are interesting as well.

  • @caiyu538
    @caiyu538 2 years ago +1

    Perfect tutorial; I am using XGBoost and random forest to analyze some of my work. Always appreciate your continuous efforts to share your knowledge through YouTube.

  • @semon00
    @semon00 4 months ago

    Wow, your explanation is awesome!!!
    Don't stop, please!

  • @sudippandit6676
    @sudippandit6676 3 years ago +1

    Very organized and straightforward! Waiting for your other videos. Thank you for sharing this knowledge.

  • @DrNuyenVanFaulk
    @DrNuyenVanFaulk 23 days ago

    I really appreciate this explanation! Thank you!

  • @grantsmith3653
    @grantsmith3653 2 years ago +2

    Sreeni said we need to normalize, but I always thought we didn't need to do that with trees... Am I confused about something?
    Thanks for the video!!
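
    The instinct behind this question is right: tree ensembles split on per-feature thresholds, so any monotonic rescaling leaves the same splits available; scaling mainly matters for distance-based or gradient-descent-trained models. A minimal sketch to check this empirically (synthetic data, assuming scikit-learn and xgboost are installed; not the video's code):

    ```python
    # Sketch: tree ensembles are insensitive to monotonic feature scaling.
    from sklearn.datasets import make_classification
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from xgboost import XGBClassifier

    X, y = make_classification(n_samples=1000, random_state=42)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

    # Fit once on raw features, once on standardized features
    # (scaler fit on the training fold only).
    raw = XGBClassifier(n_estimators=100).fit(X_tr, y_tr)
    scaler = StandardScaler().fit(X_tr)
    scaled = XGBClassifier(n_estimators=100).fit(scaler.transform(X_tr), y_tr)

    print(accuracy_score(y_te, raw.predict(X_te)))
    print(accuracy_score(y_te, scaled.predict(scaler.transform(X_te))))
    # Scores are typically identical (or nearly so): trees split on thresholds,
    # and standardization only shifts where those thresholds sit.
    ```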

  • @evazhong4419
    @evazhong4419 2 years ago

    Your explanation is so interesting, haha. It helps me a lot in understanding the material.

  • @andyn6053
    @andyn6053 10 months ago +1

    This was very clear and useful! Do you have any link to your code? Also, could XGBoost be used for linear regression as well?
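
    On the regression part of this question: yes, and beyond boosted trees XGBoost can also boost a linear model via the `gblinear` booster, which behaves like a regularized linear regression. A hedged sketch on synthetic data (not from the video):

    ```python
    # Sketch: XGBoost as a regularized linear regression via the gblinear booster.
    from sklearn.datasets import make_regression
    from xgboost import XGBRegressor

    X, y = make_regression(n_samples=500, n_features=5, noise=10.0, random_state=0)

    # booster="gblinear" boosts linear terms instead of trees, so the fitted
    # model is effectively an elastic-net-regularized linear regression.
    model = XGBRegressor(booster="gblinear", n_estimators=50)
    model.fit(X, y)
    print(model.predict(X[:3]))
    ```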

  • @axe863
    @axe863 8 months ago +1

    XGBoost with regularized rotations and synthetic feature construction can approximate the depth of a deep neural network.

  • @VarunKumar-pz5si
    @VarunKumar-pz5si 3 years ago +1

    Awesome tutorial; glad I got a great teacher. Thank you!

  • @omeremhan
    @omeremhan 2 years ago

    Magnificent!!! Thanks for the clear explanation, sir.

  • @Ahmetkumas
    @Ahmetkumas 3 years ago +2

    Thanks for the video and effort. Can you make a time series video using XGBoost, or something with multiple features (lags, rolling means, etc.)?
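
    Until such a video exists, the usual pattern is short: reshape the series into a supervised table of lag and rolling-mean columns, then fit a regressor on it. A sketch with illustrative column names and window sizes (not the video's code):

    ```python
    # Sketch: turning a series into lag / rolling-mean features for XGBoost.
    import pandas as pd
    from xgboost import XGBRegressor

    # Hypothetical daily series; replace with your own data.
    s = pd.Series(range(100), index=pd.date_range("2024-01-01", periods=100))

    df = pd.DataFrame({"y": s})
    for lag in (1, 2, 7):                                   # lag features
        df[f"lag_{lag}"] = df["y"].shift(lag)
    df["roll_mean_7"] = df["y"].shift(1).rolling(7).mean()  # past-only rolling mean
    df = df.dropna()

    X, y = df.drop(columns="y"), df["y"]
    model = XGBRegressor(n_estimators=200).fit(X, y)
    ```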

  • @barrelroller8650
    @barrelroller8650 1 year ago +1

    It's not clear where you got the dataset in CSV format - the .zip archive from the provided link includes only the `wdbc.data` and `wdbc.names` files.
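
    For anyone else stuck here: `wdbc.data` is itself comma-separated, so it can be read directly. A sketch assuming the standard layout documented in `wdbc.names` (an ID column, the M/B diagnosis, then 30 numeric features):

    ```python
    # Sketch: wdbc.data is already comma-separated, just without a header row.
    import pandas as pd

    # Column 0 is an ID, column 1 the diagnosis (M = malignant, B = benign),
    # columns 2-31 are the 30 features; see wdbc.names for the exact meanings.
    cols = ["id", "diagnosis"] + [f"feature_{i}" for i in range(30)]
    df = pd.read_csv("wdbc.data", header=None, names=cols)
    print(df["diagnosis"].value_counts())
    ```

    Alternatively, scikit-learn bundles the same dataset as `sklearn.datasets.load_breast_cancer()`, which avoids the download entirely.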

  • @venkatesanr9455
    @venkatesanr9455 3 years ago

    Thanks, Sreeni sir, for your valuable and knowledgeable content. Also waiting for the next semantic segmentation series; videos on hyperparameters and their tuning, as well as time series analysis, would be highly helpful.

  • @sbaet
    @sbaet 3 years ago +2

    Can you make a quick video on normalization and standardization for an image dataset?
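
    Until then, a minimal sketch of the two operations on an image batch (shapes and constants are illustrative; per-channel statistics are also common):

    ```python
    # Sketch: normalization vs. standardization for a batch of images.
    import numpy as np

    # Fake batch of 32 RGB images with uint8-style values in [0, 255].
    imgs = np.random.randint(0, 256, size=(32, 64, 64, 3)).astype(np.float32)

    normalized = imgs / 255.0                    # normalization: rescale to [0, 1]
    mean, std = normalized.mean(), normalized.std()
    standardized = (normalized - mean) / std     # standardization: zero mean, unit variance
    ```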

  • @kakaliroy4747
    @kakaliroy4747 2 years ago +1

    The example of bagging is so funny and I can fully relate

  • @mouraleog
    @mouraleog 1 year ago

    Awesome video, thank you! Greetings from Brazil!

  • @rezaniazi4352
    @rezaniazi4352 3 years ago +1

    Thanks for the video.
    What do we have to change if we want to use XGBRegressor() instead of the classifier?
    The xgboost documentation is so confusing!
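
    The change is smaller than the docs make it look: swap the estimator class and the metric, and the sklearn-style API stays the same. A sketch on synthetic data (parameter values are illustrative, not from the video):

    ```python
    # Sketch: the classifier-to-regressor swap with the sklearn-style API.
    from sklearn.datasets import make_regression
    from sklearn.metrics import mean_squared_error
    from sklearn.model_selection import train_test_split
    from xgboost import XGBRegressor  # swapped in for XGBClassifier

    X, y = make_regression(n_samples=500, noise=5.0, random_state=1)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

    # The default objective becomes "reg:squarederror"; evaluate with a
    # regression metric such as RMSE instead of accuracy.
    model = XGBRegressor(n_estimators=100, learning_rate=0.1)
    model.fit(X_tr, y_tr)
    rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
    print(f"RMSE: {rmse:.2f}")
    ```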

  • @drforest
    @drforest 2 years ago

    Awesome comparison. Super thanks

  • @farhaddavaripour4619
    @farhaddavaripour4619 2 years ago

    Thanks for the video. Something you might have missed in the figure: it shows the most evolved species with lighter hair than the less evolved ones, which could give the false impression that lighter-haired species are more evolved. It would be great if you could adjust the figure.

  • @sathishchetla3986
    @sathishchetla3986 11 months ago

    Thank you so much for your explanation sir

  • @vzinko
    @vzinko 1 year ago +13

    Another case of data leakage. You can't scale X and then split it into train and test; the scaling needs to happen after the split (see the sketch after this thread).

    • @Beowulf245
      @Beowulf245 1 year ago +1

      Thank you. At least someone understands.

    • @andyn6053
      @andyn6053 10 months ago

      So this video is incorrect?
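
    For readers landing on this thread, a minimal sketch of the leakage-free order described above (synthetic data; the scaler choice is illustrative):

    ```python
    # Sketch: split first, then scale, so no test statistics leak into training.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler

    X, y = make_classification(n_samples=1000, random_state=0)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
    scaler = StandardScaler().fit(X_tr)  # statistics come from the training fold only
    X_tr_s = scaler.transform(X_tr)
    X_te_s = scaler.transform(X_te)      # the same transform is applied to the test fold
    ```

    For tree-based models the practical impact is small, since trees are insensitive to monotonic scaling (see the earlier normalization thread), but split-then-scale is the safe habit for any pipeline.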

  • @SP-cg9fu
    @SP-cg9fu 1 year ago

    Very useful video! Thank you!

  • @RealThrillMedia
    @RealThrillMedia 1 year ago

    Very helpful, thank you!

  • @vikramsandu6054
    @vikramsandu6054 3 years ago

    Well explained. Thank you so much for the video.

  • @kangxinwang3886
    @kangxinwang3886 3 years ago +1

    Loved the arranged marriage example! It made everything very intuitive and easy to understand. Thank you!

  • @kangajohn
    @kangajohn 3 years ago

    If your explanations were a Kaggle competition, they would be in the top 1%.

  • @multiversityx
    @multiversityx 1 year ago

    What’s the method name? When are you presenting at NeurIPS? (I’ll be attending it :)

  • @tannyakumar284
    @tannyakumar284 2 years ago

    Hi. I have a 1500x11 dataset and I am trying to see which of cognitive ability, non-cognitive ability, family structure, parental involvement, and school characteristics predict academic performance (measured in terms of grades ranging from 1-5). Should I be using XGBoost for this problem or random forest? Thanks!
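
    One pragmatic answer is to cross-validate both models on the same folds and let the data decide. A sketch with a synthetic stand-in for the described 1500x11 table and five-level grade target (not real data):

    ```python
    # Sketch: compare both models with cross-validation on the same folds.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score
    from xgboost import XGBClassifier

    # Synthetic stand-in: 1500 rows, 10 predictors, 5 grade classes (0-4).
    X, y = make_classification(n_samples=1500, n_features=10, n_informative=6,
                               n_classes=5, random_state=0)

    for name, model in [("random forest", RandomForestClassifier(random_state=0)),
                        ("xgboost", XGBClassifier(random_state=0))]:
        scores = cross_val_score(model, X, y, cv=5)
        print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
    ```

    Since the grades are ordinal, framing the task as regression is also worth trying.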

  • @evyatarcoco
    @evyatarcoco 3 years ago

    A very useful episode. Thanks, sir!

  • @ahmedraafat8769
    @ahmedraafat8769 2 years ago +1

    The dataset has been removed from the website. Is it possible to upload it?

    • @DigitalSreeni
      @DigitalSreeni 2 years ago +2

      Just do a Google search for the keywords and you'll find it somewhere, maybe on Kaggle. I do not own the data, so I cannot legally share it.

  • @Bwaaz
    @Bwaaz 1 year ago

    Very clear, thanks :)

  • @Lodeken
    @Lodeken 1 year ago

    Wow that analogy! 😂 Amazingly apt lol!

  • @longtruong9935
    @longtruong9935 2 years ago

    The dataset in the UCI link is not available now. Could anyone provide an updated link?

  • @khairulfahim
    @khairulfahim 1 year ago

    Where can I get the exact .csv file?

  • @v1hana350
    @v1hana350 2 years ago

    I have a question about the XGBoost algorithm: how does parallelization work in XGBoost? Could you explain it with an example?
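
    Short version: boosting itself is sequential (each tree corrects the ones before it), so XGBoost parallelizes the work inside each tree, scanning candidate splits across features and histogram bins with multiple threads. A sketch of the knob that controls this (timings are machine-dependent):

    ```python
    # Sketch: n_jobs controls XGBoost's within-tree thread parallelism.
    import time
    from sklearn.datasets import make_classification
    from xgboost import XGBClassifier

    X, y = make_classification(n_samples=50_000, n_features=50, random_state=0)

    for n_jobs in (1, 4):
        model = XGBClassifier(n_estimators=50, tree_method="hist", n_jobs=n_jobs)
        t0 = time.perf_counter()
        model.fit(X, y)  # boosting rounds stay sequential; split finding is threaded
        print(f"n_jobs={n_jobs}: {time.perf_counter() - t0:.1f}s")
    ```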

  • @Frittenfinger
    @Frittenfinger 9 months ago

    Nice T-Shirt 😃

  • @abderrahmaneherbadji5478
    @abderrahmaneherbadji5478 3 years ago

    Thank you very much!

  • @darioooc
    @darioooc 1 year ago

    Great!

  • @andromeda1534
    @andromeda1534 3 years ago

    Looks like when you demoed random forest, you didn't comment out the xgb line, so you actually showed the fitting for XGBoost twice, with the same results.

  • @ghafiqe
    @ghafiqe 1 year ago

    Perfect

  • @3DComputing
    @3DComputing 3 years ago

    You're worth more money

  • @ramakrishnabhupathi4995
    @ramakrishnabhupathi4995 2 years ago

    Good one

  • @user.................
    @user................. 2 months ago

    Bro is trying to share life lessons and forgot what he's teaching 🤣🤣🤣🤣
    Still, this is the only place where I got a complete idea of XGBoost. Thank you!

  • @alejandrovillalobos1678
    @alejandrovillalobos1678 3 years ago +1

    Can you talk about transformers, please?

  • @agsantiago22
    @agsantiago22 2 years ago

    Thank you!