Hello Grandmasters :) That was a delightful conversation :) Thank you for everything
Main Takeaways from discussion :
1. Time! If you like something, make sure you are giving most of your time to getting good at it.
2. Show your work and team up with others.
3. Understand the problem statement and build on your skills; in the long run this helps you translate ideas from one domain to another. Hello, transfer learning!
4. An overview of where boosting and where neural nets dominate.
5. Use 5-fold local validation (train on 80% of the data and infer on the rest), but for the final submission don't ensemble those fold models; instead, remember the hyperparameters that worked (e.g. 15 epochs) and retrain on all the data. Five models trained on 100% of the data are better than five models trained on 80% of the data; it can increase the score by about 10%.
6. If you are starting out, Kaggle Kernels or Google Colab is sufficient to work with, and both come with GPU access.
7. Learn from others and implement your own ideas rather than forking and submitting kernels written by others.
8. We all deal with impostor syndrome in some way. You learn the most when you learn a little bit each day.
Happy Kaggling!
Love
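The workflow in takeaway 5 (k-fold CV only to validate the setup, then retrain on 100% of the data and average over seeds) can be sketched as follows. This is a toy illustration, not code from the talk: the "model" is a deliberately trivial stand-in and all names are made up for the example.

```python
import random
import statistics

# Trivial stand-in "model": predicts the mean of its training targets,
# jittered by seed to mimic a stochastic learner (a neural net, a GBM
# with subsampling, ...). The point is the workflow, not the model.
def train(y_train, seed):
    rng = random.Random(seed)
    return statistics.mean(y_train) + rng.uniform(-0.01, 0.01)

def predict(model, n):
    return [model] * n

y = [float(i % 7) for i in range(100)]  # toy targets

# Step 1: 5-fold local validation -- train on 80%, score on the held-out 20%.
k = 5
fold_size = len(y) // k
cv_scores = []
for fold in range(k):
    val = y[fold * fold_size:(fold + 1) * fold_size]
    trn = y[:fold * fold_size] + y[(fold + 1) * fold_size:]
    model = train(trn, seed=0)
    mae = statistics.mean(abs(p - t) for p, t in zip(predict(model, len(val)), val))
    cv_scores.append(mae)
print("CV MAE:", statistics.mean(cv_scores))

# Step 2: for the submission, keep the hyperparameters CV validated but
# retrain on 100% of the data, and average the models over several seeds.
final_models = [train(y, seed=s) for s in range(5)]
final_pred = statistics.mean(final_models)  # seed-averaged prediction
print("seed-averaged prediction:", round(final_pred, 3))
```

The CV loop exists only to tell you the setup works; the submitted models never see a held-out fold.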
Tricks mentioned:
1. Seed averaging.
2. Remember hyperparameter values from k-fold CV, but train each of the k final models on 100% of the training data. This won't work if you're using early stopping to choose the number of epochs, so don't use early stopping; fix the epoch count from CV instead.
3. Back translation (for NLP competitions) & other ways of generating more data.
4. Ensembling with other good notebooks.
5. Hedging the top 2 submissions.
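Trick 2 hinges on one detail: if early stopping picked your epoch count on the fly, you can't retrain on 100% of the data, because there is no held-out set left to stop on. The fix is to let CV choose a fixed epoch count, then retrain on everything for exactly that many epochs. A toy sketch under that assumption, with a one-parameter learner invented for the example:

```python
import statistics

# Toy iterative learner: fit y = w * x by gradient descent; each "epoch"
# is one full-batch gradient step.
def fit(xs, ys, epochs, lr=0.01):
    w = 0.0
    for _ in range(epochs):
        grad = statistics.mean(2 * (w * x - t) * x for x, t in zip(xs, ys))
        w -= lr * grad
    return w

def mse(w, xs, ys):
    return statistics.mean((w * x - t) ** 2 for x, t in zip(xs, ys))

xs = [i / 10 for i in range(50)]
ys = [3.0 * x for x in xs]  # true slope is 3

# Step 1: k-fold CV to pick the epoch count -- a fixed number, NOT early stopping.
k = 5
fold = len(xs) // k

def cv_score(epochs):
    scores = []
    for f in range(k):
        vx, vy = xs[f * fold:(f + 1) * fold], ys[f * fold:(f + 1) * fold]
        tx = xs[:f * fold] + xs[(f + 1) * fold:]
        ty = ys[:f * fold] + ys[(f + 1) * fold:]
        scores.append(mse(fit(tx, ty, epochs), vx, vy))
    return statistics.mean(scores)

best_epochs = min(range(10, 101, 10), key=cv_score)

# Step 2: retrain on 100% of the data for exactly best_epochs epochs.
final_w = fit(xs, ys, best_epochs)
print("best_epochs:", best_epochs, "final w:", round(final_w, 3))
```

Because `best_epochs` is a plain number, the final fit needs no validation set at all.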
Abhishek Sir is doing an amazing job by giving back to the community to this extent. It takes a lot of effort to bring all these great people together and host such a webinar!
It's great to hear the conversations; I totally relate to Chris's story about the barrier thing.
Panel: Quadruple Grandmasters Abhishek Thakur (moderator), Chris Deotte, Rohan Rao, and Bojan Tunguz, joined by Kaggle founder and CEO Anthony Goldbloom.
Here is the list of topics with timestamps:
1. Introduction (2:00)
2. What was the most difficult thing when you started with Kaggle? (8:05)
3. When you started with Kaggle, did you have a full-time job? (12:30)
4. What's your favorite Kaggle competition and ML algorithm? (19:10) (GM Giba's question)
5. Which track has been the most difficult for you? (25:45)
6. Who's your favourite Kaggler? (31:35)
7. How do you find a mentor for Kaggle competitions? (35:15)
8. How do skills learnt on Kaggle transfer to your daily job as a data scientist? (38:00)
9. I'm interested in an overview of where neural networks dominate today and where the old-school methods are still the dominant strategy. (41:00) (CEO Anthony's question)
10. Share one trick you've never shared with anybody to get good competition performance. (45:20) (CEO Anthony's question)
11. What kind of compute should a beginner start with: AWS, GCP, other cloud compute, or just buy a local machine? (51:05)
12. Where should beginners start? (56:15)
amazing, so good!!
Started from the bottom and no stopping you 🙌
Reposting my question in case anyone finds the time to answer:
1. What is your fav chai? :D
2. Do you ever get burned out from Kaggling? How do you unplug and recharge?
I think they probably like green tea! Look at the fabulous shape they are in!!
Doing some workouts would help. Like your mind, your body also needs some training.
Abhishek sir, do you recommend Radek's book? Do you have any recommendations for meta learning with respect to computer vision?
How did you go from just ML to MLOPs? (referring to Google Cloud's ML Summit here)
Your whole channel is doing God's work. Thank you for the quality information, panels and talks.
Awesome, knowing that you didn't know about ML while starting out on Kaggle... gives me so much hope, haha
Wow congrats people !!
Inspirational... good to know your journeys
Such an inspiration. Love these guys
"Lords Of Kaggle" at one place. WOW.
A very valuable conversation, enjoyed it! (coming from a non-Kaggler, but I'm passionate to understand more about it)
Great session. Enjoyed every bit of it.
You are cool. Guys, you are in love with Kaggle and DS, and this is exciting. I would like to see a video where you share your experience as professionals working in industry, and the real-life challenges of working in traditional and modern departments with the people you manage (incl. data science, modelling, actuarial, analytics).
It was a really inspiring session!
It is definitely something that helps one to work harder and push one's limit!
Great Work!
Beautiful people!
That was amazing guys, thank you all so much for helping make the Kaggle community what it is
My prediction for next 4X GM is either Firat Gonen or Rob Mulla. They will both get it soon, I'm sure of that.
The Quad leaders of the Data science future 👍
Heroes.
Much appreciated
Its like a Pokemon League Battle Lineup
Do you know any tutorial sources for advanced machine learning topics in computer vision?
This is some Billion Dollar ...panel talk