Hello Guys,
Finally, iNeuron is happy to announce the Full Stack Data Scientist Bootcamp with Job Guarantee Program, starting from 7th May 2022. Classes run from 10 am to 1 pm, followed by a doubt-clearing session from 1 pm to 3 pm, every Saturday and Sunday. This time we are keeping a 2-hour doubt-clearing session after the class.
All the live sessions will be recorded and will be available with lifetime access. Pre-recorded videos are also available for everyone.
You can check the detailed syllabus and all other information below:
courses.ineuron.ai/Full-Stack-Data-Science-Bootcamp
Use code Krish10 for an additional 10% discount
EMI options are also available
Call our team directly in case of any queries:
8788503778
6260726925
9538303385
8660034247
9880055539
Sir, I would love to take your course, but I am a working professional and it is very hard to follow.
Sir your Hindi playlist is one of the best
liked your presentation👌👌👌👌
Great, sir.. your way of explaining is very good; I understood the whole video very well.
KNN can be used for both classification and regression problem statements, with either Euclidean distance or Manhattan distance as the metric.
In regression, the prediction is the average of the target values of the K (a hyperparameter) nearest data points, whereas in classification it is the majority category among the K nearest neighbours (a small sketch after the limitations below illustrates both modes).
Limitations:
Creates problems with huge datasets.
Sensitive to outliers.
Sensitive to missing values.
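A minimal sketch of the two modes described above, assuming scikit-learn and a made-up toy dataset (not anything from the video):

# Minimal sketch (assumes scikit-learn; the toy data is made up for illustration).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier, KNeighborsRegressor

X = np.array([[1], [2], [3], [10], [11], [12]])        # single input feature
y_class = np.array([0, 0, 0, 1, 1, 1])                 # category labels
y_reg = np.array([1.0, 2.0, 3.0, 10.0, 11.0, 12.0])    # continuous target

# Classification: majority category among the K nearest neighbours.
clf = KNeighborsClassifier(n_neighbors=3, metric="euclidean")
clf.fit(X, y_class)
print(clf.predict([[2.5]]))    # -> [0]

# Regression: average of the K nearest neighbours' target values.
reg = KNeighborsRegressor(n_neighbors=3, metric="manhattan")
reg.fit(X, y_reg)
print(reg.predict([[2.5]]))    # -> [2.0], the mean of 1.0, 2.0 and 3.0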
Best explanation in a simple way on YouTube... Great work, Krish sir; both your Hindi and English channels are awesome. 👏🙌
It was super simplified. Understood well
Thumbnail be like: nikal ❤️day . 😁
It's😂
❤ + ⛅
😂😂
😂😂😂
amazing explanation, understood the concept super well!
Best explanation with clear and concise way
Thank you so much sir! Great explanation.
Please start the 7-day deep learning live sessions at the earliest.
Thank you sir clearly understood
🙂
Please upload the same playlist as you uploaded on your second channel. Also, tell us how to follow your playlists.
Thank you sir ❤😘❤
thank you so much! Very helpful
Thank you so much, Sir!!!
Please start the 7-day deep learning and computer vision lectures.
Thanks a lot ❤
How is the graph plotted for 3 features, like size, rooms and price????
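Not from the video, but one common way to look at exactly three features is a 3-D scatter plot; beyond three features people usually fall back on pair plots or dimensionality reduction. A rough sketch assuming matplotlib and made-up housing-style data:

# Rough sketch (assumes matplotlib/numpy; the data below is made up).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
size = rng.uniform(500, 2500, 50)                        # square feet
rooms = rng.integers(1, 6, 50)                           # number of rooms
price = 50 * size + 20000 * rooms + rng.normal(0, 20000, 50)

fig = plt.figure()
ax = fig.add_subplot(projection="3d")                    # one axis per feature
ax.scatter(size, rooms, price)
ax.set_xlabel("size")
ax.set_ylabel("rooms")
ax.set_zlabel("price")
plt.show()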
Thank you very much sir.
Great tutorial...
Thank You so much!
Nice way of explaining, as well as the presentation. Can you share what gadgets you are using for writing, please? 🙏
great explanation
Sir, what happens when 1's and 0's have the same number of data points under KNN, i.e. 3 for the 1's and 3 for the 0's? Then what would be the output?
This understanding comes with practical knowledge or domain knowledge. An odd value of K is taken in that case, e.g. K = 3 or K = 5.
Same doubt.
@h44r96 I did not get this.
Can you please elaborate on your answer?
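To elaborate with a toy sketch (made-up data, not from the video): with an even K a binary vote can split evenly, and the prediction then depends on the library's tie-breaking rule; with an odd K and two classes a tie cannot happen, which is why an odd K is usually preferred.

# Toy sketch of the tie issue (data made up here).
from collections import Counter
import numpy as np

X = np.array([1.0, 2.0, 4.0, 5.0])   # 1-D training points
y = [0, 0, 1, 1]                     # their labels
query = 3.0                          # test point exactly in the middle

def knn_vote(k):
    order = np.argsort(np.abs(X - query))   # neighbour indices, nearest first
    votes = [y[i] for i in order[:k]]       # labels of the k nearest points
    return Counter(votes).most_common()

print(knn_vote(4))   # [(0, 2), (1, 2)] -> a 2-2 tie; answer depends on tie-breaking
print(knn_vote(3))   # [(0, 2), (1, 1)] -> clear majority for class 0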
Sir, regarding KNN: if there are more than 2 classes, like 0, 1, 2, 3, 4, does the same principle apply?
Sir, you didn't upload the practical implementation of KNN.
Thanks for showing the visualization for KNN method.
I have a request: can you share similar examples when there are 3 or more input variables?
Thanks again
@krishnaikhindi
I have a question.
In the classifier, if the count for both categories is the same, then which will we select? For example, if the number of 1s is 4 and the number of 0s is also 4, then which value will be predicted?
Can I find an English version of this?
But sir, how do we find the value of K in our problem? As the value of K changes, the output result will change.
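One common approach (a sketch assuming scikit-learn; load_iris is just a convenient built-in dataset, not the data used in the video) is to cross-validate over a range of K values and pick the one with the best mean score:

# Sketch: pick K by cross-validation (assumes scikit-learn).
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

scores = {}
for k in range(1, 21):                           # candidate K values
    model = KNeighborsClassifier(n_neighbors=k)
    scores[k] = cross_val_score(model, X, y, cv=5).mean()

best_k = max(scores, key=scores.get)             # K with the best mean CV accuracy
print(best_k, round(scores[best_k], 3))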
Please update the material link as this link is showing 404 error
Sir, I have joined your programme, but I did not like the teaching method in the live class. I have stopped joining the class; I watch all your videos on YouTube and that's how I learn. I can't wait to join your live class.
Sir, where is the implementation part? -,-
Where is the practical implementation of the Naive Bayes algorithm?
Thanks
Is this playlist sufficient to crack online tests and interviews? @krishnaikhindi
Has the playlist been deleted?
Lovely
Can test data come as an outlier?
Can someone share the Python implementation of this video with me?
You didn't cover the practical for this or for Naive Bayes.
🎉🎉🎉🎉🎉🎉🎉
👌
I couldn't understand...
Hello Dear Sir, How can I get your contact details?
Noice