MEDIOCRE_GUY
  • 81
  • 19 537
GridSearchCV using Scikit-Learn
GridSearchCV is a function that comes with the Scikit-Learn library and provides a systematic way of tuning the hyperparameters of machine learning models. The performance of a machine learning model depends significantly on the choice of hyperparameters. GridSearchCV loops through a predefined set of hyperparameter values and, after exhaustively evaluating every combination, selects the one that performs best.
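As a rough illustration of that workflow, here is a minimal sketch (not the code from the video; the built-in breast-cancer dataset and this small parameter grid stand in for the early_stage_diabetes_risk_prediction data and the grids used there):

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Candidate values for each hyperparameter; every combination is tried with 5-fold CV
param_grid = {"n_estimators": [100, 200], "max_depth": [None, 5, 10]}
search = GridSearchCV(RandomForestClassifier(random_state=0), param_grid,
                      cv=5, scoring="accuracy", n_jobs=-1)
search.fit(X_train, y_train)

print(search.best_params_)           # best combination found on the grid
print(search.best_score_)            # its mean cross-validated accuracy
print(search.score(X_test, y_test))  # refit best model, evaluated on the held-out test set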
GitHub address: github.com/randomaccess2023/MG2023/tree/main/Video%2081
Important timestamps:
01:09 - Import required libraries
02:48 - Load early_stage_diabetes_risk_prediction dataset
05:40 - Perform preprocessing
07:42 - Separate features and classes
08:35 - Apply GridSearchCV to Random Forest Classifier
15:10 - Apply GridSearchCV to Extra Trees Classifier
17:59 - Apply GridSearchCV to Gradient Boosting Classifier
20:50 - Apply GridSearchCV to all the models
#sklearn #scikitlearn #datascience #jupyternotebook #machinelearning #gridsearchcv #hyperparametertuning #python #pythonprogramming
Views: 23

Videos

K-fold cross validation using Scikit-Learn
K-fold cross validation is a technique used for evaluating the performance of machine learning models. It uses different portions of the dataset as train and test sets in multiple iterations and helps a model to generalize well on unseen data. Scikit-Learn's train_test_split method uses a fixed set of samples as the train set and the rest of the samples outside the train set as the test set, wh...
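A minimal sketch of the idea (not the video's code; the built-in iris dataset and a plain LogisticRegression stand in for whatever data and model the video uses):

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, cross_val_score

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000)

# 5 folds: each sample is used for testing exactly once and for training four times
cv = KFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(model, X, y, cv=cv)  # one accuracy score per fold
print(scores, scores.mean())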
GradientBoostingClassifier using Scikit-Learn
GradientBoostingClassifier is a supervised machine learning algorithm. It builds an additive model in a forward stage-wise fashion and allows for the optimization of arbitrary differentiable loss functions. GitHub address: github.com/randomaccess2023/MG2023/tree/main/Video 79 Important timestamps: 00:47 - Import required libraries 02:24 - Load credit_approval dataset 04:38 - Perform preprocessi...
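A minimal sketch, assuming a synthetic numeric dataset in place of the credit_approval file from the repository:

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=15, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Trees are added one stage at a time, each fit to the gradient of the loss so far
clf = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1, max_depth=3, random_state=0)
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))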
ExtraTreesClassifier using Scikit-Learn
ExtraTreesClassifier is a supervised machine learning algorithm. It is a type of ensemble learning technique which fits a number of randomized decision trees (i.e., extra trees) on various sub-samples of the dataset. It contributes to reducing the variance of the model and results in less overfitting. GitHub address: github.com/randomaccess2023/MG2023/tree/main/Video 78 Important timestamps: 01...
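A minimal sketch on a built-in dataset (the dataset and settings are placeholders, not the ones from the video):

from sklearn.datasets import load_iris
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Each tree uses extra-randomized split thresholds, which lowers variance versus a single tree
clf = ExtraTreesClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))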
Quadratic Discriminant Analysis (QDA) using Scikit-Learn
Quadratic Discriminant Analysis (QDA) is a supervised machine learning algorithm. It is very similar to Linear Discriminant Analysis (LDA), except that it drops the assumption that the classes share the same covariance matrix. In other words, each class has its own covariance matrix. In this case, the boundary between classes is a quadratic surface instead of a hyperplane. GitHub address: github.com/randomacc...
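A minimal sketch with a built-in dataset (not the video's data), showing that QDA estimates one covariance matrix per class:

from sklearn.datasets import load_wine
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import train_test_split

X, y = load_wine(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = QuadraticDiscriminantAnalysis(store_covariance=True)
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))
print([c.shape for c in clf.covariance_])  # one covariance matrix per class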
CatBoost Classifier | Machine Learning | Python
Categorical Boosting (CatBoost) is a gradient-boosting algorithm for machine learning. Gradient boosting is a process in which many decision trees are constructed iteratively. In CatBoost, each successive tree is built with reduced loss compared to the previous trees. I used the mushrooms.csv dataset for this example. The dataset is available in the repository. It contains 2 types of mushrooms in t...
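A minimal sketch, assuming the catboost package is installed; a tiny hand-made categorical frame stands in for the repository's mushrooms.csv:

import pandas as pd
from catboost import CatBoostClassifier
from sklearn.model_selection import train_test_split

df = pd.DataFrame({
    "cap_color": ["brown", "white", "brown", "yellow"] * 25,
    "odor":      ["none", "foul", "almond", "none"] * 25,
    "target":    [0, 1, 0, 1] * 25,
})
X, y = df[["cap_color", "odor"]], df["target"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# CatBoost consumes raw string categories directly via cat_features
clf = CatBoostClassifier(iterations=200, depth=4, verbose=0)
clf.fit(X_tr, y_tr, cat_features=["cap_color", "odor"])
print(clf.score(X_te, y_te))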
Bagging Classifier using Scikit-Learn
Bagging is a supervised machine learning algorithm. It is an ensemble learning technique in which multiple base estimators are trained independently and in parallel on different subsets of the training data. The final prediction is made by aggregating all the predictions of the base estimators. I used the glass_identification.csv dataset in this example. The dataset is available in the repository. ...
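A minimal sketch on a built-in dataset (a stand-in for the glass_identification.csv file), bagging decision trees:

from sklearn.datasets import load_wine
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_wine(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# 50 trees, each trained on its own bootstrap sample; predictions are combined by voting
clf = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0)
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))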
Artificial neural network for regression task using PyTorch
A regression analysis in machine learning is used to investigate the relationship between one or more independent variables (treated as features) and a dependent variable (regarded as the outcome). It is a method for predictive modelling and is used to predict a continuous outcome. I used sklearn's california housing dataset for this example. This dataset has 8 features and I built a very simple ar...
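A minimal sketch, assuming PyTorch is available: a tiny fully connected network trained with full-batch gradient descent on the California housing data (the video's actual architecture and training loop may differ):

import torch
import torch.nn as nn
from sklearn.datasets import fetch_california_housing
from sklearn.preprocessing import StandardScaler

X, y = fetch_california_housing(return_X_y=True)
X = torch.tensor(StandardScaler().fit_transform(X), dtype=torch.float32)
y = torch.tensor(y, dtype=torch.float32).unsqueeze(1)

# 8 input features -> one continuous output
model = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):   # full-batch updates, for brevity
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()
print(loss.item())         # final training MSE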
Hartigan index using Python
The Hartigan index (HI) is a cluster evaluation technique. It is computed by taking the logarithm of the ratio between the between-cluster sum of squares (SSB) and the within-cluster sum of squares (SSW). GitHub address: github.com/randomaccess2023/MG2023/tree/main/Video 73 Important timestamps: 00:57 - Import required libraries 04:03 - Create data 05:07 - Perform preprocessing 05:19 - Perf...
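A minimal sketch of that computation for a KMeans clustering of synthetic blobs (the exact SSB/SSW bookkeeping in the video may differ):

import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

overall_mean = X.mean(axis=0)
ssw = km.inertia_  # sum of squared distances of samples to their own centroid
ssb = sum((km.labels_ == k).sum() * np.sum((c - overall_mean) ** 2)
          for k, c in enumerate(km.cluster_centers_))
hartigan_index = np.log(ssb / ssw)  # log of the SSB/SSW ratio
print(hartigan_index)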
Linear Discriminant Analysis using Scikit-Learn
Linear Discriminant Analysis (LDA) is a supervised machine learning algorithm. This approach is used in machine learning to solve classification problems with two or more classes. LDA fits a Gaussian density to each class, assuming all classes share the same covariance matrix. I used the raisin.xlsx dataset for this example. The dataset is available in the repository. It contains 2 types of raisins...
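A minimal sketch on a built-in two-class dataset (a stand-in for the raisin.xlsx file from the repository):

from sklearn.datasets import load_breast_cancer
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# One Gaussian per class with a shared covariance matrix -> linear decision boundary
clf = LinearDiscriminantAnalysis()
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))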
XGBoost Classifier | Machine Learning | Python API
eXtreme Gradient Boosting (XGBoost) is a gradient-boosting algorithm for machine learning. XGBoost builds a predictive model by combining the predictions of multiple individual models, often decision trees, in an iterative manner. I used the banknote_authentication.csv dataset for this example. The dataset is available in the repository. It contains 2 types of entities in the target column: 0 & 1. ...
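A minimal sketch, assuming the xgboost package is installed; a built-in 0/1 dataset stands in for banknote_authentication.csv:

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Boosted trees are added iteratively, each correcting the errors of the current ensemble
clf = XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))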
LightGBM Classifier | Machine Learning | Python API
Light Gradient-Boosting Machine (LightGBM) is a gradient-boosting algorithm for machine learning. It uses a histogram-based method in which data is bucketed into bins using a histogram of the distribution. I used the magic_gamma_telescope.csv dataset for this example. The dataset is available in the repository. It contains 2 types of entities in the target column: g=gamma(signal) & h=hadron(backgro...
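A minimal sketch, assuming the lightgbm package is installed; a synthetic binary dataset replaces magic_gamma_telescope.csv:

from lightgbm import LGBMClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Continuous features are bucketed into histogram bins before split finding
clf = LGBMClassifier(n_estimators=200, learning_rate=0.1, random_state=0)
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))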
AdaBoost Classifier using Scikit-Learn
AdaBoost Classifier is a supervised machine learning algorithm. AdaBoost is short for Adaptive Boosting and is used as an ensemble method in machine learning. The core principle of AdaBoost is to fit a sequence of weak learners (i.e., models that are only slightly better than random guessing, such as small decision trees) on repeatedly modified versions of the data. I used the dry_bean.xlsx dataset...
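A minimal sketch with a built-in multiclass dataset standing in for dry_bean.xlsx:

from sklearn.datasets import load_wine
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = load_wine(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Each boosting round reweights the samples the previous weak learners got wrong
clf = AdaBoostClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))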
Logistic Regression using Scikit-Learn
Logistic Regression is a supervised machine learning algorithm. Despite its name, it is used for classification tasks. In this model, the probabilities describing the possible outcomes of a single trial are modeled using a logistic function. I used the rice.csv dataset for this example. The dataset is available in the repository. It contains 2 types of Turkish rice: Cammeo & Osmancik. GitHub a...
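A minimal sketch on a built-in two-class dataset (a stand-in for rice.csv), showing the class probabilities produced by the logistic function:

from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=5000)
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))
print(clf.predict_proba(X_te[:3]))  # per-class probabilities from the logistic model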
Complement Naive Bayes using Scikit-Learn
Complement Naive Bayes is a supervised machine learning algorithm which has been used for a classification task in this example. This algorithm is a modification of Multinomial Naive Bayes and it works well in the case of imbalanced datasets. I used the ham_spam.csv dataset for this example. The dataset is available in the repository. It contains 2 types of emails: ham & spam. GitHub address: github...
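A minimal sketch with a tiny hand-made corpus standing in for the repository's ham_spam.csv; ComplementNB is fit on bag-of-words counts:

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import ComplementNB

texts = ["meeting at noon", "free prize click now", "lunch tomorrow?", "win cash now"]
labels = ["ham", "spam", "ham", "spam"]

vec = CountVectorizer()
X = vec.fit_transform(texts)          # sparse word-count matrix

clf = ComplementNB()
clf.fit(X, labels)
print(clf.predict(vec.transform(["claim your free cash now"])))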
Gaussian Naive Bayes using Scikit-Learn
Bernoulli Naive Bayes using Scikit-Learn
Feature to image representation using Matplotlib
Multinomial Naive Bayes using Scikit-Learn
Categorical Naive Bayes using Scikit-Learn
Random Forest using Scikit-Learn
Decision Tree using Scikit-Learn
Support Vector Machine (SVM) using Scikit-Learn
Train a CNN with data augmentation - Example using Flowers102 dataset
K-Nearest Neighbors using Scikit-Learn
Inset plotting using Matplotlib
Calculate the output shape of convolution, deconvolution and pooling layers in CNN
Conditional DDPM using PyTorch - Example with MNIST dataset
Calculate FID (Frechet Inception Distance) using PyTorch
Denoising Diffusion Probabilistic Model (DDPM) using PyTorch - Example with MNIST dataset