Thanks for watching! 🙌 If you're brand new to GridSearchCV, I recommend starting with this tutorial instead: ua-cam.com/video/Gol_qOgRqfA/v-deo.html
OMG, a classic "YouTuber explains how to add 1+1" BUT NO ONE ELSE SAYS HOW TO ADD 1+1!!! Thanks a bunch, you most likely saved me two hours of frustrating trial and error.
Happy to help! 🙌
I was literally on the hunt for a way to train 4 different models with different parameters all at once, and here you upload this. Perfect timing! Thanks, man! Love your content!
That's awesome to hear! So glad I could be helpful, and thanks for your kind words 🙏
Amazing video! There are very few videos on such unique topics on YouTube.
I had one doubt, though: I didn't understand the placeholder part at 2:15.
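For anyone else who got stuck on the placeholder part around 2:15: the estimator you put into the 'classifier' step is only a default, and each parameter dict in the list replaces it before fitting. A minimal sketch of the idea (the step names, models, and iris data here are my own examples, not necessarily the exact ones from the video):
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)

# 'classifier' starts out as LogisticRegression, but that object is only a
# placeholder: each dict below swaps in its own estimator via the step name.
pipe = Pipeline([
    ('scaler', StandardScaler()),
    ('classifier', LogisticRegression())
])

param_grid = [
    {'classifier': [LogisticRegression(max_iter=1000)],
     'classifier__C': [0.1, 1, 10]},
    {'classifier': [RandomForestClassifier()],
     'classifier__n_estimators': [100, 200]},
]

search = GridSearchCV(pipe, param_grid, cv=5)
search.fit(X, y)
print(search.best_params_)  # shows which model and which settings won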
Amazing video!
Does this work for RandomizedSearchCV as well?
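In recent scikit-learn versions, RandomizedSearchCV's param_distributions also accepts a list of dicts (one dict is sampled per candidate, then each parameter is drawn from its list or distribution), so the same trick should carry over; worth double-checking against your installed version's docs. A small sketch under that assumption, again with my own example models and data:
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import RandomizedSearchCV
from sklearn.pipeline import Pipeline

X, y = load_iris(return_X_y=True)
pipe = Pipeline([('classifier', LogisticRegression(max_iter=1000))])

# One dict is sampled uniformly per candidate, then each parameter is drawn
# from its list (or scipy distribution) inside that dict.
param_distributions = [
    {'classifier': [LogisticRegression(max_iter=1000)],
     'classifier__C': [0.1, 1, 10]},
    {'classifier': [RandomForestClassifier()],
     'classifier__n_estimators': [50, 100]},
]

search = RandomizedSearchCV(pipe, param_distributions, n_iter=4, cv=3, random_state=0)
search.fit(X, y)
print(search.best_params_)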
You're a good boy; this has streamlined my PhD research.
Thank you!
I don't doubt this works, but it seems a bit odd to specify an instantiated classifier in the pipeline only to override it with another instantiated classifier. Is there a way to make the classifier in the pipeline a generic placeholder?
You could, if you wrap it in a loop.
@DanielWeikert No need; I realized my question is kinda silly, considering it's no different than simply overriding default parameters. Instead of specifying attribute values, each classifier is just passed in as an object.
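And for anyone who still wants an explicitly generic placeholder rather than a "real" default model: any estimator object works there, because it gets replaced before fitting, so you can use something neutral like DummyClassifier to make the intent obvious (I believe recent Pipeline versions also accept 'passthrough' for a step, but the dummy is the safer bet). A sketch with my own example models:
from sklearn.datasets import load_iris
from sklearn.dummy import DummyClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# DummyClassifier is never actually fitted here: every parameter dict below
# replaces the 'classifier' step with a real model before fitting.
pipe = Pipeline([('classifier', DummyClassifier())])

param_grid = [
    {'classifier': [LogisticRegression(max_iter=1000)],
     'classifier__C': [0.1, 1, 10]},
    {'classifier': [SVC()],
     'classifier__kernel': ['linear', 'rbf']},
]

search = GridSearchCV(pipe, param_grid, cv=5)
search.fit(X, y)
print(search.best_params_)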
Thanks for saving us time; I used to do this with loops.
You're very welcome!
Nice tip!!
I'm solving a multi-output classification problem, so I'm using MultiOutputClassifier(), and I think that's what's messing up my code when I try to run this solution. It looks something like this:
pipeline = Pipeline([
    ('vect', CountVectorizer(tokenizer=tokenize)),
    ('tfidf', TfidfTransformer()),
    ('classifier', MultiOutputClassifier(lr_clf))
])

# parameter dict for logistic regression
params_lr = {
    'vect__decode_error': ['strict', 'ignore', 'replace'],
    'tfidf__norm': ['l1', 'l2'],
    'classifier__estimator__penalty': ['l1', 'l2'],
    'classifier__estimator__C': [0.1, 1, 10],
    'classifier__estimator': [lr_clf]
}

# other dicts for different models
# list of parameter dicts
parameters = [params_lr, params_svc, params_rf]

cv = GridSearchCV(pipeline, param_grid=parameters, n_jobs=-1)
cv.fit(X_train, y_train)
Any tips on this? I think GridSearchCV doesn't understand the MultiOutputClassifier.
Thanks in advance!!
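Two things stand out here (hedging, since the SVC/RF dicts aren't shown): first, 'penalty': ['l1', 'l2'] will fail if lr_clf uses LogisticRegression's default lbfgs solver, which only supports l2, so the l1 runs need solver='liblinear' or 'saga'; second, the other dicts also have to swap the wrapped model through 'classifier__estimator' (not 'classifier'), otherwise they replace the MultiOutputClassifier itself. A simplified, self-contained sketch of the nested-parameter pattern on toy multilabel data, with the text steps dropped and my own example models:
from sklearn.datasets import make_multilabel_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.multioutput import MultiOutputClassifier
from sklearn.pipeline import Pipeline

X, y = make_multilabel_classification(n_samples=200, n_classes=3, random_state=0)

pipe = Pipeline([
    ('classifier', MultiOutputClassifier(LogisticRegression()))
])

param_grid = [
    # logistic regression: liblinear supports both l1 and l2 penalties
    {'classifier__estimator': [LogisticRegression(solver='liblinear')],
     'classifier__estimator__penalty': ['l1', 'l2'],
     'classifier__estimator__C': [0.1, 1, 10]},
    # random forest: swapped in through the same classifier__estimator key
    {'classifier__estimator': [RandomForestClassifier()],
     'classifier__estimator__n_estimators': [50, 100]},
]

search = GridSearchCV(pipe, param_grid, cv=3, n_jobs=-1)
search.fit(X, y)
print(search.best_params_)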
Thanks, helpful
You're welcome!