Thanks for the great videos and packages Matt, they're fantastic. If I wanted to use tidymodels::tune to tune the hyperparameters with a regular grid do you recommend using a tune_spec and tree_grid to build into the workflow i.e.
tune_spec <- boost_tree(trees = tune(), tree_depth = tune()) %>%  # spec; the tuned parameters here are a guess, the original was cut off
  set_engine("xgboost") %>%
  set_mode("regression")
tree_grid <- grid_regular(trees(), tree_depth(), levels = 3)  # regular grid; the original call was cut off
wflw_xgb <- workflow() %>%
  add_model(tune_spec) %>%
  add_recipe(rec_xgb)
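For reference, a workflow like that would typically be tuned with `tune_grid()`. A minimal sketch, assuming the pieces above are assembled into a workflow (called `wflw_xgb` here) and that `folds` is some rsample resampling object:

```r
library(tune)

# Evaluate every grid combination on each resample.
tuned_results <- tune_grid(
    wflw_xgb,           # workflow containing model spec + recipe
    resamples = folds,  # placeholder: an rsample resampling object
    grid      = tree_grid
)
```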
Yes, that would work. The only thing to watch out for is the sampling strategy. For ML models you can get away with k-fold in most cases, but for sequence-based models like ETS or ARIMA you need a time series CV strategy. I go into detail in Module 13 of my time series course. I spend about 3 hours on hyperparameter tuning so you don't make the mistakes I've run into in the past. university.business-science.io/p/ds4b-203-r-high-performance-time-series-forecasting
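A minimal sketch of the time series CV strategy mentioned above, using `timetk::time_series_cv()` (the data name, column name, and window settings are placeholder assumptions):

```r
library(timetk)

# Sliding-origin resamples: each fold trains on an earlier window and
# assesses on the following 12 months, so the test data always comes
# strictly after the training data (unlike random k-fold).
resamples_tscv <- time_series_cv(
    data       = data_tbl,     # placeholder: your series, ordered by date
    date_var   = date,
    initial    = "2 years",    # first training window
    assess     = "12 months",  # forecast horizon per fold
    skip       = "6 months",   # spacing between fold origins
    cumulative = TRUE          # expanding training window
)
```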
Excellent video! I'll be investing in your R Track courses soon. Regards.
Thank you!!
Great video Matt, thanks! I can't wait to join your forecasting class once my company approves the program.
Are the classes kept up to date with the new methods you find?
Awesome! You’ll love the time series course. It’s insane. And yes - I keep it up to date. Each lab gets added as bonus material so you get all of the latest advances in the course content. 🙌
I didn't understand what will be forecasted at the end: multiple time series, a single time series (HOUSHOLD_2_101), or a combination? Confusing.
Is it also available in Python? It would be great if you could provide a link.
Hi Shweta, we did a Python Learning Lab for forecasting 100 time series. It's available through our Learning Labs PRO university.business-science.io/p/learning-labs-pro
I have the same problem. Do you save the model for each time series, or just save the results and rerun the model for the next prediction when new data comes in?
Does this approach have an advantage over ensemble modeling?
You can now do ensemble modeling. I will be presenting next week.
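For context, modeltime.ensemble supports simple averaging ensembles. A sketch, assuming `submodels_tbl` is a modeltime table of already-fitted models (the name is a placeholder):

```r
library(modeltime.ensemble)

# Average the forecasts of the fitted models in a modeltime table.
ensemble_fit <- submodels_tbl |>     # placeholder: modeltime_table(wflw_1, wflw_2, ...)
    ensemble_average(type = "mean")  # "median" is also supported
```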
ts_clean is resulting in an NA future forecast. I wonder why.
rec_2 <- recipe(value ~ date, data = data_tbl) |>  # data argument was cut off in the original; data_tbl is a placeholder
  step_ts_clean(value, period = 12)
wflw_ets <- workflow() |>
  add_model(
    exp_smoothing() |>
      set_engine("ets")
  ) |>
  add_recipe(rec_2)
nested_modeltime_refit_tbl <- nested_modeltime_tbl |>  # input table name was cut off; assumed here
  modeltime_nested_refit(control = control_nested_refit(allow_par = TRUE))
nested_modeltime_refit_tbl |>
extract_nested_future_forecast() |>
group_by(id) |>
plot_modeltime_forecast(
.interactive = TRUE,
.facet_ncol = 4
)
Clean removes outliers. It should replace NAs too, I would think.
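A quick way to see what the cleaning step does is `timetk::ts_clean_vec()`, the vector function behind `step_ts_clean()` (the toy series below is made up):

```r
library(timetk)

# Toy series with one obvious outlier; ts_clean_vec() replaces outliers
# (and any NAs) via interpolation rather than dropping them.
x <- c(1, 2, 3, 200, 5, 6)
ts_clean_vec(x, period = 1)
```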