XGBoost and HyperParameter Optimization

  • Published Oct 21, 2024

COMMENTS • 4

  • @edzme
    @edzme 1 month ago

    Thanks for making this — Coiled seems to be what I'm looking for.

  • @kamranpersianable
    @kamranpersianable 1 year ago +1

    Thanks, this is amazing! I have tried integrating Optuna hyperparameter search with Dask and it works great, but I have noticed that if I increase the number of iterations, at some point my system crashes due to insufficient memory. From what I can see, Dask keeps a copy of each iteration, so it ends up consuming more memory than needed. Is there any way I can release the memory after each iteration?

    • @Coiled
      @Coiled 1 year ago +1

      The copy that Dask keeps is just the result of the objective function (scores, metrics). This should be pretty lightweight.
      That's not to say that there isn't some memory leak somewhere (XGBoost, Pandas, ...). If you're able to provide a reproducer on the Dask issue tracker, that would be welcome. Alternatively, if you run on Coiled infrastructure, there are lots of measurement tools that run automatically and could help diagnose it.

    • @kamranpersianable
      @kamranpersianable 1 year ago

      @@Coiled Thanks, I will check further to see what is going wrong! From what I can see, over 500 iterations about 9 GB of extra memory accumulates.