Deep Learning Hyperparameter Tuning in Python, TensorFlow & Keras

  • Published 28 Dec 2021
  • Colab Notebook: colab.research.google.com/dri...
    Thank you for watching the video! You can learn data science FASTER at mlnow.ai!
    Master Python at mlnow.ai/course-material/python/!
    Learn SQL & Relational Databases at mlnow.ai/course-material/sql/!
    Learn NumPy, Pandas, and Python for Data Science at mlnow.ai/course-material/data...!
    Become a Machine Learning Expert at mlnow.ai/course-material/ml/!
    Don't forget to subscribe if you enjoyed the video :D

COMMENTS • 22

  • @GregHogg
    @GregHogg  8 months ago +1

    Take my courses at mlnow.ai/!

  • @billybobandboshow
    @billybobandboshow 2 years ago +2

    Thank you for this video! I have been learning about deep learning algorithms over the holiday break! Hope we see more videos from you! I love your channel and content! Keep up the awesome work, happy holidays and happy new year! :)

    • @GregHogg
      @GregHogg  2 years ago +1

      You're very welcome and thanks so much for the kind words! Awesome work, happy new year!!

  • @tigjuli
    @tigjuli 2 years ago +2

    Simple explanation, awesome video!

  • @rudrathakkar56
    @rudrathakkar56 2 years ago

    Thank you. I am learning deep learning, and this helped me a lot.

    • @GregHogg
      @GregHogg  2 years ago

      Perfect - Really glad to hear it!

  • @tomaszzielonka9808
    @tomaszzielonka9808 5 months ago +1

    @GregHogg Hi,
    I got stuck with Keras Tuner. It seems the code below only creates the 'model_builder' function once. If I change anything, like adding a Dropout layer, and rerun the function, it keeps displaying the message shown below the code, as if it were consistently reaching back to the first version of the function.
    Any clues on how to fix that? I would like to experiment with the 'model_builder' function (add/remove layers, dropouts, etc.) and then observe what parameters the tuner generates.
    import tensorflow as tf
    import keras_tuner as kt
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Dense

    def model_builder(hp):
        model = Sequential()
        hp_activation = hp.Choice('activation', values=['relu', 'tanh'])
        hp_layer_1 = hp.Int('layer_1', min_value=2, max_value=32, step=2)
        hp_layer_2 = hp.Int('layer_2', min_value=2, max_value=32, step=2)
        hp_learning_rate = hp.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])
        model.add(Dense(units=hp_layer_1, activation=hp_activation))
        model.add(Dense(units=hp_layer_2, activation=hp_activation))
        model.add(Dense(units=1, activation='sigmoid'))
        model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=hp_learning_rate),
                      loss='binary_crossentropy',
                      metrics=[tf.keras.metrics.Recall()])
        return model

    tuner = kt.Hyperband(model_builder,
                         objective=kt.Objective("val_recall", direction="max"),
                         max_epochs=50,
                         factor=3,
                         seed=42)

    Output: Reloading Tuner from .\untitled_project\tuner0.json
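
    (A likely fix, assuming Keras Tuner's documented arguments: Hyperband caches its search state under directory/project_name, here .\untitled_project, and reloads that state whenever the tuner is re-created, so later edits to 'model_builder' are ignored. Passing overwrite=True, or pointing at a fresh project_name, starts a clean search. A sketch:)

    tuner = kt.Hyperband(model_builder,
                         objective=kt.Objective("val_recall", direction="max"),
                         max_epochs=50,
                         factor=3,
                         seed=42,
                         directory="tuning",           # where search state is cached
                         project_name="experiment_2",  # hypothetical name; change it per experiment
                         overwrite=True)               # discard any previously cached state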

  • @BB-2383
    @BB-2383 21 days ago +1

    Side comment: we divide x by 255 because pixel intensities are stored in the range 0 to 255. RGB white is (255, 255, 255), so dividing converts it to (1, 1, 1), while black stays (0, 0, 0). So, an important note when training on images is to first convert the images to grayscale (a minimal sketch of the scaling follows this thread).

    • @GregHogg
      @GregHogg  17 days ago

      Yes thank you ☺️
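
    (A minimal sketch of the 0-255 to 0-1 scaling described above; the array values are made up for illustration:)

    import numpy as np

    # Pixel intensities are stored as integers in [0, 255]
    x = np.array([[0, 128, 255]], dtype=np.uint8)

    # Dividing by 255.0 rescales them to floats in [0.0, 1.0]
    x_scaled = x.astype('float32') / 255.0
    print(x_scaled)  # [[0.        0.5019608 1.       ]]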

  • @arsheyajain7055
    @arsheyajain7055 2 years ago +1

    Awesome video!!

    • @GregHogg
      @GregHogg  2 years ago +1

      Thanks a bunch Arsheya! Hope you're having a great holiday break :)

  • @dakshbhatnagar
    @dakshbhatnagar 1 year ago +2

    Great video, man, but tbh I was actually expecting some sort of automation of the hyperparameter tuning.

  • @haneulkim4902
    @haneulkim4902 2 years ago

    Thanks for an amazing video! Is there a way to tune hyperparameters like in sklearn without using keras-tuner?

    • @GregHogg
      @GregHogg  2 years ago +1

      You're very welcome! I'm sure there is, although I don't believe I've done it before.

    • @tigjuli
      @tigjuli 2 years ago

      Yes, there is. You have to define the model as a function and use KerasClassifier as a wrapper so it works with sklearn's GridSearchCV or RandomizedSearchCV. I'm sure there are videos on YouTube; a sketch follows below.
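
      (A minimal sketch of that approach, assuming the scikeras package, which replaces the deprecated tf.keras built-in wrapper; 'build_model' and the parameter grid are illustrative:)

      import tensorflow as tf
      from scikeras.wrappers import KerasClassifier
      from sklearn.model_selection import GridSearchCV

      def build_model(units=16, learning_rate=1e-3):
          # A small binary classifier whose width and learning rate we tune
          model = tf.keras.Sequential([
              tf.keras.layers.Dense(units, activation='relu'),
              tf.keras.layers.Dense(1, activation='sigmoid'),
          ])
          model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate),
                        loss='binary_crossentropy', metrics=['accuracy'])
          return model

      clf = KerasClassifier(model=build_model, epochs=10, verbose=0)

      # The 'model__' prefix routes these parameters into build_model
      param_grid = {'model__units': [8, 16, 32],
                    'model__learning_rate': [1e-2, 1e-3]}
      search = GridSearchCV(clf, param_grid, cv=3)
      # search.fit(X_train, y_train)  # X_train / y_train assumed available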

  • @prabinbasyal1049
    @prabinbasyal1049 2 years ago

    Can you suggest a data science course?
    I have already studied NumPy, pandas, and Matplotlib.

    • @GregHogg
      @GregHogg  2 years ago

      Awesome! IBM Data science is a great intro. Big big fan of Andrew Ng's deep learning as well.

  • @luisalbertoburbano9295
    @luisalbertoburbano9295 1 year ago +1

    Good afternoon. I have a task and I have not been able to create the Keras Tuner for 5,000 rows with 4 columns; in each column the numbers are random from 0 to 9, and I need an output of only 4 numbers. This is the code:
    import tensorflow as tf
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Bidirectional, LSTM, Dropout, Dense

    # Initialising the RNN
    model = Sequential()
    # Adding the input layer and the first LSTM layer
    model.add(Bidirectional(LSTM(neurons1,
                                 input_shape=(window_length, number_of_features),
                                 return_sequences=True)))
    # Adding a first Dropout layer
    model.add(Dropout(0.2))
    # Adding a second LSTM layer (input_shape is only needed on the first layer)
    model.add(Bidirectional(LSTM(neurons2, return_sequences=True)))
    # Adding a second Dropout layer
    model.add(Dropout(0.2))
    # Adding a third LSTM layer
    model.add(Bidirectional(LSTM(neurons3, return_sequences=True)))
    # Adding a fourth LSTM layer
    model.add(Bidirectional(LSTM(neurons4, return_sequences=False)))
    # Adding a fourth Dropout layer
    model.add(Dropout(0.2))
    # Adding the first output layer with ReLU activation function
    model.add(Dense(output_neurons, activation='relu'))
    # Adding the last output layer with softmax activation function
    model.add(Dense(number_of_features, activation='softmax'))

    Thank you very much
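
    (One way to put a model like this under Keras Tuner, following the 'model_builder' pattern from the video; the search ranges and the compile settings are illustrative assumptions, and window_length, number_of_features, and output_neurons are taken from the code above:)

    import keras_tuner as kt

    def model_builder(hp):
        # Tune the width of each Bidirectional LSTM and the dropout rate
        neurons1 = hp.Int('neurons1', min_value=16, max_value=128, step=16)
        neurons2 = hp.Int('neurons2', min_value=16, max_value=128, step=16)
        dropout = hp.Float('dropout', min_value=0.1, max_value=0.5, step=0.1)
        model = Sequential()
        model.add(Bidirectional(LSTM(neurons1,
                                     input_shape=(window_length, number_of_features),
                                     return_sequences=True)))
        model.add(Dropout(dropout))
        model.add(Bidirectional(LSTM(neurons2, return_sequences=False)))
        model.add(Dense(output_neurons, activation='relu'))
        model.add(Dense(number_of_features, activation='softmax'))
        model.compile(optimizer='adam',
                      loss='categorical_crossentropy',
                      metrics=['accuracy'])
        return model

    tuner = kt.Hyperband(model_builder,
                         objective='val_accuracy',
                         max_epochs=30,
                         overwrite=True)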