Autoencoder In PyTorch - Theory & Implementation

  • Published 3 Aug 2024
  • In this Deep Learning Tutorial we learn how Autoencoders work and how we can implement them in PyTorch.
    Get my Free NumPy Handbook:
    www.python-engineer.com/numpy...
    ✅ Write cleaner code with Sourcery, instant refactoring suggestions in VS Code & PyCharm: sourcery.ai/?... *
    ⭐ Join Our Discord : / discord
    📓 ML Notebooks available on Patreon:
    / patrickloeber
    If you enjoyed this video, please subscribe to the channel:
    ▶️ : / @patloeber
    Resources:
    www.cs.toronto.edu/~lczhang/3...
    Code: github.com/patrickloeber/pyto...
    More PyTorch Tutorials:
    Complete Beginner Course: • Deep Learning With PyT...
    Dataloader: PXOzkkB5eH0
    Transforms: • PyTorch Tutorial 10 - ...
    Model Class: • PyTorch Tutorial 06 - ...
    CNN: • PyTorch Tutorial 14 - ...
    ~~~~~~~~~~~~~~~ CONNECT ~~~~~~~~~~~~~~~
    🖥️ Website: www.python-engineer.com
    🐦 Twitter - / patloeber
    ✉️ Newsletter - www.python-engineer.com/newsl...
    📸 Instagram - / patloeber
    🦾 Discord: / discord
    ▶️ Subscribe: / @patloeber
    ~~~~~~~~~~~~~~ SUPPORT ME ~~~~~~~~~~~~~~
    🅿 Patreon - / patrickloeber
    #Python #PyTorch
    Timeline:
    00:00 - Theory
    02:58 - Data Loading
    05:30 - Simple Autoencoder
    15:02 - Training Loop
    17:00 - Plot Images
    19:00 - CNN Autoencoder
    29:12 - Exercise For You
    ----------------------------------------------------------------------------------------------------------
    * This is an affiliate link. By clicking on it you will not have any additional costs, instead you will support me and my project. Thank you so much for the support! 🙏

COMMENTS • 83

  • @patloeber  3 years ago +48

    Let me know if you enjoyed the new animations in the beginning and want to see this more in the future :)

    • @sumithhh9379  3 years ago

      Hi Patrick,
      Any plans for a series on state-of-the-art NLP models?

    • @MrDeyzel  3 years ago +1

      They're great

    • @md.musfiqurrahaman8612  2 years ago

      Love the animations and want more. I'm learning PyTorch and following your tutorials.

  • @jh-pq9tp  3 years ago +3

    Big thanks to you. I cannot imagine how I could have gotten through my DL course without your tutorials. Your work is the best on YouTube so far!

  • @starlite5097  3 years ago +4

    I love all your PyTorch videos, please do more :D

  • @saeeddamadi3823  3 years ago

    Thank you so much for clear presentation of Autoencoder!

  • @markavilin5020  2 years ago

    Very clear, thank you very much

  • @salimibrahim459  3 years ago +3

    Nice, was waiting for this :)

  • @pleasedontsubscribeme4397  3 years ago +1

    Great work!

  • @ingenuity8886  3 months ago

    Thank you so much, you explained it really well.

  • @adityasaini491  3 years ago +2

    Hey Patrick a really informative and concise video! Thoroughly enjoyed it :DD Just a small correction at 12:51, you used the word dimension while explaining the Normalize transform, whereas the two attributes are just the mean and standard deviation of the resultant normalized data.

    • @patloeber  3 years ago +1

      thanks for the hint!

  • @garikhakobyan3013  3 years ago

    Hello, nice videos. Looking forward to new videos on paper reviews and implementations.

  • @saadmunir1467  3 years ago +8

    It's really nice, but it would be a great addition to include variational autoencoders and generative adversarial networks as well :). They could help many people struggling with class imbalance in classification.

  • @harshkumaragarwal8326  3 years ago

    Great work!! Thanks :))

  • @satpalsinghrathore2665  1 year ago

    Very cool. Thank you.

  • @Jerrel.A  1 year ago

    Top-notch explanation! Thanks.

  • @huoguo7426  2 years ago +1

    Great video! Could you provide the same walkthrough for a variational autoencoder? Or point me to a good walkthrough of the theory and implementation of a variational autoencoder?

  • @anarkaliprabhakar6640  7 months ago

    Nice explanation

  • @shahinghasemi2346  3 years ago +1

    Thank you for your nice tutorials. Please do the same for non-image data; I'm curious to see CNN autoencoders with non-image data.

  • @ayankashyap5379  3 years ago

    At 22:17, when calculating the shape of the conv output, it should be 128*128*1 => 64*64*16, and the rest should differ accordingly.

  • @tilkesh  21 days ago

    for _ in range(100):
        print("Thank you")

  • @maharshipathak  3 months ago

    For Python 3.11+ / PyTorch 2.3+, change dataiter.next() to next(dataiter).
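
The fix above works because `next()` is the built-in that advances any Python iterator, and newer PyTorch versions removed the custom `.next()` method from the DataLoader iterator. A minimal sketch with a plain iterator (no PyTorch required; the data is a toy stand-in for (images, labels) batches):

```python
# next(it) uses the standard iterator protocol; it works on any iterator,
# including the one returned by iter(DataLoader(...)) in current PyTorch.
data = [("batch0", "labels0"), ("batch1", "labels1")]
dataiter = iter(data)

images, labels = next(dataiter)  # replaces the removed dataiter.next()
print(images)  # batch0
```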

  • @Mesenqe  3 years ago

    This channel is really good; I learned PyTorch from it. Guys, I assure you, subscribe to this channel.

    • @patloeber  3 years ago +1

      Thanks so much :) I appreciate the nice words.

  • @falklumo  1 year ago

    It should be noted that the performance difference between the linear and CNN versions shown here comes from the chosen compression factor. The linear model keeps 12 bytes per image, the CNN 256 bytes per image, where an original image is 784 bytes. So the CNN code does not compress enough, less than PNG actually! You need two more linear layers to compress 64 down to 16 and then 4.
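
The byte counts in this comment can be checked with quick arithmetic (a sketch assuming the sizes quoted above: a 3-value float32 linear bottleneck, a 64-value float32 CNN code, and one byte per pixel for the 28x28 original):

```python
# All sizes are taken from the comment above, purely for illustration.
original_bytes = 28 * 28 * 1     # 784: one byte per grayscale pixel
linear_code_bytes = 3 * 4        # 12: three float32 values
cnn_code_bytes = 64 * 4          # 256: sixty-four float32 values

print(original_bytes / linear_code_bytes)  # ~65x compression
print(original_bytes / cnn_code_bytes)     # ~3x compression
```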

  • @3stdv93  1 year ago

    Thanks ❤

  • @saurrav3801  3 years ago

    Bro always waiting for your pyt🔥rch video ....🤙🏼🤙🏼🤙🏼

  • @aisidamayomi8534  1 month ago

    Please, can you do one for network intrusion detection?

  • @user-ie1cv8su2f  3 years ago

    thank you!

  • @CodeWithTomi  3 years ago +2

    Yet another Pytorch video🔥

    • @devadharshan6328  3 years ago

      Can you help implement PyTorch with Django?

    • @patloeber  3 years ago +1

      man you are fast :D

    • @devadharshan6328  3 years ago

      Can you upload your GUI chatbot code to GitHub? I tried the code-along approach and learned the concept, but I got a few bugs. Can you upload it?

    • @patloeber  3 years ago

      I added the code here: github.com/python-engineer/python-fun

    • @devadharshan6328  3 years ago

      @patloeber Thanks!

  • @devadharshan6328  3 years ago

    Great animations. My suggestion is to add more animations, not only in the theory but also in the walkthrough of the code. Just my suggestion, but great video; thanks for your teaching.

    • @patloeber  3 years ago +1

      thanks! definitely a good idea

  • @ujjwalkumar-uf8nj  1 year ago

    Hey Patrick, I used your exact code to train the CNN-based autoencoder but couldn't get it to converge without batch normalization. After adding BatchNorm2d after every ReLU it works fine, but without it, it doesn't, even after trying learning rates from 1e-2 to 1e-5. I was training on the MNIST dataset only; the loss becomes NaN or stays between 0.10 and 0.09.
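
For anyone hitting the same NaNs, the fix this comment describes (a `BatchNorm2d` after each ReLU) can be sketched as follows; the layer sizes are illustrative and not taken from the video's exact code:

```python
import torch
import torch.nn as nn

# Encoder sketch: BatchNorm2d after each ReLU keeps activations in a
# stable range, which often prevents the loss from diverging to NaN.
encoder = nn.Sequential(
    nn.Conv2d(1, 16, 3, stride=2, padding=1),   # 28x28 -> 14x14
    nn.ReLU(),
    nn.BatchNorm2d(16),
    nn.Conv2d(16, 32, 3, stride=2, padding=1),  # 14x14 -> 7x7
    nn.ReLU(),
    nn.BatchNorm2d(32),
)

out = encoder(torch.randn(8, 1, 28, 28))
print(out.shape)  # torch.Size([8, 32, 7, 7])
```

The same pattern can be mirrored in the decoder after its ConvTranspose2d layers.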

  • @DiegoAndresAlvarezMarin  2 years ago

    Beautifully explained!! thank you!

  • @marytajz1814  3 years ago +1

    Your tutorials are amazing! Thank you so much... Could you please make a video on nn.Embedding as well?

    • @patloeber  3 years ago +1

      I'll have a look at it

  • @ArponBiswas-wq3sh  4 months ago

    Very nice, but we need more.

  • @user-wr4yl7tx3w  2 years ago

    But how do we leverage the low-dimensional embeddings, given that they resemble a PCA?

  • @lankanathaekanayake7680  2 years ago

    Is it possible to use sentences as input data?

  • @mojojojo890  1 year ago

    Which link explains how you build the PyTorch model classes, please?

  • @amzadhossain8118  3 years ago

    Can you make a video on DnCNN?

  • @martinmayo8197  2 years ago +1

    I don't quite understand the syntax. Why do you define the method 'forward' but never call it explicitly? Maybe the line "recon = model(img)" is where you are using it, but I didn't know it could be done like this. I would have written "recon = model.forward(img)"; is it the same?
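
Short answer: yes, for a plain module `model(img)` and `model.forward(img)` return the same result, because `nn.Module` implements `__call__`, which runs `forward` (plus any registered hooks, which calling `.forward()` directly would skip). The mechanism can be mimicked without PyTorch:

```python
# Toy stand-in for nn.Module's calling convention: __call__ delegates
# to forward(), which is why writing model(img) runs your forward method.
class Module:
    def __call__(self, *args, **kwargs):
        # the real nn.Module also runs registered hooks around this call
        return self.forward(*args, **kwargs)

class Doubler(Module):
    def forward(self, x):
        return 2 * x

model = Doubler()
print(model(21))          # 42, routed through __call__ -> forward
print(model.forward(21))  # 42, same math, but would skip hooks
```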

  • @marinacarnemolla5515  3 years ago

    Hi, I have a question: if we pass the image as the input of the model, it can just set the weights so that the output is exactly the same as the input image. So why is the image given as the input of the model? It doesn't make sense to me. Could you explain this to me?

  • @khushpatelmd  3 years ago

    If you normalize the input image, which is also the label, the values will be between -1 and +1, but your output, since it is passed through a sigmoid, will be between 0 and 1. How will you decrease the loss for pixels between -1 and 0, since your predictions will never be less than 0?

    • @anonim5052  3 months ago

      You need to change the sigmoid at the end to a tanh, so the output will also be between -1 and 1.
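
The reply above works because tanh's range is (-1, 1) while sigmoid's is (0, 1): a sigmoid output can never match a normalized pixel in [-1, 0). A PyTorch-free sketch of the two ranges:

```python
import math

def sigmoid(x):
    """Logistic sigmoid: output is always strictly between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-x))

# sigmoid stays positive, so negative pixel targets are unreachable
print(sigmoid(-10.0))    # ~0.000045, still > 0
# tanh spans the whole normalized range (-1, 1)
print(math.tanh(-10.0))  # ~-1.0
print(math.tanh(10.0))   # ~1.0
```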

  • @astridbrenner2957  3 years ago +7

    This channel is so underrated. Please upload tutorials about Django.

  • @teetanrobotics5363  3 years ago

    You're the best.

  • @haikbenlian5466  2 years ago

    How did you find that the image size decreased from 28 to 14?
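
The halving comes from the stride-2 convolution. With the standard Conv2d output-size formula, out = floor((in + 2*padding - kernel) / stride) + 1, and assuming kernel 3 with padding 1 (typical settings for a downsampling conv), 28 maps to 14:

```python
def conv2d_out_size(size, kernel, stride=1, padding=0):
    """Per-dimension output size of a Conv2d layer (dilation 1)."""
    return (size + 2 * padding - kernel) // stride + 1

# Assuming kernel=3, stride=2, padding=1 for the downsampling convs:
print(conv2d_out_size(28, kernel=3, stride=2, padding=1))  # 14
print(conv2d_out_size(14, kernel=3, stride=2, padding=1))  # 7
```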

  • @736939  3 years ago

    Can you please show how to work with variational autoencoders and applications such as image segmentation?

    • @patloeber  3 years ago +1

      will look into this!

    • @736939  3 years ago

      @patloeber Thank you. It's hard for me to program it myself.

  • @hadisizadiyekta125  2 years ago

    You used recon and img as inputs to the loss function; however, if we want to train and test the model, we should use "recon" and "labels" as inputs to the loss function. But the labels are 3D; how can we do that?

    • @121horaa  23 days ago

      Since an autoencoder is an unsupervised technique, recon and img are used as inputs to the loss function.
      But in semi-supervised or supervised methods you have labels, so we use them against the predicted values in the loss function.
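
The distinction in this reply can be sketched in a few lines: an autoencoder's target is its own input, so the loss compares the reconstruction against the image rather than against class labels (plain-Python MSE here purely for illustration; the video uses `nn.MSELoss`):

```python
def mse(a, b):
    """Mean squared error between two equal-length flat lists."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

img = [0.0, 0.5, 1.0]    # toy flattened input image
recon = [0.1, 0.4, 0.9]  # the autoencoder's reconstruction of it

# Unsupervised: the input itself is the training target.
loss = mse(recon, img)
print(loss)  # ~0.01
```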

  • @YounasKhan-vm8nr  4 months ago

    Do you have anything specific for face images? This won't work on face images.

  • @avivalviannur5610  9 months ago

    I tried to rerun your code for the CNN autoencoder part, but I got loss = NaN in each epoch. Do you know what is wrong?

  • @Saens406  2 years ago

    Why is there no requires_grad there?

  • @vallisham1756  3 years ago +1

    module 'torch.nn' has no attribute 'ReLu'
    Is anyone else getting the same error? (Note the spelling: the class is nn.ReLU.)

  • @AshishSingh-753  3 years ago

    Next video is on GAN

  • @roshinroy5129  1 year ago +1

    Am I the only one encountering NaN values while training this?

    • @pingpong3904  1 year ago

      On one of my virtual machines I also get NaN values when using torch 2.0.1. I tried a couple of things, but it only works fine with torch 1.12.1.
      On my desktop PC it works with torch 2.0.1, though. I do not know why.

  • @anirudhjoshi1607  2 years ago +2

    dude my CNN autoencoder is doing worse than the linear autoencoder, lmao

    • @marc2911  1 year ago +1

      Me too; the outputs show strange padding artifacts as well.

    • @pingpong3904  1 year ago

      @marc2911 On one of my virtual machines I also get NaN values or strange artifacts when using torch 2.0.1. I tried a couple of things, but it only works fine with torch 1.12.1.
      On my desktop PC it works with torch 2.0.1, though. I do not know why.

  • @ryanhoward5999  2 years ago

    "Jew-Pie-Tar notebook"