Pruning a neural network for faster training times

  • Published 20 Nov 2022
  • Neural networks and neural-network-based architectures are powerful models that can tackle abstract problems, but they are known for taking a long time to train. In this video, we learn a new technique: pruning neural networks (a minimal code sketch follows the description below).
    Previous lesson: • How to select the corr...
    Next lesson: • How to Use Learning Ra...
    📙 Here is a lesson notes booklet that summarizes everything you learn in this course in diagrams and visualizations. You can get it here 👉 misraturp.gumroad.com/l/fdl
    👩‍💻 You can get access to all the code I develop in this course here: github.com/misraturp/Deep-lea...
    ❓To get the most out of the course, don't forget to answer the end of module questions:
    fishy-dessert-4fc.notion.site...
    👉 You can find the answers here:
    fishy-dessert-4fc.notion.site...
    RESOURCES:
    🏃‍♀️ Data Science Kick-starter mini-course: www.misraturp.com/courses/dat...
    🐼 Pandas cheat sheet: misraturp.gumroad.com/l/pandascs
    📥 Streamlit template (updated in 2023, now for $5): misraturp.gumroad.com/l/stemp
    📝 NNs hyperparameters cheat sheet: www.misraturp.com/nn-hyperpar...
    📙 Fundamentals of Deep Learning in 25 pages: misraturp.gumroad.com/l/fdl
    COURSES:
    👩‍💻 Hands-on Data Science: Complete your first portfolio project: www.misraturp.com/hods
    🌎 Website - misraturp.com/
    🐥 Twitter - / misraturp
  • Science & Technology
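
For readers who want to see the idea in code, here is a minimal sketch of magnitude-based pruning using PyTorch's torch.nn.utils.prune module; the course code linked above may use a different framework, and the layer sizes and the 30% pruning ratio are assumptions for illustration only.

    import torch
    import torch.nn as nn
    import torch.nn.utils.prune as prune

    # A small hypothetical fully connected network (sizes are illustrative).
    model = nn.Sequential(
        nn.Linear(784, 128),
        nn.ReLU(),
        nn.Linear(128, 10),
    )

    # Zero out the 30% of weights with the smallest magnitude in each Linear layer.
    for module in model.modules():
        if isinstance(module, nn.Linear):
            prune.l1_unstructured(module, name="weight", amount=0.3)

    # Make the pruning permanent (drops the mask and the saved original weights).
    for module in model.modules():
        if isinstance(module, nn.Linear):
            prune.remove(module, "weight")

    # Roughly 30% of the weights are now exactly zero.
    total = sum(p.numel() for p in model.parameters())
    zeros = sum((p == 0).sum().item() for p in model.parameters())
    print(f"sparsity: {zeros / total:.2%}")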

COMMENTS • 5

  • @user-vf5xf9pi4x
    @user-vf5xf9pi4x 10 months ago +1

    Pruning the network on its own doesn't actually speed it up.
    It can actually increase inference latency
    (because most of the AI frameworks we use don't support sparse matrix computation).
    The most important thing is to install a kernel that supports sparse computation on your system.
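
    A small sketch of the point above, assuming PyTorch (the matrix sizes and the 90% sparsity level are made-up numbers): weights that are merely zeroed in a dense tensor still go through a dense matmul, so pruning only helps latency when the weights are stored in a sparse format and the backend provides sparse kernels.

        import torch

        weight = torch.randn(2048, 2048)
        x = torch.randn(2048, 256)

        # "Prune" 90% of the weights by zeroing the smallest magnitudes.
        threshold = weight.abs().quantile(0.9)
        pruned = torch.where(weight.abs() < threshold, torch.zeros_like(weight), weight)

        # Dense matmul: the zeros are still multiplied, so no speed-up is expected.
        dense_out = pruned @ x

        # Only a sparse representation plus a backend with sparse kernels can
        # actually skip the zeros, e.g. a COO sparse tensor with torch.sparse.mm.
        sparse = pruned.to_sparse()
        sparse_out = torch.sparse.mm(sparse, x)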

  • @kartikpodugu
    @kartikpodugu 10 months ago

    Can pruning be done after training is complete?
    For example, I have a pre-trained model.
    With PTQ (post-training quantization) we quantize the model after training; similarly, can we also do post-training pruning?
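
    For reference, post-training (one-shot) magnitude pruning of an already-trained model is a common workflow. Below is a hedged sketch using PyTorch's pruning utilities; the checkpoint path "model.pt" and the 20% global pruning ratio are hypothetical, and a short fine-tuning pass afterwards typically helps recover accuracy.

        import torch
        import torch.nn as nn
        import torch.nn.utils.prune as prune

        # Load an already-trained model ("model.pt" is a hypothetical checkpoint).
        model = torch.load("model.pt")

        # Collect Linear/Conv weights and prune the smallest 20% globally.
        to_prune = [
            (m, "weight")
            for m in model.modules()
            if isinstance(m, (nn.Linear, nn.Conv2d))
        ]
        prune.global_unstructured(
            to_prune,
            pruning_method=prune.L1Unstructured,
            amount=0.2,
        )

        # Bake the masks into the weights and save the pruned model.
        # A short fine-tuning pass afterwards usually helps recover accuracy.
        for module, name in to_prune:
            prune.remove(module, name)
        torch.save(model, "model_pruned.pt")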

  • @starman7906
    @starman7906 1 year ago +1

    Thanks but too short.