DL4CV@WIS (Spring 2021) Tutorial 13: Training with Multiple GPUs

  • Published 28 Sep 2024
  • Model Parallel, Gradient Accumulation, Data Parallel with PyTorch, Larger Batches
    Lecturer: Shai Bagon

COMMENTS • 15

  • @prachigarg8579
    @prachigarg8579 2 years ago +17

    This tutorial is so underrated! Hands down the clearest and most in-depth explanation of DDP for someone who doesn't know multiprocessing in PyTorch. I came across this after watching 4-5 other videos. Strongly recommend this one.

  • @quantumjun
    @quantumjun 2 years ago

    I think the questions are excellent

  • @amortalbeing
    @amortalbeing 2 years ago

    Thanks a lot. Really enjoyed it. God bless you all.

  • @janasandeep
    @janasandeep 9 months ago

    21:19 Where does the averaging of gradients happen? On the CPU, as shown in the animation? Or do all the GPUs talk to each other directly, so the averaging happens on each GPU?

    • @shaibagon
      @shaibagon 9 months ago

      It depends on the hardware you have and the backend you are using. I suppose with NVIDIA servers and the nccl backend it all happens between the GPUs without CPU involvement; the connection is device-to-device.
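
      A minimal sketch of that setup, assuming a single machine with multiple NVIDIA GPUs and the nccl backend; the rendezvous address/port and the toy linear model are placeholders, not anything from the tutorial. Calling backward() on the DDP-wrapped model triggers the all-reduce that averages gradients directly between the GPUs:

      import os
      import torch
      import torch.distributed as dist
      import torch.multiprocessing as mp
      from torch.nn.parallel import DistributedDataParallel as DDP

      def worker(rank, world_size):
          # Placeholder rendezvous settings for this sketch.
          os.environ["MASTER_ADDR"] = "localhost"
          os.environ["MASTER_PORT"] = "29500"
          # With the nccl backend the gradient all-reduce runs GPU-to-GPU
          # (NVLink / PCIe peer-to-peer); the CPU only coordinates the processes.
          dist.init_process_group("nccl", rank=rank, world_size=world_size)

          model = torch.nn.Linear(10, 1).to(rank)
          ddp_model = DDP(model, device_ids=[rank])

          x = torch.randn(8, 10, device=rank)
          ddp_model(x).sum().backward()  # gradients are averaged across ranks here

          dist.destroy_process_group()

      if __name__ == "__main__":
          world_size = torch.cuda.device_count()
          mp.spawn(worker, args=(world_size,), nprocs=world_size)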

  • @duongkstn
    @duongkstn 2 years ago

    thanks

  • @wtfbro9834
    @wtfbro9834 5 months ago

    Sir, if I have more data, say more than 100 GB, which cannot be stored in Google Colab, how should I approach training my model on the whole data?

  • @pizhichil
    @pizhichil 2 years ago +1

    I have a question. The train function runs on each process independently of the others (the train functions running on the other processes). Within train, an epoch may finish at different times for each train function. How does PyTorch distributed know when it is time to synchronize gradients? BTW, this is the best lecture I have seen on this topic :+1:

    • @shaibagon
      @shaibagon 9 months ago

      All processes are synchronized at every gradient update.
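
      A minimal sketch of where that synchronization happens, under the same assumptions as the sketch above (single machine, nccl backend, placeholder dataset, model, and rendezvous settings): the implicit barrier is the gradient all-reduce inside backward(), so every rank waits for the others at each backward pass rather than at the end of an epoch:

      import os
      import torch
      import torch.distributed as dist
      import torch.multiprocessing as mp
      from torch.nn.parallel import DistributedDataParallel as DDP
      from torch.utils.data import DataLoader, TensorDataset
      from torch.utils.data.distributed import DistributedSampler

      def worker(rank, world_size):
          os.environ["MASTER_ADDR"] = "localhost"   # placeholder rendezvous settings
          os.environ["MASTER_PORT"] = "29500"
          dist.init_process_group("nccl", rank=rank, world_size=world_size)

          dataset = TensorDataset(torch.randn(256, 10), torch.randn(256, 1))
          sampler = DistributedSampler(dataset, num_replicas=world_size, rank=rank)
          loader = DataLoader(dataset, batch_size=32, sampler=sampler)

          model = DDP(torch.nn.Linear(10, 1).to(rank), device_ids=[rank])
          optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
          loss_fn = torch.nn.MSELoss()

          for epoch in range(2):
              sampler.set_epoch(epoch)  # reshuffle differently each epoch
              for x, y in loader:
                  optimizer.zero_grad()
                  loss = loss_fn(model(x.to(rank)), y.to(rank))
                  # backward() launches the gradient all-reduce: each rank blocks
                  # here until every other rank reaches the same point, so the
                  # processes sync once per iteration, not once per epoch.
                  loss.backward()
                  optimizer.step()

          dist.destroy_process_group()

      if __name__ == "__main__":
          world_size = torch.cuda.device_count()
          mp.spawn(worker, args=(world_size,), nprocs=world_size)

      Because the DistributedSampler gives every rank the same number of batches, all ranks issue the same number of backward calls and stay in lockstep even if they reach a given iteration at slightly different wall-clock times.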

  • @mehershashwatnigam5581
    @mehershashwatnigam5581 7 months ago

    Thanks a lot for this, helped with my interview prep!

  • @rexi4238
    @rexi4238 1 year ago

    Really good and clear, thank you for this video!

  • @AttiDavidson
    @AttiDavidson 2 years ago

    Thank you very much. A very good presentation, comprehensive and clear.

  • @wenyuehua9558
    @wenyuehua9558 2 years ago

    so clear and well-explained. Thank you very much

  • @haiw
    @haiw 1 year ago

    super clear! Thanks!

  • @jacksmith6242
    @jacksmith6242 2 years ago

    so clear, so great