Accelerate PyTorch workloads with Cloud TPUs and OpenXLA

  • Published Jun 26, 2024
  • Google Cloud AI accelerators enable high-performance, cost-effective training and inference for leading AI/ML frameworks. In this session, get the latest news on PyTorch/XLA, developed collaboratively by Google, Meta, and partners in the AI ecosystem, which uses OpenXLA to accelerate PyTorch workloads (a brief usage sketch follows the description below).
    Github → goo.gle/3xQeFaC
    Cloud Tensor Processing Units (TPUs) → goo.gle/tpu
    Speaker: Shauheen Zahirazami
    Watch more:
    Check out all the AI videos at Google I/O 2024 → goo.gle/io24-ai-yt
    Check out all the Cloud videos at Google I/O 2024 → goo.gle/io24-cloud-yt
    Check out all the Mobile videos at Google I/O 2024 → goo.gle/io24-mobile-yt
    Check out all the Web videos at Google I/O 2024 → goo.gle/io24-web-yt
    Subscribe to Google Developers → goo.gle/developers
    #GoogleIO
    Products Mentioned: Cloud - AI and Machine Learning - Cloud TPU
    Event: Google I/O 2024
  • Science & Technology
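
PyTorch/XLA lowers ordinary PyTorch programs to OpenXLA so they can run on Cloud TPUs. As a rough illustration only (not taken from the session), the sketch below shows a single training step on an XLA device; the toy model, tensor shapes, and random data are assumptions for demonstration.

```python
# Minimal sketch: one training step with PyTorch/XLA on a Cloud TPU.
# The model, shapes, and data here are illustrative assumptions.
import torch
import torch.nn as nn
import torch_xla.core.xla_model as xm

device = xm.xla_device()                    # acquire the XLA (TPU) device

model = nn.Linear(128, 10).to(device)       # toy model for demonstration
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# One illustrative step on random data; real code would iterate over a DataLoader.
inputs = torch.randn(64, 128, device=device)
targets = torch.randint(0, 10, (64,), device=device)

optimizer.zero_grad()
loss = loss_fn(model(inputs), targets)
loss.backward()
optimizer.step()
xm.mark_step()                              # flush the lazily built XLA graph for execution
```

XLA builds the computation graph lazily, so xm.mark_step() (or the higher-level training loop utilities in PyTorch/XLA) is what triggers compilation and execution on the TPU.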
