ClimaX - A foundation model for weather and climate

  • Published Sep 6, 2024
  • Tung Nguyen, PhD student in Computer Science, UCLA
    Recent data-driven approaches based on machine learning aim to directly solve a downstream forecasting or projection task by learning a data-driven functional mapping using deep neural networks. However, these networks are trained using curated and homogeneous climate datasets for specific spatiotemporal tasks, and thus lack the generality of currently used computationally intensive physics-informed numerical models for weather and climate modeling. We develop and demonstrate ClimaX, a flexible and generalizable deep learning model for weather and climate science that can be trained using heterogeneous datasets spanning different variables, spatio-temporal coverage, and physical groundings. ClimaX extends the Transformer architecture with novel encoding and aggregation blocks that allow effective use of available compute and data while maintaining general utility. ClimaX is pretrained with a self-supervised learning objective on climate datasets derived from CMIP6. The pretrained ClimaX can then be fine-tuned to address a breadth of climate and weather tasks, including those that involve atmospheric variables and spatio-temporal scales unseen during pretraining. Compared to existing data-driven baselines, we show that this generality in ClimaX results in superior performance on benchmarks for weather forecasting and climate projections, even when pretrained at lower resolutions and compute budgets.
    Bio:
    Tung Nguyen is a second-year PhD student in Computer Science at UCLA. His research interests lie in the intersection of decision making, sequence modeling, and uncertainty quantification. He is also interested in grounding his research in applications to sustainability, especially climate change.
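The variable-wise tokenization and aggregation described in the abstract can be sketched in miniature. This is a hedged NumPy illustration of the idea, not the ClimaX implementation: the function names (`embed_variables`, `aggregate_variables`) and all dimensions are assumptions made for this example. Each climate variable gets its own patch embeddings plus a variable-identity embedding, and a cross-attention step with a learned query collapses the variable axis so the Transformer's sequence length stays proportional to the number of spatial patches, not variables times patches.

```python
import numpy as np

# Illustrative sketch only: names and shapes are assumptions,
# not taken from the ClimaX codebase.
rng = np.random.default_rng(0)
V, P, D = 5, 16, 8  # number of variables, spatial patches, embedding dim

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def embed_variables(tokens, var_embed):
    # tokens: (V, P, D) per-variable patch embeddings
    # var_embed: (V, D) learned identity embedding, one per variable
    return tokens + var_embed[:, None, :]

def aggregate_variables(tokens, query):
    # Cross-attention with a single learned query, applied per spatial
    # position: attends over the variable axis and collapses V tokens
    # into one, keeping the sequence length at P instead of V * P.
    scores = np.einsum('d,vpd->vp', query, tokens) / np.sqrt(tokens.shape[-1])
    weights = softmax(scores, axis=0)  # normalize over variables
    return np.einsum('vp,vpd->pd', weights, tokens)

tokens = rng.standard_normal((V, P, D))
var_embed = rng.standard_normal((V, D))
query = rng.standard_normal(D)

agg = aggregate_variables(embed_variables(tokens, var_embed), query)
print(agg.shape)  # one aggregated token per patch, regardless of V
```

The payoff of this design is that datasets with different variable sets can share one backbone: adding or dropping a variable changes only the aggregation input, not the Transformer's sequence length.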

COMMENTS • 2

  • @YourHoss · 1 year ago +1

    Sounds truly groundbreaking. As a layman, I find it amazing that this works in the same "next-token" way as the LLMs that have become so popular this year. I do wonder, how much memory is "too much"? Could we throw more memory at this problem, and even if scaling is quadratic, is it feasible at some grand scale? We currently spend a lot of money on supercomputers; what if the same amount of resources were available for ClimaX?
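The quadratic scaling raised in this comment can be made concrete with a back-of-envelope calculation. This sketch assumes illustrative values (16 heads, 32-bit floats, round token counts); none of these numbers come from ClimaX's actual configuration.

```python
# Back-of-envelope cost of self-attention score matrices; all
# parameter values here are illustrative assumptions, not ClimaX's.
def attention_matrix_bytes(seq_len, n_heads=16, bytes_per_float=4):
    # One (seq_len x seq_len) score matrix per head.
    return n_heads * seq_len * seq_len * bytes_per_float

# Doubling the token count (e.g. finer spatial resolution)
# quadruples the attention memory per layer.
for seq_len in (2048, 4096, 8192):
    gib = attention_matrix_bytes(seq_len) / 2**30
    print(f"{seq_len} tokens -> {gib:.2f} GiB per layer")
# 2048 tokens -> 0.25 GiB per layer
# 4096 tokens -> 1.00 GiB per layer
# 8192 tokens -> 4.00 GiB per layer
```

So throwing more memory at the problem buys resolution only at a square-root rate, which is one reason supercomputer-scale budgets do not translate linearly into finer grids.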

  • @chilinh2206 · 1 year ago +1

    👍👍👍👍👍