A Technical Deep Dive into Unity Catalog's Practitioner Playbook

  • Published 2 Oct 2024
  • Get ready for a deep dive into Unity Catalog and how it can simplify data, analytics, and AI governance across multiple clouds. In this session, the Databricks team guides you through a hands-on demo showcasing the latest features and best practices for data governance, giving you a practical understanding of how Unity Catalog can streamline your analytics and AI initiatives. Whether you're migrating from Hive Metastore or just looking to expand your knowledge of Unity Catalog, this session is for you.
    Talk by: Zeashan Pappa and Ifigeneia Derekli
    Connect with us: Website: databricks.com
    Twitter: / databricks
    LinkedIn: / databricks
    Instagram: / databricksinc
    Facebook: / databricksin

COMMENTS • 5

  • @neelred10
    @neelred10 1 year ago +4

    Parameterizing the catalog name in Python seems straightforward. But notebooks and dashboards that use Spark SQL queries usually reference tables in schema.table format, and the same code goes into Git and is deployed to dev/qa/prod environments. How can we handle this when we move to UC and have a different catalog for each environment? 59:05

    • @RyanCoffman-mb6ye
      @RyanCoffman-mb6ye 1 year ago

      A text-based widget would work for notebooks and dashboards.
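      A minimal sketch of the widget approach suggested above. The widget name `catalog`, the `dev` default, and the helper function are assumptions for illustration, not from the talk:

      ```python
      # In a Databricks notebook, a text widget lets the deployment pipeline
      # inject the environment's catalog (widget name "catalog" and default
      # "dev" are assumed here):
      #
      #   dbutils.widgets.text("catalog", "dev")
      #   catalog = dbutils.widgets.get("catalog")
      #   spark.sql(f"USE CATALOG {catalog}")  # schema.table now resolves in that catalog
      #
      # A small hypothetical helper keeps three-level names consistent when a
      # query must qualify the catalog explicitly:

      def qualified_name(catalog: str, schema: str, table: str) -> str:
          """Return a three-level Unity Catalog name like 'dev.sales.orders'."""
          return f"{catalog}.{schema}.{table}"

      print(qualified_name("dev", "sales", "orders"))
      ```

      With `USE CATALOG` set once per run, existing schema.table references in SQL cells and dashboards can stay unchanged across dev/qa/prod.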

  • @neelred10
    @neelred10 7 months ago +1

    If the Snowflake server is not publicly accessible, can it still be federated?

  • @allthingsdata
    @allthingsdata 11 months ago

    Excellent talk with a good deep dive addressing practical issues.

  • @snehotoshbanerjee1938
    @snehotoshbanerjee1938 10 months ago

    Great content!!