Advancing Spark - Understanding the Unity Catalog Permission Model

  • Published 5 Oct 2024

COMMENTS • 15

  • @aqlanable
    @aqlanable 2 years ago +5

    I think these important parts are worth mentioning, e.g. credentials, external locations, and how to migrate from the current Hive metastore to Unity Catalog.
    I have a blog post in draft on my WordPress; if it's okay, I can post it here.
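
The pieces this comment lists map onto concrete Unity Catalog securables. A minimal sketch, assuming a storage credential named `lake_cred` already exists (those are created in the account/workspace UI or API) and using illustrative container, catalog, and table names:

```sql
-- Hedged sketch: an external location over an existing storage credential,
-- plus one simple pattern for migrating a table out of the legacy Hive metastore.
CREATE EXTERNAL LOCATION IF NOT EXISTS raw_landing
  URL 'abfss://raw@mystorageaccount.dfs.core.windows.net/'
  WITH (STORAGE CREDENTIAL lake_cred);

-- External locations are secured like any other Unity Catalog object.
GRANT READ FILES, WRITE FILES ON EXTERNAL LOCATION raw_landing TO `data_engineers`;

-- The legacy metastore is surfaced as the `hive_metastore` catalog, so a
-- deep clone (or CTAS) into a Unity Catalog schema is one way to move a table.
CREATE TABLE main.sales.orders
  DEEP CLONE hive_metastore.sales.orders;
```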

  • @allthingsdata
    @allthingsdata 2 years ago +1

    We can't really use Unity effectively, as we aim for a client-agnostic data access model and Unity assumes that you always go through it, so it centralizes the authorization layer, which goes against the open, client-agnostic lakehouse approach imo. Of course, you could have Databricks permissions managed via Unity service principals plus other permissions on the storage layer managed via RBAC + ACLs, but that's double the effort. We currently prefer one auth layer that works for all tools that can do AD passthrough or obtain an Azure AD token and is enforced on the storage layer. Of course this has drawbacks too, e.g. it's not cloud-agnostic, but for us it's the better model currently. Also not a fan of onboarding lakehouse assets to make them Unity-ready.
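
To make the "double the effort" point concrete: the Unity Catalog half of that dual model looks like the hedged sketch below (group and object names are made up), while the equivalent storage-layer RBAC + ACL assignments would still have to be maintained separately in Azure for tools that bypass Databricks.

```sql
-- Unity Catalog side only (illustrative names). These grants govern access
-- through Databricks; clients hitting ADLS directly still depend on separate
-- RBAC + ACL grants on the storage account, which is the duplication described above.
GRANT USE CATALOG ON CATALOG analytics               TO `analytics_readers`;
GRANT USE SCHEMA  ON SCHEMA  analytics.curated       TO `analytics_readers`;
GRANT SELECT      ON TABLE   analytics.curated.sales TO `analytics_readers`;
```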

  •  2 years ago +1

    It was worth waiting until the end. Ah, those buttons ... ;D

  • @jordanfox470
    @jordanfox470 2 years ago

    @Simon with the release of Unity Catalog, do you have any insight into whether they're going to update Delta Live Tables to allow us to put objects in a single catalog but in multiple different schemas/databases? At the moment you define a target, and that target is the schema/database for every object in the Delta Live Tables pipeline. Seems like it'll be necessary to update this in Unity.
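
For context on the limitation described above: every table a Delta Live Tables pipeline defines lands in the single target schema set in the pipeline configuration. A hedged DLT SQL sketch with illustrative names:

```sql
-- Both tables below are written to the one `target` schema configured on the
-- pipeline; the SQL itself has no way to route them to different schemas,
-- which is the restriction being asked about.
CREATE OR REFRESH STREAMING LIVE TABLE bronze_orders
AS SELECT * FROM cloud_files('/mnt/landing/orders', 'json');

CREATE OR REFRESH LIVE TABLE silver_orders
AS SELECT order_id, customer_id, amount
   FROM LIVE.bronze_orders
   WHERE order_id IS NOT NULL;
```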

  • @prasad8195
    @prasad8195 1 year ago

    Hello @Simon
    I require your assistance with a specific use case. Suppose I create a view using the `%sql` declaration with the `CREATE OR REPLACE VIEW` statement and grant the Databricks group 'X' usage access to the schema and catalog, along with select access to the view. Consequently, a user who is a member of the Databricks group 'X' will gain visibility of the object and the ability to retrieve data from the view.
    However, a challenge arises when I execute the `CREATE OR REPLACE VIEW` statement again. It appears that the previously granted permissions for Databricks group 'X' vanish, subsequently restricting users in that group from accessing the object.
    Could you please provide guidance/feedback on this? Your assistance is greatly appreciated.
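
If the replace really does reset the grants as described, one workaround is to keep the grants in the same script as the view definition so they are re-applied on every run. A minimal sketch with made-up object names (the group is called `X` as in the comment):

```sql
-- Redefine the view...
CREATE OR REPLACE VIEW main.reporting.v_orders AS
SELECT order_id, customer_id, amount
FROM main.sales.orders
WHERE status = 'COMPLETE';

-- ...then immediately re-apply the permissions the replace may have cleared.
GRANT USE CATALOG ON CATALOG main                    TO `X`;
GRANT USE SCHEMA  ON SCHEMA  main.reporting          TO `X`;
GRANT SELECT      ON VIEW    main.reporting.v_orders TO `X`;

-- Handy for checking what actually survived the replace.
SHOW GRANTS ON VIEW main.reporting.v_orders;
```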

  • @user-bs8ku6cg9f
    @user-bs8ku6cg9f 2 years ago

    Why do you have no videos about Palantir? They have the best software.

  • @nikhilsahu4159
    @nikhilsahu4159 1 year ago

    I can't find "Create Catalog" and "Create Metastore" on Azure Databricks, even though I have a premium Azure Databricks account. Does anyone know why?

    • @AdvancingAnalytics
      @AdvancingAnalytics  1 year ago

      Have you enabled Unity Catalog and associated the workspace with a metastore? There is some setup to do before workspaces will work with the new commands!
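
For anyone hitting the same wall: the metastore itself is created and attached to workspaces in the Databricks account console, and only after that do the catalog commands work from a notebook or SQL warehouse. A minimal sketch with a placeholder catalog name:

```sql
-- Fails until the workspace is assigned to a Unity Catalog metastore.
CREATE CATALOG IF NOT EXISTS dev_catalog
  COMMENT 'Available once the workspace is Unity Catalog enabled';

-- Quick sanity checks that the workspace is attached and the catalog is visible.
SELECT current_metastore();
SHOW CATALOGS;
```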

  • @gordonegar7717
    @gordonegar7717 2 years ago

    Thoughts on using a single storage account container and metastore across environments?

    • @aqlanable
      @aqlanable 2 years ago

      In Unity Catalog it's possible through the Databricks account portal: you can create a metastore and share it across multiple workspaces.

    • @user-bs8ku6cg9f
      @user-bs8ku6cg9f 2 years ago

      Gordon Egar, maybe you should check out Palantir Foundry.

  • @sankarazad7574
    @sankarazad7574 1 year ago

    How do we provide security between the workspaces?
    How can we keep dev, UAT, and prod workspaces separate?
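
One common pattern (a hedged sketch, not from the video; all names are illustrative) is to keep a single metastore but give each environment its own catalog, then grant each environment's groups only on their own catalog:

```sql
-- One catalog per environment inside the shared metastore.
CREATE CATALOG IF NOT EXISTS dev;
CREATE CATALOG IF NOT EXISTS uat;
CREATE CATALOG IF NOT EXISTS prod;

-- Each environment's groups only receive privileges on their own catalog,
-- so a dev user never gets USE CATALOG on prod.
GRANT USE CATALOG, CREATE SCHEMA ON CATALOG dev  TO `dev_engineers`;
GRANT USE CATALOG                ON CATALOG uat  TO `uat_testers`;
GRANT USE CATALOG                ON CATALOG prod TO `prod_readers`;

-- Tighter isolation (restricting a catalog to specific workspaces) is
-- configured through catalog workspace bindings rather than SQL.
```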