13. Use Python to Manage OneLake in Fabric

  • Published Feb 14, 2024
  • In this video, I discuss using Python to manage OneLake in Microsoft Fabric; a minimal sketch of the approach appears after the links below.
    documentation link:
    learn.microsoft.com/en-us/fab...
    Link for PySpark Playlist:
    • 1. What is PySpark?
    Link to Microsoft Fabric Playlist:
    • Microsoft Fabric Playlist
    Link for PySpark Real Time Scenarios Playlist:
    • 1. Remove double quote...
    Link for Azure Synapse Analytics Playlist:
    • 1. Introduction to Azu...
    Link to Azure Synapse Real Time Scenarios Playlist:
    • Azure Synapse Analytic...
    Link for Azure Databricks Playlist:
    • 1. Introduction to Az...
    Link for Azure Functions Playlist:
    • 1. Introduction to Azu...
    Link for Azure Basics Playlist:
    • 1. What is Azure and C...
    Link for Azure Data Factory Playlist:
    • 1. Introduction to Azu...
    Link for Azure Data Factory Real Time Scenarios Playlist:
    • 1. Handle Error Rows i...
    Link for Azure Logic Apps Playlist:
    • 1. Introduction to Azu...
    #azure #microsoft #wafastudies #fabric #msfabric #microsoftfabric #azuredatafactory #azuresynapse #azuresynapseanalytics #kusto #adx #azuredataexplorer #powerbi #pbi
  • Science & Technology
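  • A minimal sketch of the approach covered in the video, assuming only that the azure-identity and azure-storage-file-datalake packages are installed; WORKSPACE_NAME below is a placeholder, not a value from the video:

      from azure.identity import DefaultAzureCredential
      from azure.storage.filedatalake import DataLakeServiceClient

      ACCOUNT_URL = "https://onelake.dfs.fabric.microsoft.com"
      WORKSPACE_NAME = "<your-workspace>"  # placeholder

      # DefaultAzureCredential tries environment variables, a managed
      # identity, the Azure CLI login, and other sources in turn.
      credential = DefaultAzureCredential()
      service_client = DataLakeServiceClient(ACCOUNT_URL, credential=credential)

      # OneLake exposes each Fabric workspace as a file system.
      file_system_client = service_client.get_file_system_client(WORKSPACE_NAME)

      # List every item (lakehouses, folders, files) in the workspace.
      for path in file_system_client.get_paths():
          print(path.name)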

COMMENTS • 8

  • @deepjyotimitra1340 • 4 months ago

    What an explanation 👏👌 Loved it.
    For a live project, can we use an SPN for auth?

  • @cheeliAmarnath • 1 month ago

    I am not able to install packages.
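
    The video's example relies on the Azure SDK packages shown in the replies below; assuming those are the ones failing, they install with pip (upgrading pip first often resolves install errors):

      pip install --upgrade pip
      pip install azure-identity azure-storage-file-datalake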

  • @zubair489 • 4 months ago +1

    How can I use a service principal instead of logging in interactively?

    • @BharathKumar-ch4tp • 3 months ago

      import os

      from azure.identity import ClientSecretCredential
      from azure.storage.filedatalake import (
          DataLakeServiceClient,
          DataLakeDirectoryClient,
      )

      # Set your account, workspace, and item path here
      ACCOUNT_NAME = "onelake"
      WORKSPACE_NAME = ""
      DATA_PATH = ""
      LOCAL_FILE_PATH = r""

      def upload_file_to_directory(directory_client: DataLakeDirectoryClient, local_file_path: str):
          # Upload the local file into the given directory, overwriting any existing copy
          file_name = os.path.basename(local_file_path)
          file_client = directory_client.get_file_client(file_name)
          with open(local_file_path, mode="rb") as data:
              file_client.upload_data(data, overwrite=True)

      def main():
          # Create a service client using service principal credentials
          client_id = ""
          tenant_id = ""
          client_secret = ""
          account_url = f"https://{ACCOUNT_NAME}.dfs.fabric.microsoft.com"
          credential = ClientSecretCredential(tenant_id, client_id, client_secret)
          service_client = DataLakeServiceClient(account_url, credential=credential)

          # Create a file system client for the workspace
          file_system_client = service_client.get_file_system_client(WORKSPACE_NAME)

          # Get the directory client for the specified data path
          directory_client = file_system_client.get_directory_client(DATA_PATH)

          # Upload the local file to the specified directory in the Data Lake storage
          upload_file_to_directory(directory_client, LOCAL_FILE_PATH)

      if __name__ == "__main__":
          main()

      If you already know how to create a service principal, use this code.

    • @BharathKumar-ch4tp • 3 months ago

      This is for loading data from a local computer into a Lakehouse using Python.

  • @rogerbheeshmaa3606 • 4 months ago

    How can we fetch data from OneLake? Please post a video on that.
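
    A hedged sketch of reading a file back from OneLake, using the same packages as the upload example above; WORKSPACE_NAME and FILE_PATH are hypothetical placeholders:

      from azure.identity import DefaultAzureCredential
      from azure.storage.filedatalake import DataLakeServiceClient

      ACCOUNT_URL = "https://onelake.dfs.fabric.microsoft.com"
      WORKSPACE_NAME = "<your-workspace>"                   # placeholder
      FILE_PATH = "<lakehouse>.Lakehouse/Files/sample.csv"  # hypothetical path

      credential = DefaultAzureCredential()
      service_client = DataLakeServiceClient(ACCOUNT_URL, credential=credential)
      file_system_client = service_client.get_file_system_client(WORKSPACE_NAME)

      # Stream the file's bytes into memory and decode them as text.
      file_client = file_system_client.get_file_client(FILE_PATH)
      data = file_client.download_file().readall()
      print(data.decode("utf-8"))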

  • @kartiktak6270 • 4 months ago

    When will you add more videos?