9. how to create mount point in azure databricks | dbutils.fs.mount in databricks | databricks

  • Published 6 Nov 2024

COMMENTS • 28

  • @ssunitech6890
    @ssunitech6890  1 year ago +4

    Using an account key:
    dbutils.fs.mount(
        source='wasbs://<container>@<storage-account>.blob.core.windows.net/',
        mount_point='/mnt/<mount-name>',
        extra_configs={'fs.azure.account.key.<storage-account>.blob.core.windows.net': '<account-key>'})

    Using a SAS token:
    dbutils.fs.mount(
        source='wasbs://<container>@<storage-account>.blob.core.windows.net/',
        mount_point='/mnt/<mount-name>',
        extra_configs={'fs.azure.sas.<container>.<storage-account>.blob.core.windows.net': '<sas-token>'})
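A minimal sketch of the account-key variant with hypothetical placeholder values filled in — `mystorageacct` and `mycontainer` are assumptions, not names from the video. `dbutils` exists only inside a Databricks notebook, so the mount call itself is shown commented out; the string construction around it runs anywhere:

```python
# Hypothetical names -- substitute your own storage account and container.
storage_account = "mystorageacct"
container = "mycontainer"

# The three pieces dbutils.fs.mount expects:
source = f"wasbs://{container}@{storage_account}.blob.core.windows.net/"
mount_point = f"/mnt/{container}"
account_key_conf = f"fs.azure.account.key.{storage_account}.blob.core.windows.net"
sas_conf = f"fs.azure.sas.{container}.{storage_account}.blob.core.windows.net"

# Inside a Databricks notebook:
# dbutils.fs.mount(source=source,
#                  mount_point=mount_point,
#                  extra_configs={account_key_conf: "<account-key>"})
```

After mounting, files under the container appear beneath `/mnt/mycontainer` and can be listed with `dbutils.fs.ls(mount_point)`.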

    • @satishkumar-bo9ue
      @satishkumar-bo9ue 1 year ago

      Can I save this account key and SAS token? Can we use the same syntax in real-time work?

  • @goluSingh-su1xs
    @goluSingh-su1xs 1 year ago +2

    It's a very nice video; I'm learning ADB from you. Please upload more videos.

  • @amritasingh1769
    @amritasingh1769 1 year ago +1

    Crystal clear, really very helpful

    • @ssunitech6890
      @ssunitech6890  1 year ago

      Thanks for your appreciation,
      It always motivates me.

  • @satishkumar-bo9ue
    @satishkumar-bo9ue 1 year ago +2

    Is it secure to give the access key directly when creating a mount point? Instead of giving the access key directly, is it possible to use a secret from Key Vault?

    • @ssunitech6890
      @ssunitech6890  1 year ago

      Yes, we can; watch the video below:
      ua-cam.com/video/BF_UNfRJrD4/v-deo.html

  • @sravankumar1767
    @sravankumar1767 1 year ago +1

    In our project we are using the abfss path rather than wasbs. Most projects I have seen use abfss. What is the difference between abfss and wasbs? Could you please explain 🙏

    • @ssunitech6890
      @ssunitech6890  1 year ago +1

      I don't have an answer for this question right now, but let me check and confirm.
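For readers with the same question: to my understanding, `wasbs` is the older WASB driver that talks to the Blob Storage endpoint, while `abfss` is the newer ABFS driver that talks to the ADLS Gen2 (`dfs`) endpoint and is the recommended choice for Gen2 accounts. A sketch of the two URI shapes, with hypothetical names:

```python
# Hypothetical storage account and container names.
storage_account = "mystorageacct"
container = "mycontainer"

# wasbs: legacy WASB driver, Blob Storage endpoint.
wasbs_uri = f"wasbs://{container}@{storage_account}.blob.core.windows.net/data"
# abfss: ABFS driver, ADLS Gen2 (dfs) endpoint -- recommended for Gen2 accounts.
abfss_uri = f"abfss://{container}@{storage_account}.dfs.core.windows.net/data"
```

Note that only the scheme and the endpoint suffix differ; the container and account parts of the URI are the same in both forms.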

  • @MOOLA7893
    @MOOLA7893 6 months ago +1

    Can we create a pipeline in ADF to copy from input to output instead of the mount process? Thank you.

  • @parulsingh3534
    @parulsingh3534 1 year ago +1

    Hi, is there any other way to access files from ADLS inside Databricks without mounting the storage account? Can you please share your inputs on that? Thank you!

    • @ssunitech6890
      @ssunitech6890  1 year ago

      I haven't seen any option other than mount points.
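As a side note, ADLS Gen2 can also be read directly, without a mount, by setting the storage credential in the Spark session configuration. A hedged sketch — the account name and secret scope below are illustrative, and the `spark`/`dbutils` calls are shown commented out because they exist only inside a Databricks notebook:

```python
# Hypothetical storage account; substitute your own.
storage_account = "mystorageacct"
conf_key = f"fs.azure.account.key.{storage_account}.dfs.core.windows.net"

# Inside a Databricks notebook:
# spark.conf.set(conf_key, dbutils.secrets.get("my-scope", "storage-key"))
# df = spark.read.csv(
#     f"abfss://mycontainer@{storage_account}.dfs.core.windows.net/data.csv")
```

With this approach the abfss path is used directly in read/write calls, so no `/mnt/...` path is needed.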

  • @sravankumar1767
    @sravankumar1767 1 year ago +1

    Nice explanation 👌 👍 👏

  • @indrabahadursingh5950
    @indrabahadursingh5950 1 year ago +1

    Superb video😍

  • @nagamanickam6604
    @nagamanickam6604 7 months ago +1

    Thank you

    • @ssunitech6890
      @ssunitech6890  7 months ago

      Thanks
      Please share it with others.
      Keep learning and growing 💗

  • @suman3316
    @suman3316 1 year ago +1

    Hi, what is the difference between an account key and a SAS token?

    • @ssunitech6890
      @ssunitech6890  1 year ago +1

      Account key: if you provide access using it, the user gets complete access to the account, e.g., view/modify.
      SAS: useful when you want to share access to resources for a specific period of time and with only specific permissions, such as view, create, modify, or all of them.
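This difference is visible in the token itself: a SAS token carries its own permission and expiry fields, whereas an account key is a blanket credential. A small sketch parsing an illustrative (entirely fake) SAS token with the standard library:

```python
from urllib.parse import parse_qs

# Illustrative SAS token -- the dates and signature are made up.
sas = "sv=2022-11-02&ss=b&srt=co&sp=rl&se=2024-12-31T23:59:59Z&sig=FAKE"
fields = parse_qs(sas)

permissions = fields["sp"][0]  # "rl" -> read + list only, no write/delete
expiry = fields["se"][0]       # the token stops working after this timestamp
```

Because the permissions (`sp`) and expiry (`se`) are baked into the token, a leaked SAS token is far more limited in blast radius than a leaked account key.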

  • @sammail96
    @sammail96 7 months ago +1

    Hey, it is a nice video. Thank you!

    • @ssunitech6890
      @ssunitech6890  7 months ago

      Thanks
      Please share it with others.
      Keep learning and growing 💗

  • @venkatchinta3105
    @venkatchinta3105 1 year ago +2

    Hi bro,
    Today I attended a TCS interview. They asked me about real-time scenarios in ADF:
    1. How do you create a reusable pipeline for collecting only the required columns from n files in ADLS into SQL? For example, I have 10 files, each with 20 columns, but I want only 15 of them. I need to do this repeatedly, so the pipeline should be reusable.
    2. In ADLS I have different CSV files, e.g., for the India, AUS, ENG, and SA cricket teams. I want only the files related to the Indian cricket team. How do I create a pipeline for this?
    Please create a real-time scenario video for this. Thanks in advance.

    • @satishkumar-bo9ue
      @satishkumar-bo9ue 1 year ago +2

      2.) In ADLS you have different CSV files and want only the India-team files?
      Answer: use a Get Metadata activity to fetch all the file names, then a Filter activity with a condition like "name starts with 'india'", then a Copy activity. You end up copying only the Indian team's files.
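The Get Metadata → Filter logic described above can be sketched outside ADF as a simple name filter — the file names here are illustrative, and the `india*` pattern stands in for the Filter activity's `startswith` condition:

```python
from fnmatch import fnmatch

# Stand-in for Get Metadata output: the child item names of the ADLS folder.
all_files = ["india_batting.csv", "aus_bowling.csv",
             "india_fielding.csv", "eng_stats.csv"]

# Stand-in for the Filter activity: keep names matching "india*".
india_files = [f for f in all_files if fnmatch(f, "india*")]
```

In the real pipeline, the filtered list is then fed (e.g., via a ForEach) into the Copy activity so only those files land in the destination.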

    • @satishkumar-bo9ue
      @satishkumar-bo9ue 1 year ago

      2nd method: first create CSV datasets and a linked service on ADLS Gen2, then build a pipeline with a Copy activity. In the source, give a wildcard file path (*.csv) and, in the dataset, select only the India folders. Finally, create a dataset and linked service for the destination, and in the sink give the sink path a name. Debug, and the output contains only the India files.

    • @ssunitech6890
      @ssunitech6890  1 year ago

      Can you please explain more about the 2nd question?