Loading data from ADLS behind firewalls to Fabric Lakehouse

  • Published 30 Jul 2023
  • In this Microsoft Fabric Security video, you will learn how to connect to data sources that sit behind firewalls, VNets, and private endpoints, such as Azure Data Lake Storage Gen2 (ADLS Gen2), and load that data into Microsoft Fabric so you can start evaluating Fabric with your corporate data.
    -----------------------
    Follow me on Linked-In:
    / vengat83

COMMENTS • 12

  • @jegaveerpandian
    @jegaveerpandian 4 months ago

    Amazing content, Vengatesh! This video saved me a lot of time and effort, thank you so much. I am curious whether we can create a shortcut within the lakehouse to an ADLS Gen2 storage container with the same approach? It's giving me an 'invalid credentials' error.

  • @davidma7481
    @davidma7481 11 months ago +1

    Brilliant idea!!!

  • @batuhantuter
    @batuhantuter 11 months ago +1

    Great video Vengatesh!

  • @user-yy1ng6wp3y
    @user-yy1ng6wp3y 11 months ago +3

    Awesome video!!
    1. Could you please explain which AD user is used for authenticating to Fabric: is it a managed identity or a personal user?
    2. How is Synapse able to access Fabric over the public internet when Synapse uses a managed private network?
    3. Do Fabric and Synapse have to be in the same tenant?
    Thanks :)

    • @datagravity3578
      @datagravity3578 11 months ago +1

      Thanks for watching!!!
      1. My Synapse Spark notebook uses my personal (logged-in) AAD user credential to execute, so the authentication to Fabric also happens seamlessly with that AAD token. If you were to use a Managed Identity instead, it is very much possible to execute your notebook with the MI; you would have to grant that MI the appropriate permissions in your Fabric workspace. Note: for MI to work, Fabric and Azure Synapse must be in the same tenant.
      2. A Managed VNet provides network isolation and secure outbound connectivity for Spark workloads (and Azure integration runtimes). For outbound connectivity, Spark can connect to any public endpoint from a Managed VNet; all outbound calls are allowed. Only when we enable "data exfiltration protection" for Synapse do we block outbound calls that are not to approved tenants and do not go through a managed private endpoint. In my case, I have enabled the Managed VNet but I have not enabled DEP. Watch my video to learn more about Managed VNets: ua-cam.com/video/4PJOuhFosLY/v-deo.htmlsi=ZGQjdYgdEOCbWAi9
      3. For seamless authentication, yes, but it is not mandatory. You can also call the OneLake APIs with appropriate access tokens inside your notebook: blog.fabric.microsoft.com/en-us/blog/connecting-to-onelake?ft=All
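The OneLake access the reply describes relies on OneLake exposing an ADLS Gen2-compatible endpoint, so a notebook can address a Lakehouse table with an `abfss://` URI. A minimal sketch of building that path follows; the workspace, lakehouse, and table names (and the commented Spark write) are illustrative assumptions, not from the video:

```python
# Sketch: addressing a Fabric Lakehouse table via its OneLake ABFS path.
# OneLake exposes an ADLS Gen2-compatible DFS endpoint at
# onelake.dfs.fabric.microsoft.com; the workspace name is the filesystem.

def onelake_table_path(workspace: str, lakehouse: str, table: str) -> str:
    """Build the abfss:// URI for a Delta table in a Fabric Lakehouse."""
    return (
        f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse}.Lakehouse/Tables/{table}"
    )

# Inside a Synapse Spark notebook, the logged-in AAD token (or a Managed
# Identity granted permissions on the Fabric workspace) authenticates the
# call transparently, e.g. (hypothetical names):
# df.write.format("delta").mode("overwrite").save(
#     onelake_table_path("MyWorkspace", "MyLakehouse", "sales"))
```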

    • @user-yy1ng6wp3y
      @user-yy1ng6wp3y 11 months ago

      Thank you @datagravity3578! :) I really appreciate your prompt response and excellent explanation. This deserves a sub; hope you keep making more content.

  • @vt1454
    @vt1454 8 months ago

    Thanks for this video. I was looking for this solution (or workaround). Can Azure Data Factory (classic) write directly to a Fabric Lakehouse using AAD authentication? In that case we could read the data with ADF and write it to the Lakehouse from the same pipeline.

  • @LegendaryBullStrike
    @LegendaryBullStrike 11 months ago +1

    This is great!
    But how do you connect to an on-prem SQL Server DB/DW?
    I tried to use the gateway last week, and it wasn't working. Is there any way to do it?
    Thanks!

    • @datagravity3578
      @datagravity3578 11 months ago +1

      Thanks for watching!
      The method would be the same (except that you don't have to install the drivers; they come with the gateway). You can refer to the SQL Server data source for our on-prem data gateways here: learn.microsoft.com/en-us/power-bi/connect-data/service-gateway-enterprise-manage-sql

    • @LegendaryBullStrike
      @LegendaryBullStrike 11 months ago

      @datagravity3578 Thanks! I'll try that when I get a chance :)