Microsoft Fabric: Import On Premise SQL Server 2022 Data into Microsoft Fabric with Data Pipelines!!

  • Published 15 Dec 2024

COMMENTS • 7

  • @zhiyingwang1234
    @zhiyingwang1234 1 month ago

    I'm confused. In the Power BI gateway there is already an Azure Data Lake Storage Gen2 connection created. That connection was not mentioned in other similar tutorials online, and it wasn't mentioned earlier in the video either; it just suddenly appeared. Is it really necessary, and how do you create it?

  • @adilmajeed8439
    @adilmajeed8439 7 months ago +1

    Thanks for sharing. When you created the connection in the Fabric service and then clicked to see its properties again, scrolling to the bottom showed a staging section in the properties dialog box. I can't see that staging section in my connection, and my Power BI Gateway has already been updated. Any pointers…

    • @Tales-from-the-Field
      @Tales-from-the-Field 7 months ago +1

      Hi @adilmajeed8439 per Bradley, "Hello sir, it's a little hard to troubleshoot this on YouTube. Could you hit me up on LinkedIn and we could DM with screenshots?" www.linkedin.com/in/sqlballs/

  • @vbhvprksh
    @vbhvprksh 3 months ago +1

    Where did you get that username and password from?

    • @Tales-from-the-Field
      @Tales-from-the-Field 3 months ago +2

      Hi @vbhvprksh per Bradley, "Hi @vbhvprksh I've got two different places where I use a login in the video, so I want to make sure I answer you correctly. The first is my domain account, created in M365, that I use to log into the Power BI Data Gateway. That account is already aligned with a Microsoft Fabric tenant, so when it authenticates it registers the Data Gateway with my tenant automatically. The second is a simple SQL Authentication account I created on my local system. I didn't show creating that account, but it is just a SQL Authentication account with db_datareader and db_datawriter permissions. I hope this helps!"
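
      Since creating that account isn't shown in the video, here is a minimal sketch of setting up such a SQL Authentication account; the login name, password, and database name below are placeholders, not the ones used in the video:

          -- Minimal sketch: create a SQL Authentication login and grant the
          -- db_datareader and db_datawriter roles Bradley describes.
          -- FabricPipelineUser, the password, and SourceDatabase are placeholders.
          USE [master];
          CREATE LOGIN FabricPipelineUser WITH PASSWORD = 'UseAStrongPasswordHere!1';
          GO
          USE [SourceDatabase];  -- swap in your source database
          CREATE USER FabricPipelineUser FOR LOGIN FabricPipelineUser;
          ALTER ROLE db_datareader ADD MEMBER FabricPipelineUser;
          ALTER ROLE db_datawriter ADD MEMBER FabricPipelineUser;
          GO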

  • @OneNI83
    @OneNI83 7 months ago +1

    With this method, can we bulk import tables (let's say we want tables that are filtered using a query, and we need to import those tables)? What would be the maximum that can be exported in one go?

    • @Tales-from-the-Field
      @Tales-from-the-Field 7 months ago +1

      Hi @OneNI83 per Bradley, "Look at this much like Azure Data Factory. Use this for your initial forklift, based on your comfort with this and other technologies. If you are a T-SQL warehouse person, land it in files and do a CTAS to load, or load it directly to a warehouse and then use T-SQL to transform your data. If you are a Spark person, land it in files and use a notebook to transform and load your data. This is a powerful tool, but there are multiple ways to ingest data after it lands. So, a super long answer for a short question: yes, you can use this to bulk load. As for a maximum, I'm not sure there really is a limit, and you could scale the parallelism to increase throughput for VLDB workloads."
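
      For the "land it in files and do a CTAS to load" route Bradley mentions, here is a minimal sketch of the pattern in a Fabric Warehouse; the lakehouse name, table names, and columns are placeholders for data the pipeline has already landed:

          -- Minimal sketch: CTAS creates and loads a warehouse table from
          -- staged data in one statement. StagingLakehouse and both table
          -- names are placeholders.
          CREATE TABLE dbo.SalesOrders AS
          SELECT
              CAST(OrderID   AS INT)            AS OrderID,
              CAST(OrderDate AS DATE)           AS OrderDate,
              CAST(TotalDue  AS DECIMAL(18, 2)) AS TotalDue
          FROM StagingLakehouse.dbo.SalesOrders_raw;  -- landed by the pipeline

      The same landed data could instead be transformed in a Spark notebook, as Bradley notes; CTAS is simply the T-SQL-first way to apply types and shape during the load.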