How to Access On-Prem Files with Fabric Pipelines (Microsoft Fabric tutorial)

  • Published 17 Jan 2025

COMMENTS • 8

  • @monn3t440 3 months ago

    Thanks so much! This was awesome! Not even AI could help me. This did it.

  • @Shaffiullahkhan 4 months ago +1

    Is it possible to access the file system path of on-premises files for use in a Python notebook with Fabric?

  • @royjasper838 2 months ago

    Hello! Thanks for the valuable video!
    Q: Is there any way to copy all the files from a directory to a lakehouse with a pipeline? (The gateway is running on this machine.)

  • @JeanFrancoisGay 1 year ago +1

    Good explanation!
    I am facing a challenge: I have hundreds of files to copy from on-prem, with different names. I have a list of these files (currently in a table, but I could have it in any format).
    Question: Do you know of a way to either:
    (1) parametrize this dataflow, passing it a "SourceFileName" and "TargetFileName" to use in the source and target? I was able to create parameters, but the pipeline does not seem to allow passing parameters to the dataflow; or
    (2) iterate over a list of file names in your dataflow to copy them all?
    My only other option is to create a static query for each file, which is not very flexible.
    Any insights on how to accomplish this in Fabric?

    • @jamesd9356 11 months ago

      Hi, just wondering if you have had any success with this; I have a similar use case.

  • @carexpertDATA 1 year ago

    Yes, that's a way to do it for one CSV file or one table. But what if you want to copy an entire on-prem SQL Server database to the bronze layer of your lakehouse? The dataflows work fine with gateways to the SQL Server, but they're limited to 50 tables and simply crash if you try to access large databases.

  • @thomasgremm6127 1 year ago

    Chapeau!
    It is persisted in OneLake, isn't it?
    Is the same possible with ADLSv2 via the on-premises data gateway?

    • @johannesjolkkonen 1 year ago

      Yup, all the data in Fabric is stored in OneLake!
      Right now, ADLSv2 isn't among the storage destinations you can choose for Dataflows. However, after you get your on-prem files into a Lakehouse, you could add a Copy Data activity to copy them from that Lakehouse to your ADLSv2.
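The Lakehouse-to-ADLS step described in the reply above could also be sketched from a Fabric notebook rather than a Copy Data activity. A minimal sketch, assuming illustrative workspace, lakehouse, container, and storage-account names (none of these come from the video), and that `notebookutils` is available as it is inside Fabric notebooks:

```python
# Sketch: copy a file that landed in a Fabric Lakehouse over to ADLS Gen2.
# All names (workspace, lakehouse, account, container, paths) are
# illustrative placeholders.

def abfss_path(container: str, host: str, path: str) -> str:
    """Build an abfss:// URI of the shape used by both OneLake and ADLS Gen2."""
    return f"abfss://{container}@{host}/{path.lstrip('/')}"

# OneLake source: <workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse/Files/...
src = abfss_path("MyWorkspace", "onelake.dfs.fabric.microsoft.com",
                 "MyLakehouse.Lakehouse/Files/incoming/data.csv")

# ADLS Gen2 sink: <container>@<account>.dfs.core.windows.net/<path>
dst = abfss_path("bronze", "mystorageaccount.dfs.core.windows.net", "data.csv")

# Inside a Fabric notebook, notebookutils.fs.cp performs the copy;
# it is commented out here because it only runs in the Fabric runtime:
# notebookutils.fs.cp(src, dst)
```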