DP-203: 10 - Introduction to Azure Data Factory

  • Published Nov 30, 2024

COMMENTS • 38

  • @keithmutamba1395 · 7 days ago

    Not only prepping a person for the cert but also instilling best practices and big-picture thinking. PHENOMENAL AND SENSATIONAL

  • @alphar85 · 4 months ago +5

    Hi Piotr, I came across your channel and I have been buying resources, especially books on Azure data engineering, but trust me, your videos are so simple, eloquent, and easy to understand. I am grateful for your effort. Keep going and keep your smile.

  • @ihorchebotarov8487 · 4 months ago

    Piotr, you're the man! Thanks for your great content. Very structured and easy to understand.

    • @TybulOnAzure · 4 months ago

      Thanks, I'm glad you enjoyed it.

  • @ishantsagar3979 · 5 months ago +1

    This is a really good and helpful video. I couldn't thank you enough.
    I am a newbie trying to get into data engineering. I will try my best to make the most of this playlist.
    Thanks, and please keep up the good work. :)

  • @prabhuraghupathi9131 · 8 months ago

    Good overview of ADF! Thank you!

  • @omarouaissi_sekouti4579 · 25 days ago

    Hey Piotr,
    Thanks for the great explanation,
    In the video, you mentioned that we use ADF for data ingestion and orchestration of activities. Could you clarify why we don’t use it for data transformation as well, given that ADF includes several components designed for transformation?

    • @TybulOnAzure · 25 days ago

      We can use it for transformations (it is covered later), but there are other tools that simply do it better.

    • @omarouaissi_sekouti4579 · 24 days ago

      ​@@TybulOnAzure Thanks for the detailed answer!
      I have another question: you previously mentioned that we use a staging (landing) area in ADLS Gen2 to create a copy of the data from the source. Could you clarify what an ODS is and why it differs from the staging area?

  • @dmitryzvorikin · 4 months ago

    Thank you very much. I wasted quite some time because I chose Microsoft Entra ID authentication and then went digging through parameters because I could not switch the created server back to SQL auth. It seems to be possible only by destroying and recreating it. Beware!

  • @rafaelvieira2003 · 6 months ago

    The hierarchy is different from the one you brilliantly taught in the last video, "Data Lake Structure": first domain, then source system. Thank you!

  • @PrajeshTX · 1 month ago

    Hi Piotr,
    I understand creating a dataset is a good practice, but I have seen production-level pipelines with a copy activity connected directly to on-prem SQL databases. When is a dataset a must?
    Thanks,
    P

    • @TybulOnAzure · 1 month ago

      Always. How was that pipeline configured if it wasn't using any dataset?

    • @PrajeshTX · 1 month ago

      @@TybulOnAzure Sorry, my bad - I was looking for a dataset as an activity (similar to the copy activity) that you place somewhere on the pipeline, but now I understand that a dataset is more like a configuration.

  • @soumikmishra7288 · 5 months ago

    So, are Synapse Studio in Azure Synapse Analytics and ADF the same thing? They look very similar.

    • @TybulOnAzure · 5 months ago

      Synapse pipelines are pretty much the same thing as ADF.

  • @pavara.maddumage · 1 year ago +1

    Hi Tybul, I learned that the Microsoft role-based and specialty exams have been open book since the August update. Any tips on how to use Microsoft Learn efficiently during the exam? Thank you!

    • @TybulOnAzure · 1 year ago

      I would advise using MS Learn during exam preparation so that during the exam you will already be familiar with it and will know what to look for, where, and how.
      The great thing about this change is that you no longer have to memorize everything - it is enough to know that a feature exists; you can then look up the details in MS Learn.

  • @smbs47 · 10 months ago

    For ZoomIt, you might be able to click the right mouse button twice instead of using the Escape key. Ctrl+1 might also work.

    • @TybulOnAzure · 10 months ago

      Thanks! I'll definitely try that.

  • @zouhair8161 · 11 months ago

    I have a question: did you load the AdventureWorks DB before the episode, or is it a DB provided by Azure as a demo?

    • @TybulOnAzure · 11 months ago +1

      It is a sample database available in Azure - that's why I used the "Sample" option when provisioning the Azure SQL DB.

    • @zouhair8161 · 11 months ago

      @@TybulOnAzure thank u

  • @rkneti · 1 year ago

    Great tutorial Piotr! Do you have any plans of doing videos on Microsoft Fabric?

    • @TybulOnAzure · 1 year ago +5

      Thanks! Yes, at some point I'll cover Fabric, but I don't know yet when that will be. Maybe I'll make a separate playlist about Fabric as a kind of appendix to DP-203. Time will tell.

  • @sanjaybondili · 1 month ago

    What a coincidence - the practical part in the video shows the same day and month, Oct 20th, just a different year (recorded in 2023, watching in 2024). 😀

    • @TybulOnAzure · 1 month ago

      Yup, it's already been a year since I recorded it. Time flies.

  • @mdshohidurrahman · 7 months ago

    U r the best.

  • @vaibhav8257 · 4 months ago

    Hi Piotr,
    I wanted to know what exactly a dataset is and what its role is.
    I watched the video carefully and it is still not clear to me.
    Also, your content is extremely good - thanks for it!

    • @TybulOnAzure · 4 months ago +1

      A dataset represents the layout of your data and its properties. Let's assume that you want to load customers from:
      a) a SQL database,
      b) a CSV file.
      In both cases you would have a dataset that represents your customers, so it would have the schema (columns and data types). However, depending on the source of your data (DB or file), you would set additional properties in the dataset:
      a) for a SQL DB - the name of the source table,
      b) for a CSV file - the path to the file, column delimiter, row delimiter, encoding, quote character, etc.

    • @vaibhav8257 · 4 months ago

      @@TybulOnAzure got it! Thank you 😊
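
    [Editor's note] The two cases described above map directly onto ADF's JSON dataset definitions. A minimal sketch (the linked-service names `AdlsGen2Ls` and `AzureSqlLs` are hypothetical; `SalesLT.Customer` is a table from the AdventureWorksLT sample database mentioned in the video):

    ```json
    {
      "name": "CustomersCsv",
      "properties": {
        "type": "DelimitedText",
        "linkedServiceName": { "referenceName": "AdlsGen2Ls", "type": "LinkedServiceReference" },
        "typeProperties": {
          "location": { "type": "AzureBlobFSLocation", "fileSystem": "landing", "fileName": "customers.csv" },
          "columnDelimiter": ",",
          "rowDelimiter": "\n",
          "encodingName": "UTF-8",
          "quoteChar": "\"",
          "firstRowAsHeader": true
        }
      }
    }
    ```

    ```json
    {
      "name": "CustomersSql",
      "properties": {
        "type": "AzureSqlTable",
        "linkedServiceName": { "referenceName": "AzureSqlLs", "type": "LinkedServiceReference" },
        "typeProperties": { "schema": "SalesLT", "table": "Customer" }
      }
    }
    ```

    A copy activity then references datasets like these as its source and sink; the pipeline itself never hard-codes the file layout or table name.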

  • @mgdesire9255 · 3 months ago

    Same pinch 42:37😂

  • @jonasr1504 · 6 months ago

    Top man :)

  • @zouhair8161 · 11 months ago

    ♥♥♥♥♥

  • @LATAMDataEngineer · 8 months ago

    thanks.