16. CETAS with Synapse SQL in Azure Synapse Analytics

  • Published 7 Nov 2024

COMMENTS • 49

  • @ajinkyaghadge2405
    @ajinkyaghadge2405 2 years ago +8

    You have mastered the art of explaining complex concepts in simple words, keep it up!!

  • @himanshutrivedi4956
    @himanshutrivedi4956 3 years ago +1

    Wonderful explanation. Maheer always rocks. Thank you, Rockstar!

  • @vishwdipsakpal7080
    @vishwdipsakpal7080 1 year ago

    Thank you so much for all the videos. You are doing great work.

  • @pavankumar-nm9yu
    @pavankumar-nm9yu 3 years ago +1

    Hi Maheer, Thank you so much for the clear explanation.

  • @varunkakkar2654
    @varunkakkar2654 1 year ago

    @Maheer. A quick question - Why do we need an external table when we can simply pull an external file into the Data Lake and query on top of it? Will it not save the effort of creating the external data source and file format?
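    For context on the tradeoff raised above: OPENROWSET queries a file ad hoc with no setup, while an external table bundles the data source, file format, and schema into a reusable object you can secure and query like a normal table. A minimal sketch for a serverless SQL pool; the storage URL, object names, and columns are hypothetical.

```sql
-- Ad-hoc: query the file directly, nothing to create first.
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://mystorageacct.dfs.core.windows.net/data/sales/*.parquet',
    FORMAT = 'PARQUET'
) AS rows;

-- Reusable: data source + file format + external table. The setup effort
-- pays off when many queries (or many users) hit the same files, since
-- permissions and schema live on the table, not in every query.
CREATE EXTERNAL DATA SOURCE MyDataSource
WITH (LOCATION = 'https://mystorageacct.dfs.core.windows.net/data');

CREATE EXTERNAL FILE FORMAT ParquetFormat
WITH (FORMAT_TYPE = PARQUET);

CREATE EXTERNAL TABLE dbo.Sales (
    SaleId INT,
    Amount DECIMAL(10, 2)
)
WITH (
    LOCATION = 'sales/',
    DATA_SOURCE = MyDataSource,
    FILE_FORMAT = ParquetFormat
);

SELECT TOP 10 * FROM dbo.Sales;
```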

  • @sajjadahmad241
    @sajjadahmad241 3 years ago +2

    Awesome, I have been following this course and am on video 16.
    One thing: whenever something goes wrong while running a script, you close and reopen it.
    Why that happens: while explaining the SQL queries you select part of the query to try it, then forget to reselect all of the query text that needs to be executed next. Instead of closing the script, just select all of it, or whichever part you want to execute, and that would be fine :)

  • @lgvjlal-vp7kd
    @lgvjlal-vp7kd 1 year ago

    In ASA, does an external table with a dedicated SQL pool become a physical table, with data duplicated from the Gen2 delta table / parquet files? But in the case of a serverless SQL pool, is it only logical, so the data doesn't get duplicated?

  • @srishtitalreja5832
    @srishtitalreja5832 1 year ago

    Which path is given in BULK inside the SELECT query while creating an external table?
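    Regarding the BULK path asked about above: in a serverless pool, BULK appears in OPENROWSET (CETAS itself uses LOCATION) and takes either a full storage URL or a path relative to an external data source. A sketch with hypothetical names:

```sql
-- Full URL form, no external data source needed:
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://mystorageacct.dfs.core.windows.net/data/sales/*.parquet',
    FORMAT = 'PARQUET'
) AS rows;

-- Relative form, resolved against an existing external data source:
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'sales/*.parquet',
    DATA_SOURCE = 'MyDataSource',
    FORMAT = 'PARQUET'
) AS rows;
```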

  • @ADODenHaag-QuintenNielo-md8li

    Again super, and I also like how you point back to your previous video.

  • @pkpatnaik81
    @pkpatnaik81 2 years ago +1

    Now I have become a fan of yours!

  • @utsavchanda4190
    @utsavchanda4190 3 years ago +1

    Very precise explanation

  • @a2zhi976
    @a2zhi976 1 year ago

    How can we handle delta / incremental data in a folder that is coming from IoT or daily files? In a real scenario, how do we call these scripts?

  • @yveshermann
    @yveshermann 2 years ago +1

    I have been watching the videos from 1 to 15 so far... @wafastudies, extra thanks for sharing the knowledge!

  • @rameshpeddine7123
    @rameshpeddine7123 1 year ago

    May I know why we need these external tables when we have a dedicated SQL pool?
    I know the cost is low and no data is stored, but other than that, are there any other uses?

  • @namnguyenvan4569
    @namnguyenvan4569 2 years ago

    Many thanks, sir, but what should I do if I want to set the filename myself (I just don't want the auto-generated filename the code produces, e.g. AEC2494D-E0F4-4EF0-999D.parquet)?

  • @AnilKumar-ut2bi
    @AnilKumar-ut2bi 3 years ago +1

    Nice explanation, bro, thank you!

  • @digitaltechconnect1318
    @digitaltechconnect1318 1 year ago

    Please share the query content, or put whatever you have shown in your video into the description section.

  • @arjunpalitphotography
    @arjunpalitphotography 2 years ago

    Does it create a parquet file each time, or is it defined once at the time of CETAS? Is it possible to define a naming convention for the parquet file?

  • @robinthomas7782
    @robinthomas7782 2 years ago

    When we create a CETAS using AS SELECT from a Synapse table, why does it generate many files? And the file extension comes out as .txt.deflated. Can't we generate one file with a chosen name?
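    On the .txt.deflated question above: that extension comes from a DELIMITEDTEXT file format with compression, and the many files come from CETAS writing in parallel. Individual output file names cannot be chosen, only the output folder; switching the file format to PARQUET at least yields .parquet output. A sketch, assuming the data source and source table already exist; the folder path is hypothetical:

```sql
-- A parquet file format avoids the .txt.deflated extension.
CREATE EXTERNAL FILE FORMAT ParquetFormat
WITH (FORMAT_TYPE = PARQUET);

-- CETAS writes in parallel, so expect several files under LOCATION;
-- only the folder is controllable, not the file names.
CREATE EXTERNAL TABLE dbo.SalesExport
WITH (
    LOCATION = 'exports/sales/',
    DATA_SOURCE = MyDataSource,
    FILE_FORMAT = ParquetFormat
)
AS SELECT * FROM dbo.Sales;
```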

  • @KaustubhTendulkarpro
    @KaustubhTendulkarpro 3 years ago

    Is it possible to create a table from an external table / external data source / external file format?
    Also, can we pull data as an external data source from an Azure File Share that is available in ADLS Gen2?

  • @sandinnut956
    @sandinnut956 8 months ago

    Can the same be done in dedicated sql pool?

  • @yveshermann
    @yveshermann 2 years ago +1

    great content

  • @TarekDelft
    @TarekDelft 1 year ago

    @WafaStudies Why don't you share the scripts? Please use Google Drive or another option to share the scripts.

  • @ananthkrishnabattepati
    @ananthkrishnabattepati 3 years ago +1

    Hi Maheer, Thank you so much for your wonderful explanation.
    But I have one doubt. Does CETAS work with a database scoped credential created using SAS? I created one and tried to load data into ADLS Gen2, and it didn't work. But I tried with Managed Identity and it worked.

    • @WafaStudies
      @WafaStudies  3 years ago +3

      A SAS key will have an expiry set on it, and SAS keys can be created at any level (folder, file, or container), so it is complicated to handle SAS keys. Using managed identity is the best way to go.

    • @ananthkrishnabattepati
      @ananthkrishnabattepati 3 years ago

      @@WafaStudies Thank You

    • @dataengineerazure2983
      @dataengineerazure2983 3 years ago

      @@WafaStudies Hello, thank you very much.
      What I want to do: pass data from one data flow to another using temporary tables in Azure Synapse, but a temporary table (i.e. #TemporaryTable) is not visible.
      I use a Synapse pipeline with a SQL pool, not serverless, and I use data flows.
      Are there no other alternatives to temporary tables?
      I want to use temporary tables to split data flows.
      Thank you

  • @rk-ej9ep
    @rk-ej9ep 2 years ago +1

    Nice sir...

  • @sachinv9923
    @sachinv9923 1 year ago

    Thanks a lot!

  • @deepak0417
    @deepak0417 3 years ago +1

    Brilliant

  • @alphavoyager14
    @alphavoyager14 1 month ago

    Will it work for the CSV file format?

    • @alphavoyager14
      @alphavoyager14 17 days ago

      I am able to extract, but the files are saved in .txt format and divided into multiple chunks. Is there any approach to save them in .csv format? Please let me know.
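    On the .txt output mentioned above: the extension and layout follow the external file format, not the query. A DELIMITEDTEXT format writes comma-separated text, but CETAS still splits the output into multiple files and names them itself; producing a single file with a .csv name would need a post-processing step outside the query. A sketch, assuming an existing data source and source table; names are hypothetical:

```sql
-- Uncompressed delimited text keeps the output readable as CSV data,
-- even though the generated file names are not *.csv.
CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',', STRING_DELIMITER = '"')
);

CREATE EXTERNAL TABLE dbo.SalesCsv
WITH (
    LOCATION = 'exports/sales_csv/',
    DATA_SOURCE = MyDataSource,
    FILE_FORMAT = CsvFormat
)
AS SELECT * FROM dbo.Sales;
```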

  • @alfredsfutterkiste7534
    @alfredsfutterkiste7534 2 years ago +1

    You really have to realize that when you select SQL in the editor and press play, it only runs the selected part. This has tripped you up so many times, haha.