You have mastered the art of explaining complex concepts in simple words, keep it up!!
Thank you ☺️
Wonderful explanation..Maheer always rocks..Thank you Rockstar..
Welcome 🤗
Thank You so much for all Videos. You are doing great work.
Hi Maheer, Thank you so much for the clear explanation.
Thank you 🙂
@Maheer. A quick question - why do we need an external table when we can simply land an external file in the Data Lake and query on top of it? Wouldn't that save the effort of creating the external data source and file format?
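For anyone weighing the two approaches, here is a rough T-SQL sketch (all names like `mydatalake`, `my_ds`, and `dbo.SalesExt` are made up for illustration): an ad-hoc OPENROWSET query needs no setup at all, while an external table needs the data source and file format once, but can then be queried like a normal table, reused, and secured with regular table permissions.

```sql
-- Ad-hoc: query the files directly, no objects needed (serverless SQL pool)
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://mydatalake.dfs.core.windows.net/container/sales/*.parquet',
    FORMAT = 'PARQUET'
) AS rows;

-- Reusable: one-time setup, then the table behaves like any other
CREATE EXTERNAL DATA SOURCE my_ds
WITH (LOCATION = 'https://mydatalake.dfs.core.windows.net/container');

CREATE EXTERNAL FILE FORMAT parquet_ff
WITH (FORMAT_TYPE = PARQUET);

CREATE EXTERNAL TABLE dbo.SalesExt (
    Id     INT,
    Amount DECIMAL(10, 2)
)
WITH (
    LOCATION    = '/sales/',
    DATA_SOURCE = my_ds,
    FILE_FORMAT = parquet_ff
);

SELECT TOP 10 * FROM dbo.SalesExt;
```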
Awesome, I have been following this course and am on video 16. One thing: whenever something goes wrong while running the script, you always close and reopen it. That happens because, while explaining the SQL, you select part of the query and then forget to re-select the full text that needs to run next. Instead of closing the script, just select all of it (or the part you want to execute) and it will run fine :)
Thank you. Sure 😃
WafaStudies awesome work btw.
@sajjadahmad241 thank you 😊
In ASA, does an external table in a dedicated SQL pool become a physical table, with data duplicated from the Gen2 delta table / parquet files? And in the case of a serverless SQL pool, is it only logical, so the data doesn't get duplicated?
Which path is given in BULK inside the SELECT query while creating the external table?
Super again, and I also like how you point back to your previous videos.
Now I have become a fan of yours!
Thank you 😊
Very precise explanation
Thank you 😊
How can we handle delta / incremental data in a folder, e.g. coming from IoT or daily files? In a real scenario, how do we call these scripts?
I have been watching the videos from 1 to 15 so far. @wafastudies, extra thanks for sharing the knowledge!
Thank you ☺️
May I know why we need these external tables when we have a dedicated SQL pool? I know the cost is low and no data is stored, but other than that, are there any other uses?
Many thanks sir, but what should I do if I want to set the file name myself, rather than getting the auto-generated name the way the code produced it (e.g. AEC2494D-E0F4-4EF0-999D.parquet)?
Nice explanation Bro thank you
Thank you 🙂
Please share the query content, or put whatever you have shown in the video in the description section.
Does it create a parquet file each time, or is it defined one time at the time of the CETAS? Is it possible to define a naming convention for the parquet files?
When we create a CETAS using AS SELECT from a Synapse table, why is it generating many files, and why is the file extension coming out as .txt.deflated? Can't we generate one file with a chosen name?
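On the .txt.deflated question: that extension typically comes from a DELIMITEDTEXT external file format with DATA_COMPRESSION set, and a dedicated SQL pool CETAS writes one file per parallel writer, so multiple files are expected and the GUID-style names cannot be overridden in the CETAS itself. A hedged sketch of switching the output to Parquet instead (object names here are illustrative, and `my_ds` is assumed to already exist):

```sql
-- Parquet output avoids the .txt.deflated extension entirely
CREATE EXTERNAL FILE FORMAT parquet_snappy_ff
WITH (
    FORMAT_TYPE = PARQUET,
    DATA_COMPRESSION = 'org.apache.hadoop.io.compress.SnappyCodec'
);

CREATE EXTERNAL TABLE dbo.SalesExport
WITH (
    LOCATION    = '/export/sales/',
    DATA_SOURCE = my_ds,            -- assumed to exist already
    FILE_FORMAT = parquet_snappy_ff
)
AS SELECT Id, Amount FROM dbo.Sales;
```

If you truly need a single file with a specific name, the usual workaround is a pipeline Copy activity or a Spark write, since CETAS controls neither the file count nor the file names.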
Is it possible to create a table from an external table / external data source / external file format? Also, can we pull data as an external data source from an Azure File Share that is available alongside ADLS Gen2?
Can the same be done in a dedicated SQL pool?
great content
Thank you 😊
@WafaStudies Why don't you share the scripts? Please use Google Drive or other options to share the scripts.
Hi Maheer, Thank you so much for your wonderful explanation.
But I have one doubt: does CETAS work with a database scoped credential created using SAS? I created one and tried to load data into ADLS Gen2, and it didn't work. But when I tried with Managed Identity, it worked.
A SAS key will have an expiry set on it, and SAS keys can be created at any level (folder, file, or container), so it's complicated to manage with SAS keys. Using a managed identity is the best way to go.
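A minimal sketch of the managed identity setup (credential and data source names, the password, and the storage URL below are all placeholders):

```sql
-- One-time per database; required before creating a scoped credential
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPassword123!>';

-- Credential backed by the workspace's managed identity
CREATE DATABASE SCOPED CREDENTIAL msi_cred
WITH IDENTITY = 'Managed Identity';

-- Data source that authenticates with that credential
CREATE EXTERNAL DATA SOURCE secured_ds
WITH (
    LOCATION   = 'https://mydatalake.dfs.core.windows.net/container',
    CREDENTIAL = msi_cred
);
```

For this to work, the workspace's managed identity also needs an appropriate RBAC role (e.g. Storage Blob Data Contributor for writes) granted on the storage account.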
@WafaStudies Thank You
@WafaStudies Hello, thank you very much.
I want to pass data from one data flow to another using temporary tables in Azure Synapse, but the temporary table (i.e. #TemporaryTable) is not visible.
I use a Synapse pipeline with a dedicated SQL pool, not serverless, and I use data flows.
Are there no other alternatives to temporary tables?
I want to use temporary tables to split data flows.
Thank you
Nice sir...
Thank you 😊
Thanks a lot!
Brilliant
Thank you 😊
Will it work for the CSV file format?
I am able to extract, but the files are saved in .txt format and are divided into multiple chunks. Is there any approach to save them in .csv format? Please let me know.
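One thing to check is the external file format the CETAS uses: a DELIMITEDTEXT format without compression produces plain comma-separated text. A sketch, with illustrative names:

```sql
-- Comma-delimited text output, no DATA_COMPRESSION so no .deflated suffix
CREATE EXTERNAL FILE FORMAT csv_ff
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (
        FIELD_TERMINATOR = ',',
        STRING_DELIMITER = '"',
        USE_TYPE_DEFAULT = FALSE
    )
);
```

The engine still picks the chunk count and file names itself, though, so if you need a single file with a proper .csv extension, the common workaround is a pipeline Copy activity (or a Spark write with coalesce) over the exported folder.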
You really have to realize that when you select SQL in the editor and press play it only runs the selected part. This has tripped you up so many times haha.
Hehehe ya
😂