I have to admit up until now I wasn't sure if Data Engineering is right for me, but seeing you design that pipeline was fascinating (I know it's a simple pipeline) but man it was awesome to watch.
Hats off to you my friend
Glad you liked it!
This has to be the best series I've followed. The content is priceless.
Thanks for this, Tybul.
Good video for learning how to ingest data using Lookup and ForEach in Azure!! I liked your iterative way of teaching!! Hopefully I'll complete this series and then plan for my certification!! Thanks for your effort on this series!!
Great stuff ❤
Thanks ✌️
Hi Tybul I am following the series and it's awesome. Thank u so much!!
Great stuff Tybul, I appreciate your valuable insights. Take a look at the copy activity settings as well.
Very helpful video; that amazing pipeline works in a dynamic way, and everything is made generic. Very, very useful. Thanks a lot for this informative video and the full series. Thanks, Tybul.
You are welcome, I'm glad you enjoyed it!
Wonderful video sir, thank you
Great one! Thanks!
Gold material. Thanks
Wow, this was GREAT!! I learnt so much from this. Thank you so much, Sir!!
And for the test you gave at the end (organizing the target structure by table name), I added this and it worked!!
@concat(
dataset().ServerName,
'/',
dataset().DatabaseName,
'/',
dataset().TableName,
'/Year=',
formatDateTime(utcnow(),'yyyy'),
'/Month=',
formatDateTime(utcnow(),'MM'),
'/Day=',
formatDateTime(utcnow(),'dd')
)
Great job! And I'm super happy that you liked the episode.
33:25 Sneaky Paws spotted
Thanks for the video and the course itself! Would you be so kind to share queries you've used, so it would be easier to follow along?
I've just uploaded the json definition of the whole ADF pipeline that contains those queries. You can find it in my GitHub (link is in the video description).
Hello Tybul, late to the party. I did not understand how you wrote the pipeline expression. Is there something I should refer to in order to learn about that?
Here you can find Microsoft docs about ADF expression language: learn.microsoft.com/en-us/azure/data-factory/control-flow-expression-language-functions
@@TybulOnAzure Thank you for sharing.
Hi Tybul, amazing video series! I've been following your videos religiously and plan on taking the exam sometime next year. I'm currently practicing sql on leetcode. What do you suggest would be the fastest way to get fluent with SQL?
Thanks! As for SQL I highly recommend books by Itzik Ben-Gan, e.g. this one: www.amazon.com/T-SQL-Fundamentals-3rd-Itzik-Ben-Gan/dp/150930200X
Hey,
Why did you put tableName and schemaName in quotes? And is it good practice to use them to name the files?
In this episode I uploaded data from all tables to the same directory in ADLSg2. Without good file names I wouldn't be able to tell what data is stored in each of those files.
Another super video 💪🙏🙏
Thank you very much.
Say hi to your cat 😸 (33')
Are you planning any videos on Databricks? Stream Analytics? Azure Data + DevOps?
My cat says hello:) Yes, I'll cover Databricks, Stream Analytics and CI/CD for ADF.
@@TybulOnAzure Can't wait for that 😊
For the CI/CD, will you cover ARM template deployment or use an IaC tool like Terraform?
Sorry if I'm spoiling the series 🙈
I'll talk about and show two approaches: ARM-based one and selective one. I don't plan to include Terraform.
You're the best, bravo!!... Next course Azure architecture 😁.
SELECT QUOTENAME(t.name) AS tableName, QUOTENAME(SCHEMA_NAME(t.schema_id)) AS schemaName FROM sys.tables as t
or just
select quotename(name) as tablename, quotename(schema_name(schema_id)) as schemaName from sys.tables
@concat(
dataset().serverName,
'/',
dataset().databaseName,
'/Year=',
formatDateTime(utcnow(),'yyyy'),
'/Month=',
formatDateTime(utcnow(),'MM'),
'/Day=',
formatDateTime(utcnow(),'dd')
)
Hello brother, I'm new to ADF. Is there any documentation on how to write pipeline expressions, or is it SQL syntax?
@concat('SELECT * FROM ', item().schemaName, '.', item().tableName)
@concat(dataset().schemaName,'_', dataset().tableName,'.csv')
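To see what these two expressions produce at runtime, here is a rough Python sketch (not ADF itself); the table list is made up for illustration and stands in for the Lookup activity's output over sys.tables:

```python
# Illustrative stand-in for the Lookup output (schema/table pairs from sys.tables).
tables = [
    {"schemaName": "[SalesLT]", "tableName": "[Customer]"},
    {"schemaName": "[dbo]", "tableName": "[BuildVersion]"},
]

for item in tables:
    # @concat('SELECT * FROM ', item().schemaName, '.', item().tableName)
    query = f"SELECT * FROM {item['schemaName']}.{item['tableName']}"
    # @concat(dataset().schemaName, '_', dataset().tableName, '.csv')
    file_name = f"{item['schemaName']}_{item['tableName']}.csv"
    print(query, "->", file_name)
```

Each iteration of the ForEach gets one such pair, so the Copy activity runs a per-table query and writes to a per-table file name.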
I selected Parquet for the sink dataset; however, it gave an error while running the pipeline for dbo.BuildVersion. It says "ErrorCode=ParquetInvalidColumnName,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The column name is invalid. Column name cannot contain these character:[,;{}() \t=],Source=Microsoft.DataTransfer.Common,". Does anyone know how to solve it? Thank you!
Yup, some characters are not allowed in column names when using Parquet file format. You would have to map them to proper names.
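For context, in the AdventureWorksLT sample database the dbo.BuildVersion table has a column named "Database Version" (with a space), which trips exactly this rule. In ADF you would fix it with explicit column mapping in the Copy activity; as a rough sketch of the renaming idea in plain Python (the column list here is illustrative):

```python
import re

# Characters Parquet rejects in column names, per the ADF error message:
# , ; { } ( ) \t = and space
INVALID = re.compile(r"[,;{}()\t= ]")

def sanitize(columns):
    """Map raw SQL column names to Parquet-safe names by replacing invalid chars."""
    return [INVALID.sub("_", c) for c in columns]

# Illustrative dbo.BuildVersion-style column names; note the space in one of them.
raw = ["SystemInformationID", "Database Version", "VersionDate", "ModifiedDate"]
print(sanitize(raw))  # ['SystemInformationID', 'Database_Version', 'VersionDate', 'ModifiedDate']
```

The same mapping can be expressed declaratively in the Copy activity's schema mapping, which is the usual fix inside ADF.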