Here is a 56-year-old lady learning Snowflake: I have earned 2 Snowflake badges and am preparing for the SnowPro certification. Thanks, Data Engineering team, for putting out such amazing videos.
That is awesome!
You can also watch this SnowPro practice test playlist:
ua-cam.com/play/PLba2xJ7yxHB5X2CMe7qZZu-V4LxNE1HbF.html
@@DataEngineering Undoubtedly the best online training I have taken, paid or unpaid, for any cloud topic. The only thing is that some of the SQL links don't work.
I was looking for a quick revision of Snowflake, and these are some of the best tutorials I have ever seen. Thanks a ton.
thanks dear..
You're the hero we need, Sir!!!! Thank you for posting this. Your videos are accurate and on point. The truly hands-on training you provide sets you apart from all the other resources out there. Thank you for your time and knowledge in posting these videos.
I appreciate that! And thank you so much for sharing your thoughts.... I feel good when my knowledge helps others learn and become better...
WOW.. extraordinary tutorial.. can't find this anywhere. Thanks for sharing your knowledge.
Thank you 🙏 for watching my video @Chaitanya Krishna and your word of appreciation really means a lot to me.
⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
I have already published other knowledge series and Snowflake certification videos; if you are interested, you can refer to them.
🌐 Snowflake Complete Guide Playlist ➥ bit.ly/3iNTVGI
🌐 SnowPro Guide ➥ bit.ly/35S7Rcb
🌐 Snowflake SQL Series Playlist ➥ bit.ly/3AH6kCq
🌐 SnowPro Question Dump (300 questions) ➥ bit.ly/2ZLQm9E
⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
One of the BEST tutorial series to learn Snowflake. It is too good, better than any paid course. Could you please share the SQL/lab practice material in the description box for each lesson?
Got a clear picture of streams, the different types of streams, and how they differ. Thank you!
Glad it helped and thanks for following my playlist.
Excellent, Sir...!!!! I am learning Snowflake from your video series.... I am a beginner to Snowflake, but your teaching made me comfortable with it...
I wish to echo another comment that you are the "Spiderman of Snowflake".... Kindly keep posting, Sir... Many thanks for your efforts and TIME on this knowledge sharing.....
One request, Sir: it would be helpful if we could get the code for the hands-on.....
You are most welcome.
Here are the code links for the video; let me check if Ch-17 is up or not.
Ch-19 toppertips.com/snowflake-etl-example-ch19-part01
Ch-20 toppertips.com/role-grants-role-hierarchy-example-ch20
Stream & Task - toppertips.com/stream-and-task-snowflake-jump-start
Ch-17 has a URL issue.. will fix it and share it.
Explained very well with excellent examples. Thank you.
Glad it was helpful!
Hi, I am new to Snowflake and have gone through the video. I have one doubt: files arrive in an Azure container every day, and their data is copied to Snowflake continuously (this will be the source table, right?). My doubt is: is the data coming from the files every day (with DML operations applied to the same data) simply loaded into the source table as new records, or are those DML operations applied to the source table?
Thanks for uploading 👍. I was waiting.. as usual great work 👌
So nice of you @Parashuram N.
Happy to hear that these videos are helping all of those who want to learn this cool technology.
Yours is far better than Udemy courses.
thank you for your note..
thank you for the wonderful playlist
You're very welcome..
---------
And yes, I know many of us are not fully aware of the Snowpark Python API. If you want to manage Snowflake more programmatically, you can watch my paid content (data + code available). Many folks don't know the power of Snowpark. These 2 videos will help you broaden your knowledge.
This content is available at a discounted price for a limited time (one course for JSON and one for CSV); it can automatically create DDL and DML and also run the COPY command.
1. www.udemy.com/course/snowpark-python-ingest-json-data-automatically-in-snowflake/?couponCode=NEWYEAR50
2. www.udemy.com/course/automatic-data-ingestion-using-snowflake-snowpark-python-api/?couponCode=NEWYEAR35
Very clear explanation. I have one question here: I created a default stream and ran INSERT, UPDATE, and DELETE DML operations on the source table. If we consume only the INSERT data from the stream and commit, the rest of the changes are also reset. Why?
I observed the same thing, and that's why you either have to have 2 streams, or you can have multiple SQL consumer statements inside one transaction.
I will do some additional reading and let you know how it works and whether this behaviour is by design or a bug.
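Here is a minimal sketch of the transaction approach (table and stream names are made up for illustration). Inside one explicit transaction, every statement sees the same stream contents, and the offset advances only once, at commit:

BEGIN;
-- both statements read the same changeset from orders_stm
INSERT INTO inserts_target
  SELECT * FROM orders_stm
  WHERE metadata$action = 'INSERT' AND metadata$isupdate = FALSE;
INSERT INTO updates_target
  SELECT * FROM orders_stm
  WHERE metadata$action = 'INSERT' AND metadata$isupdate = TRUE;
COMMIT;  -- the stream offset advances only here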
@@DataEngineering I think this is one limitation of the stream.
Once we perform any operation using the stream, it becomes empty.
Gold mine of knowledge...
Thank you for all your efforts Sir 🙏
This channel is probably amongst the best out there...😇
I hope it gets the appreciation, in terms of numbers on YouTube, that it deserves. 🤞👏
So nice of you
For a delta stream, it seems it only captures the current version of the table, and all rows show as INSERT. Can you please help me understand?
Could you elaborate more?
Nice explanation, keep going. Thank you..!
Currently I am going through the videos, and I am preparing for the Snowflake Advanced Data Engineering certification. Could you please guide me on which playlist leans more towards the data engineering course content? Where can I download the worksheets you are using in the videos?
Thank you for sharing amazing content. I have a doubt: when S3 delta data flows through Snowpipe into the landing tables, the file headers also get loaded as values, even though we already set skip_header=1 while defining the file format.
Why is it treating the headers as values while loading data into the landing tables?
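A sketch of the setup being described (stage, pipe, and table names are hypothetical). One thing worth double-checking is that the pipe's COPY statement actually references the file format that declares SKIP_HEADER; a format that is defined but not attached to the pipe will not be applied:

CREATE OR REPLACE FILE FORMAT csv_ff TYPE = 'CSV' SKIP_HEADER = 1;
CREATE OR REPLACE PIPE landing_pipe AUTO_INGEST = TRUE AS
  COPY INTO landing_tbl
  FROM @s3_stage
  FILE_FORMAT = (FORMAT_NAME = 'csv_ff');  -- SKIP_HEADER applies to each file loaded through the pipe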
Does it work with external tables?
Great work, master. Eagerly waiting for the next video.
Working on it
That was a great video.. What if I have to implement SCD Type 2 without streams and tasks on a table with 500 columns? How can we implement it?
If you have to implement it without streams/tasks, then you have to use the time travel feature, but that will make the overall implementation very complicated.
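For anyone curious, a minimal time-travel sketch of that idea (the table name is hypothetical): compare the current rows against an older snapshot to derive the delta, which you would then use to close and open SCD2 versions:

-- rows that exist now but did not exist 24 hours ago
SELECT * FROM customer_dim
MINUS
SELECT * FROM customer_dim AT(OFFSET => -60*60*24);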
Business problem I am struggling with: I have a table whose data is replaced by new data every week. It is in JSON format, so I structure it after loading it into Snowflake. The problem is that I am unable to capture what changed when the new data is loaded and replaces the old data. For example, I want to check what the total savings were until June versus what the savings are now, but I can't, because I don't have a date to measure against. It's like a snapshot capability I want to add to the table, so that every time new data is loaded it captures the change. Please suggest some solutions for this.
Why don't you use the time travel feature? Watch this video; it may help solve your issue (Time Travel Master Class - ua-cam.com/video/AdESTexG7QA/v-deo.html).
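A small sketch of what that looks like (table and column names are made up). Time travel can read the table as it was at an earlier point, but only within the retention window, so a proper history/snapshot table is still the durable fix:

-- total savings as of an earlier date, within the retention period
SELECT SUM(total_savings)
FROM savings AT(TIMESTAMP => '2024-06-30 00:00:00'::TIMESTAMP_LTZ);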
Hi Sir,
Thanks for the great tutorial on streams. It was really easy to follow your video. However, I am curious whether we can create a stream that captures changes to only some columns of a source table, not all of them.
For example:
TEST_TABLE (COL1, COL2, COL3) contains 3 columns. I only care about changes to COL1 and COL3 and want to ignore COL2.
So is it possible to create a stream this way:
CREATE OR REPLACE STREAM TEST_STREAM
ON TABLE TEST_TABLE (COL1, COL3)
Or is there any alternative approach to achieve the above behaviour? Any help is appreciated. Thanks again for the great series.
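One possible workaround, sketched below with the names from the question: project only the columns you care about into a view and create the stream on the view, so an update that touches only COL2 produces no visible change (streams on views are also mentioned later in this thread; please verify the behaviour on your account before relying on it):

CREATE OR REPLACE VIEW test_v AS
  SELECT col1, col3 FROM test_table;  -- only the tracked columns
CREATE OR REPLACE STREAM test_stream ON VIEW test_v;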
This is really awesome.
Is the sample code available in Git?
Why not use the default VARCHAR length? If we define table columns without specifying any length, keeping it at MAX, Snowflake confirms there is no impact on performance. Do you see any impact on downstream systems or any ETL tools that use Snowflake?
You are right.. I used a sample DDL for demo purposes.. it is not a best-practices video.. so you are right..
If we create a stream with show_initial_rows=true, GET_DDL does not return this option. Can we get the full structure with some other option? Could you please reply on this?
Hi, please provide the code which you are writing in the Snowflake worksheet.
great work and great help.
Glad it helped
Very well explained!!!
Glad you liked it
How can we know the stream's status, i.e., whether data is still being read from the base table or reading from the base table has completed?
Thanks for the video. Just one thing: I was getting an error while trying to access the SQL link in the description. Is it restricted to a certain group? Cheers!
Thanks for your note.. looks like there is a problem.. let me check and fix it..
This is awesome..
Glad you liked it..
As usual, great video
Glad you enjoyed it
So there are 3 types of streams in Snowflake: delta, append-only, and insert-only for external tables.
Yes, you can think of it that way..
1. A standard stream captures all changes (insert/update/delete)
2. An append-only stream captures only inserts, not updates/deletes (like IoT use cases)
3. A stream on an external table is always insert-only (Snowflake does not support update or delete there)
Hope it makes the concept clear; see the sketch below.
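For reference, a hedged sketch of the three flavors (table names are made up):

CREATE OR REPLACE STREAM all_changes_stm ON TABLE orders;  -- standard: insert/update/delete
CREATE OR REPLACE STREAM append_only_stm ON TABLE sensor_data
  APPEND_ONLY = TRUE;                                      -- inserts only (IoT-style)
CREATE OR REPLACE STREAM ext_stm ON EXTERNAL TABLE ext_orders
  INSERT_ONLY = TRUE;                                      -- external tables: insert-only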
How do we get this code for execution?
Excellent video!! Can you please add a video on streams on views (including secure views) and the various scenarios that play out?
Thank you 🙏 @Girish for watching my video; your words of appreciation really mean a lot to me.
Stream objects can be created on tables, including external tables; they are not applicable to views. I will see if I can make videos on different stream scenarios, and thanks for sharing your ideas.
@@DataEngineering Streams can be created on views and secure views
very useful
Glad you think so!
If you could share some knowledge on stored procedures and automating them using tasks, with one real-world case, it would definitely be helpful to all.
It is on my list and will be released soon.
Thanks for the good lesson, but the question of whether streams cost a lot has not been answered.
Follow the offset concept from the 6th minute onwards; if you understand the offset concept, you will see that it is covered.
A stream just captures the offset and does not hold any data by itself, and that's why there is no cost.
I will make a separate video on stream cost to add additional clarity. Thanks for your comment.
@@DataEngineering Thanks a lot. Yes, it only stores the offset, but at the same time it will incur compute cost for processing the data from the stream.
Could you please check the resource file link? It is not working.
Let me check..
I am planning to reorganize them on a different platform and will update you soon; for now they are missing, and I am really sorry for that.
What is the use case for having multiple streams on a table?
It is possible that there is a single source but different business teams or groups of people want to track changes according to their own business rules.. for those cases, you can have multiple streams on a single table..
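A quick sketch (names are hypothetical): each stream keeps its own offset, so each team consumes the same table's changes independently:

CREATE OR REPLACE STREAM finance_stm ON TABLE orders;  -- consumed by finance jobs
CREATE OR REPLACE STREAM audit_stm ON TABLE orders;    -- consumed by audit, at its own pace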
too good
thank you...
If you want to manage Snowflake more programmatically, you can watch my paid content. Many folks don't know the power of Snowpark. These 2 videos will help you broaden your knowledge.
This content is available at a discounted price for a limited time (one course for JSON and one for CSV); it can automatically create DDL and DML and also run the COPY command.
1. www.udemy.com/course/snowpark-python-ingest-json-data-automatically-in-snowflake/?couponCode=SPECIAL50
2. www.udemy.com/course/
Can you please post text tutorials for all the videos? Only some topics are available as text; for the rest, there are only videos.
Working on it.. will do it soon.
What does "first-class object" mean?
These objects can be manipulated using SQL commands; they are securable objects, and different privileges can be granted on them.
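For example (stream and role names are made up), a stream can be created, listed, and granted like any other securable object:

CREATE OR REPLACE STREAM orders_stm ON TABLE orders;
SHOW STREAMS;                                           -- listable via SQL
GRANT SELECT ON STREAM orders_stm TO ROLE analyst_rl;   -- privileges can be granted on it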
So the remaining 5 chapters are in the making, is it?
yes, will publish them soon.
What is tale and type in the stream result table?
Could you elaborate on your question?
Do you mean stale? Streams are built on top of time travel, and if the stream (CDC) is not consumed in time, the change data is no longer available.
I did not understand the 2nd part of the question.. about type.. do you mean the stream type? They are all explained in the video very well.
It's showing stale_after = 14 days (default).
Can we change or set the stream's stale_after date to a particular value, for example stale_after = 1 day or stale_after = 2 years?
I am not sure whether Snowflake has added a parameter to set this value.. 14 days is the default given by Snowflake..
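One thing worth testing (treat this as an assumption to verify against the docs): the table-level MAX_DATA_EXTENSION_TIME_IN_DAYS parameter is generally what sits behind the 14-day stale window, e.g.:

-- extend how long Snowflake retains change data for unconsumed streams
ALTER TABLE orders SET MAX_DATA_EXTENSION_TIME_IN_DAYS = 30;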
@@DataEngineering thanks for your information
These tutorials are too good, and they are helping me a lot in my project.
Thanks once again
@@saiyadav5014 Thank you so much
@@DataEngineering I had a question here... What is the use of the data retention period if time travel does not work on streams??
Where is the stream storage cost, Sir?
A stream is not a separate storage object; the existing table just gets 3 additional metadata columns, so the cost of the stream is the cost of the table itself.
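To see those columns (the stream name is hypothetical), just select from the stream:

SELECT metadata$action,    -- INSERT or DELETE
       metadata$isupdate,  -- TRUE when the row pair represents an update
       metadata$row_id     -- stable row identifier
FROM orders_stm;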
The link is unsecured. Can you share another link, please?
will check and share..
I don't have permission to access the SQL scripts. Why?
let me fix it.
@@DataEngineering It's still not working: 403 error. Please fix it.