Snowflake Stream & Change Data Capture | Chapter-17 | Snowflake Hands-on Tutorial

  • Published 12 Nov 2024

COMMENTS • 98

  • @sujaa1000
    @sujaa1000 1 year ago +16

    Here is a 56-year-old lady learning Snowflake; I've earned 2 Snowflake badges and am preparing for the SnowPro Certification. Thanks, Data Engineering team, for putting out such amazing videos.

    • @DataEngineering
      @DataEngineering  1 year ago +1

      That is awesome!
      You can also watch the SnowPro practice test playlist:
      ua-cam.com/play/PLba2xJ7yxHB5X2CMe7qZZu-V4LxNE1HbF.html

    • @samihussain-bc6uf
      @samihussain-bc6uf 1 year ago

      @@DataEngineering Undoubtedly the best online training I have taken, paid or unpaid, for any cloud topic. The only thing is that some of the links for the SQL don't work.

  • @laxmansingh-x8e
    @laxmansingh-x8e 1 year ago

    I was looking for a quick revision on Snowflake, and this is one of the best tutorial series I have ever seen. Thanks a ton.

  • @karanmachendranath7162
    @karanmachendranath7162 2 years ago +2

    You're the hero we need, Sir!!!! Thank you for posting this. Your videos are accurate and on point. The hands-on training you provide sets you apart from all the other resources out there. Thank you for your time and knowledge in posting these videos.

    • @DataEngineering
      @DataEngineering  2 years ago

      I appreciate that! And thank you so much for sharing your thoughts... I feel good when my knowledge helps others to learn and become better...

  • @chaitanyakrishna5873
    @chaitanyakrishna5873 2 years ago +1

    WOW.. Extraordinary tutorial.. Can't find this anywhere. Thanks for sharing your knowledge.

    • @DataEngineering
      @DataEngineering  2 years ago

      Thank you 🙏 for watching my video @Chaitanya Krishna and your word of appreciation really means a lot to me.
      ⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
      I have already published other knowledge-series and Snowflake certification videos; if you are interested, you can refer to them.
      🌐 Snowflake Complete Guide Playlist ➥ bit.ly/3iNTVGI
      🌐 SnowPro Guide ➥ bit.ly/35S7Rcb
      🌐 Snowflake SQL Series Playlist ➥ bit.ly/3AH6kCq
      🌐 SnowPro Question Dump (300 questions) ➥ bit.ly/2ZLQm9E
      ⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡

  • @iamjags
    @iamjags 1 year ago

    One of the BEST TUTORIAL series to learn Snowflake. This one is too good and better than any paid course. Could you please share the SQL/lab practice material in the description box for each lesson?

  • @nishavkrishnan4271
    @nishavkrishnan4271 2 years ago

    Got a clear picture of streams, the different types of streams, and how they differ. Thank you!

    • @DataEngineering
      @DataEngineering  2 years ago

      Glad it helped and thanks for following my playlist.

  • @manjunathah5484
    @manjunathah5484 2 years ago +1

    Excellent, Sir...!!!! I am learning Snowflake from your video series... I am a beginner in Snowflake, but your teaching made me comfortable with it...
    I wish to recall one of the other comments, that you are the "Spiderman of Snowflake"... Kindly keep posting, Sir... Many thanks for your efforts and TIME on this knowledge sharing.....
    One request, Sir: it would be helpful if we could get the code for the hands-on.....

    • @DataEngineering
      @DataEngineering  2 years ago

      You are most welcome.
      Here are the code links for the videos; let me check if Ch-17 is up or not:
      Ch-19 toppertips.com/snowflake-etl-example-ch19-part01
      Ch-20 toppertips.com/role-grants-role-hierarchy-example-ch20
      Stream & Task - toppertips.com/stream-and-task-snowflake-jump-start
      Ch-17 is having some URL issue.. will fix it and share it.

  • @BruchoSindicate
    @BruchoSindicate 2 years ago +1

    Explained very well with excellent examples. Thank you.

  • @cherukurid0835
    @cherukurid0835 1 year ago

    Hi, I am new to Snowflake and went through the video. I have one doubt: files are in an Azure container and new files arrive in this container every day, and the data is copied continuously to Snowflake (this will be the source table, right?). My doubt is: is the data coming from the files every day (with DML operations on the same data) simply loaded into the source table as new records, or are these DML operations applied to the source table?

  • @parashuramn4348
    @parashuramn4348 2 years ago +1

    Thanks for uploading 👍. I was waiting.. as usual great work 👌

    • @DataEngineering
      @DataEngineering  2 years ago +1

      So nice of you, @Parashuram N.
      Happy to hear that these videos are helping all of those who want to learn this cool technology.

  • @praveennarayanan478
    @praveennarayanan478 1 year ago +2

    Yours is far better than Udemy courses.

  • @praveenchinnareddy1552
    @praveenchinnareddy1552 10 months ago

    thank you for the wonderful playlist

    • @DataEngineering
      @DataEngineering  10 months ago

      You're very welcome..
      ---------
      And yes, I know many of us are not fully aware of the Snowpark Python API. If you want to manage Snowflake more programmatically, you can watch my paid content (data + code available).. many folks don't know the power of Snowpark... these 2 videos will help you to broaden your knowledge..
      This content is available at a discounted price for a limited time.. (one for JSON and one for CSV).. it can automatically create DDL and DML and also run the copy command...
      1. www.udemy.com/course/snowpark-python-ingest-json-data-automatically-in-snowflake/?couponCode=NEWYEAR50
      2. www.udemy.com/course/automatic-data-ingestion-using-snowflake-snowpark-python-api/?couponCode=NEWYEAR35

  • @RaviTheVlogger
    @RaviTheVlogger 2 years ago +1

    Very clear explanation. I have one question here: I created a default stream and ran Insert, Update and Delete DML operations on the source table. If we consume only the Insert data from the stream and commit, the rest of the changes are also reset. Why?

    • @DataEngineering
      @DataEngineering  2 years ago +2

      Even I observed the same thing, and that's why you either have to have 2 streams or you can have multiple SQL consumer statements under one transaction.
      I will do some additional reading and let you know how it works, and whether this behaviour is as per design or a bug.

    • @RaviTheVlogger
      @RaviTheVlogger 2 years ago

      @@DataEngineering I think this is one limitation of streams.

    • @rajinikanthgudimalla5454
      @rajinikanthgudimalla5454 2 years ago

      Once we run any consuming DML operation against the stream, it becomes empty.
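The workaround discussed above (multiple consumer statements under one transaction) can be sketched as follows. This is a minimal sketch, not the video's exact code; table, stream and column names are illustrative. Within a transaction, every statement sees the same stream contents, and the offset only advances at COMMIT:

```sql
-- A stream's offset advances once ANY consuming DML commits,
-- so handle every change type inside a single transaction.
BEGIN;

-- new rows
INSERT INTO target_tbl (id, name)
SELECT id, name
FROM src_stream
WHERE METADATA$ACTION = 'INSERT' AND METADATA$ISUPDATE = FALSE;

-- updates arrive as DELETE + INSERT pairs flagged METADATA$ISUPDATE = TRUE
MERGE INTO target_tbl t
USING (SELECT * FROM src_stream
       WHERE METADATA$ACTION = 'INSERT' AND METADATA$ISUPDATE = TRUE) s
ON t.id = s.id
WHEN MATCHED THEN UPDATE SET t.name = s.name;

-- deletes
DELETE FROM target_tbl
WHERE id IN (SELECT id FROM src_stream
             WHERE METADATA$ACTION = 'DELETE' AND METADATA$ISUPDATE = FALSE);

COMMIT;  -- only now does the stream offset move past these rows
```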

  • @lavesh90
    @lavesh90 11 months ago

    Gold mine of knowledge...
    Thank you for all your efforts, Sir 🙏
    This channel is probably amongst the best out there...😇
    I hope it gets the appreciation, in terms of numbers on YT, that it deserves. 🤞👏

  • @dipakprasad
    @dipakprasad 1 year ago

    For a delta stream, it seems it is only capturing the current version of the table, and all rows show as INSERT. Can you please help me understand?

  • @Rajag-ic1xt
    @Rajag-ic1xt 7 months ago

    Nice explanation, keep going. Thank you..!

  • @MrJaga121
    @MrJaga121 4 months ago

    Currently I am going through the videos, and I am preparing for the Snowflake Advanced Data Engineering certification. Could you please guide me on which playlist is more oriented towards the data engineering course content? Where can I download the worksheets you work through in the videos?

  • @nehasindwani538
    @nehasindwani538 1 year ago

    Thank you for sharing amazing content. I have a doubt: when S3 delta data flows through Snowpipe into the landing tables, the file header is also populated as values, even though we already set skip_header=1 while defining the file format.
    Why is it considering the header as one of the values while loading data into the landing tables?
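One common cause of the symptom described above is the pipe's COPY statement not actually referencing the file format that sets SKIP_HEADER. A minimal sketch, with illustrative stage, pipe and table names:

```sql
-- File format that skips the header row
CREATE OR REPLACE FILE FORMAT my_csv_fmt
    TYPE = 'CSV'
    SKIP_HEADER = 1;

-- The pipe's COPY must reference that format explicitly; if FILE_FORMAT
-- is omitted (or names a different format), headers load as data rows.
CREATE OR REPLACE PIPE landing_pipe AUTO_INGEST = TRUE AS
    COPY INTO landing_tbl
    FROM @landing_stage
    FILE_FORMAT = (FORMAT_NAME = 'my_csv_fmt');
```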

  • @GaneshHurgule
    @GaneshHurgule 2 months ago

    Does it work with external tables?

  • @sudeepduvvuru9483
    @sudeepduvvuru9483 2 years ago +1

    Great work, master. Eagerly waiting for the next video.

  • @shashankm2859
    @shashankm2859 1 year ago

    That was a great video.. What if I have to implement SCD Type 2 without stream and task on a table with 500 columns; how can we implement it?

    • @DataEngineering
      @DataEngineering  1 year ago

      If you have to implement it without stream/task, then you have to use the time travel feature, but that will make the overall implementation very complicated.

  • @shubhamjoshi6608
    @shubhamjoshi6608 2 years ago

    Business problem I am struggling with: I have a table with some data which is replaced by new data every week. It's in JSON format, so I structure it after loading it into Snowflake. The problem is that I am unable to capture what changed when the new data is loaded and replaces the old data. For example, I want to check what the total savings were until June versus what the savings are now; I don't have that functionality because I don't have a date to measure by. It's like a snapshot functionality I want to add to the table, so that every time new data is loaded it captures the change. Please suggest some solutions for this.

    • @DataEngineering
      @DataEngineering  2 years ago

      Why don't you use the time travel feature? Watch this video.. it may help solve your issue (Time Travel Master Class - ua-cam.com/video/AdESTexG7QA/v-deo.html)
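The time-travel approach suggested above can be sketched like this. The table name and the one-week offset are illustrative, and it only works within the table's data retention window:

```sql
-- Rows present now but absent one week ago (new or changed rows)
SELECT * FROM savings_tbl
MINUS
SELECT * FROM savings_tbl AT (OFFSET => -7*24*60*60);  -- 7 days ago, in seconds

-- Rows present a week ago but gone now (deleted or changed rows)
SELECT * FROM savings_tbl AT (OFFSET => -7*24*60*60)
MINUS
SELECT * FROM savings_tbl;
```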

  • @soniapandita5128
    @soniapandita5128 1 year ago

    Hi Sir,
    Thanks for the great tutorial on streams. It was really easy to follow your video. However, I am curious whether we can create a stream that captures changes to only some columns of a source table rather than all of them.
    For example:
    TEST_TABLE (COL1, COL2, COL3) contains 3 columns. I only care about the changes to COL1 and COL3 and want to ignore COL2.
    So is it possible to create a stream this way:
    CREATE OR REPLACE STREAM TEST_STREAM
    ON TABLE TEST_TABLE (COL1, COL3)
    Or is there an alternative approach to achieve the above behaviour? Any help is appreciated. Thanks again for the great series.
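A `CREATE STREAM` statement does not accept a column list, but since Snowflake also supports streams on views, a common workaround is a view that projects only the columns of interest. A sketch with illustrative names; whether updates touching only COL2 are fully filtered out of the view's delta is worth verifying with a quick test:

```sql
-- View exposing only the columns we care about
CREATE OR REPLACE VIEW test_view AS
    SELECT col1, col3
    FROM test_table;

-- Stream on the view instead of the base table
CREATE OR REPLACE STREAM test_stream ON VIEW test_view;
```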

  • @judepieries1828
    @judepieries1828 1 year ago

    This is really awesome.

  • @nagach7525
    @nagach7525 2 years ago

    Why not use the default VARCHAR length? If we define columns in tables without any length, keeping it at the MAX, Snowflake confirms there is no impact on performance. Do you see any impact on downstream systems or ETL tools that use Snowflake?

    • @DataEngineering
      @DataEngineering  2 years ago

      You are right.. I used sample DDL for demo purposes.. it is not a best-practice video.. so you are right..

  • @maaaaaass
    @maaaaaass 1 year ago

    If we create a stream with SHOW_INITIAL_ROWS = TRUE, then GET_DDL does not return this option. Can we get the full structure with any other option? Could you please reply on this?

  • @avisheksingh1262
    @avisheksingh1262 1 year ago

    Hi, please provide the code which you are writing in the Snowflake terminal.

  • @Asma_colors
    @Asma_colors 2 years ago

    Great work and great help.

  • @himanshumittal1095
    @himanshumittal1095 2 years ago

    Very well explained!!!

  • @maaaaaass
    @maaaaaass 1 year ago

    How can we know the stream status, e.g. whether data is still being read from the base table or reading from the base table has completed?

  • @akashgoel601
    @akashgoel601 1 year ago

    Thanks for the video. Just one thing: I was getting an error while trying to access the SQL link from the description. Is that restricted to a certain group? Cheers!

    • @DataEngineering
      @DataEngineering  1 year ago

      Thanks for your note.. looks like there is a problem.. let me check and fix it..

  • @rk-ej9ep
    @rk-ej9ep 1 year ago

    This is awesome..

  • @fishsauce7497
    @fishsauce7497 2 years ago

    As usual, great video

  • @aks541
    @aks541 2 years ago +1

    So there are 3 types of streams in Snowflake: delta, append-only, and insert-only for external tables.

    • @DataEngineering
      @DataEngineering  2 years ago

      Yes, you can think of it that way..
      1. A stream which captures all changes (insert/update/delete)
      2. A stream which captures only inserts and not updates/deletes (like IoT use cases)
      3. A stream on an external table is always insert-only (Snowflake does not support update or delete there)
      Hope it makes the concept clear.
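The three flavours listed above can be sketched as follows; the table names are illustrative:

```sql
-- 1. Standard (delta) stream: captures inserts, updates and deletes
CREATE OR REPLACE STREAM orders_std_stream ON TABLE orders;

-- 2. Append-only stream: captures inserts only
CREATE OR REPLACE STREAM orders_append_stream ON TABLE orders
    APPEND_ONLY = TRUE;

-- 3. Insert-only stream: the only mode supported on external tables
CREATE OR REPLACE STREAM ext_orders_stream ON EXTERNAL TABLE ext_orders
    INSERT_ONLY = TRUE;

-- Reading a stream returns the change rows plus the metadata columns
-- METADATA$ACTION, METADATA$ISUPDATE and METADATA$ROW_ID
SELECT * FROM orders_std_stream;
```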

  • @radhikaramaro
    @radhikaramaro 1 year ago

    How can I get this code for execution?

  • @rajgirish
    @rajgirish 2 years ago

    Excellent video!! Can you please add a video on streams on views (including secure views) and the various scenarios that play out?

    • @DataEngineering
      @DataEngineering  2 years ago

      Thank you 🙏 @Girish for watching my video; your word of appreciation really means a lot to me.
      Stream objects can be created on tables, including external tables, and they are not applicable to views. I will see if I can make videos on different stream scenarios, and thanks for sharing your ideas.

    • @rajgirish
      @rajgirish 2 years ago

      @@DataEngineering Streams can be created on views and secure views

  • @saeedrahman8362
    @saeedrahman8362 2 years ago

    Very useful.

  • @mukundam9428
    @mukundam9428 2 years ago

    If you could share some knowledge on procedures and automating them using tasks, considering one real-world case, it would definitely be helpful to all.

  • @anuragkataria548
    @anuragkataria548 2 years ago +1

    Thanks for the good lesson, but whether streams cost a lot has not been answered.

    • @DataEngineering
      @DataEngineering  2 years ago +1

      Follow the offset concept from the 6th minute onwards; if you understand the offset concept, you will see that it is covered.
      A stream just captures the offset; it does not hold any data by itself, and that's why there is no storage cost.
      I will make a separate video on stream cost to add additional clarity. Thanks for your comment.

    • @anuragkataria548
      @anuragkataria548 2 years ago

      @@DataEngineering Thanks a lot. Yes, it only stores the offset, but at the same time it will incur compute cost for processing the data from the stream.

  • @gaganrajkaushal
    @gaganrajkaushal 1 year ago

    Could you please check the resource file link? It is not working.

    • @DataEngineering
      @DataEngineering  1 year ago

      Let me check..
      I am planning to re-organize them on a different platform and will update you soon; for now they are missing, and I am really sorry for that.

  • @kamaladevim122
    @kamaladevim122 2 years ago

    What is the use case for having multiple streams on a table?

    • @DataEngineering
      @DataEngineering  2 years ago

      It is possible that there is a single source but different business teams or groups of people who want to track changes as per their own business rules.. for those cases, you can have multiple streams on a single table..
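Multiple streams on one table each maintain an independent offset, so each consumer can process changes at its own pace. A sketch with illustrative names:

```sql
-- Two independent streams on the same table; consuming one
-- does not advance the other's offset
CREATE OR REPLACE STREAM finance_stream ON TABLE orders;
CREATE OR REPLACE STREAM marketing_stream ON TABLE orders;
```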

  • @BipinJoshi-q9b
    @BipinJoshi-q9b 1 year ago

    too good

    • @DataEngineering
      @DataEngineering  1 year ago

      thank you...
      If you want to manage Snowflake more programmatically.. you can watch my paid content.. many folks don't know the power of Snowpark... these 2 videos will help you to broaden your knowledge..
      This content is available at a discounted price for a limited time.. (one for JSON and one for CSV).. it can automatically create DDL and DML and also run the copy command...
      1. www.udemy.com/course/snowpark-python-ingest-json-data-automatically-in-snowflake/?couponCode=SPECIAL50
      2. www.udemy.com/course/

  • @rajinikanthgudimalla5454
    @rajinikanthgudimalla5454 2 years ago

    Can you please post text tutorials for all the videos? Only some of the topics are available; the rest are videos only.

  • @chandangupta1989
    @chandangupta1989 1 year ago

    What does "first-class object" mean?

    • @DataEngineering
      @DataEngineering  1 year ago

      It means these objects can be manipulated using SQL commands; they are securable objects, and different privileges can be granted on them.
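As a sketch of streams behaving as first-class, securable objects (the stream, table and role names here are illustrative):

```sql
-- Created, granted, listed and dropped like any other schema object
CREATE OR REPLACE STREAM orders_stream ON TABLE orders;
GRANT SELECT ON STREAM orders_stream TO ROLE analyst_role;
SHOW STREAMS LIKE 'orders_stream';
DROP STREAM orders_stream;
```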

  • @anguramanathan7937
    @anguramanathan7937 2 years ago

    So the remaining 5 chapters are in the making, is it?

  • @soundaryas5907
    @soundaryas5907 2 years ago

    What are stale and type in the stream result table?

    • @DataEngineering
      @DataEngineering  2 years ago

      Could you elaborate on your question?
      Do you mean stale? Streams are built on top of time travel, and if the stream (CDC) is not consumed in time, the changed data is no longer available.
      I did not understand the 2nd part of the question about type.. do you mean the stream type? They are all explained in the video.

  • @saiyadav5014
    @saiyadav5014 2 years ago

    It's showing stale_after = 14 days (default).
    Can we change or set the stream's stale_after to any particular value, for example stale_after = 1 day or stale_after = 2 years?

    • @DataEngineering
      @DataEngineering  2 years ago

      I am not sure if Snowflake has added a parameter to set this value.. 14 days is the default given by Snowflake..

    • @saiyadav5014
      @saiyadav5014 2 years ago

      @@DataEngineering Thanks for your information.
      These tutorials are too good, and they are helping me a lot in my project.
      Thanks once again.

    • @DataEngineering
      @DataEngineering  2 years ago

      @@saiyadav5014 Thank you so much

    • @jimitmehta5985
      @jimitmehta5985 1 year ago

      @@DataEngineering I had a question over here... What is the use of the data retention period if time travel does not work on streams?
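On the stale_after question in this thread: to my knowledge, stale_after is not set directly; it is derived from the source table's MAX_DATA_EXTENSION_TIME_IN_DAYS parameter (default 14, maximum 90 days), which caps how far Snowflake extends the table's retention to keep unconsumed streams readable. A hedged sketch with illustrative names:

```sql
-- Allow streams on this table to stay fresh for up to 30 days
ALTER TABLE test_table SET MAX_DATA_EXTENSION_TIME_IN_DAYS = 30;

-- The STALE_AFTER column in the output reflects the new window
SHOW STREAMS LIKE 'test_stream';
```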

  • @vinaykumarpatnana
    @vinaykumarpatnana 1 year ago

    Where is the stream storage cost, sir?

    • @DataEngineering
      @DataEngineering  1 year ago

      A stream is not a separate storage object; the existing table gets 3 additional hidden metadata columns, so the cost of a stream is essentially the cost of the table itself.

  • @saitejaaa1086
    @saitejaaa1086 2 years ago

    The link is unsecured. Can you share another link, please?

  • @nadeem4222
    @nadeem4222 2 years ago

    I don't have permission to access the SQL scripts. Why?