hey man! awesome tutorial! I see that in your previous video, when creating a table for a parquet file to be loaded, you used STAGE_FILE_FORMAT before creating the table, while in this tutorial you used only FILE_FORMAT. What's the difference between a STAGE_FILE_FORMAT and just a FILE_FORMAT?
Hi @data engineering simplified, will you please provide a link to access the SQL scripts and the files you have uploaded to stages, and show how to create warehouses with nodes and clusters?
Thank you so much for the wonderful explanations about Snowflake. I have a question on querying semi-structured data like Parquet, which you mentioned in chapter 9. How do we query it if the data has more than 50 key-value pairs? In the video we have very few, so it is easy to query. Please answer my question.
Thank you very much for such a detailed explanation. It's really good. Please let us know where we can find the notes that you made in the worksheets? I couldn't find them for Chapter 9.
Thank you 🙏 for watching my video and your word of appreciation really means a lot to me. ⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡ I have already published other knowledge series and snowflake certification videos and if you are interested, you can refer them. 🌐 Snowflake Complete Guide Playlist ➥ bit.ly/3iNTVGI 🌐 SnowPro Guide ➥ bit.ly/35S7Rcb 🌐 Snowflake SQL Series Playlist ➥ bit.ly/3AH6kCq 🌐 SnowPro Question Dump (300 questions) ➥ bit.ly/2ZLQm9E ⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
Thank you for the wonderful tutorials. Can you please provide the "toppertips" link for the queries you have executed in these videos? I am not able to find them.
@Data Engineering Simplified Extremely useful videos for Snowflake. Thanks a lot for sharing your knowledge on this platform. Is it possible to share the deck and the worksheets that you are running in the videos? I tried to search for those on toppertips, but did not find them.
Thank you for the great content. Could you please tell me what privilege we need from the admin to create integration and stage objects, and provide the SQL statement for both? I tried ''grant create integration on account to role
Thank you 🙏 for watching my video @Dinavahi Kalyan and your word of appreciation really means a lot to me. For the integration object, the user must have the ACCOUNTADMIN role. If you can share what error you are getting, that would help me understand the issue.
At 42:08 the stage that is created (my_stg) is an internal stage, right? Because if a URL is not provided it's an internal stage. Also, PUT cannot be used with external stages. I wanted to know why it is described as an external stage?
Always welcome. I am trying to find a way so I can be reachable, for now comments are the only way to share your views. Thank you 🙏 for watching my video @Dinesh B and your word of appreciation really means a lot to me. ⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡ I have already published other knowledge series and snowflake certification videos and if you are interested, you can refer them. 🌐 Snowflake Complete Guide Playlist ➥ bit.ly/3iNTVGI 🌐 SnowPro Guide ➥ bit.ly/35S7Rcb 🌐 Snowflake SQL Series Playlist ➥ bit.ly/3AH6kCq 🌐 SnowPro Question Dump (300 questions) ➥ bit.ly/2ZLQm9E ⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
Hi.. they are rich media used for creating the video content.. it is not a ppt or pdf that can simply be shared. I try to share the code via my website... this chapter is not yet published on my blog.. will do it soon.
I assume you must be talking about the SnowSQL CLI... it is a CLI that helps you load data from your local PC to Snowflake. You can now also do that using Snowsight, for a limited set of files.
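As a concrete sketch of that SnowSQL upload (stage name and local file path are hypothetical):

```sql
-- Sketch of a SnowSQL session: my_int_stage and the path are made-up names.
-- PUT only works from a client such as SnowSQL, not from the classic web UI.
CREATE STAGE IF NOT EXISTS my_int_stage;
PUT file:///tmp/customer.csv @my_int_stage AUTO_COMPRESS=TRUE;
LIST @my_int_stage;  -- verify the compressed file arrived
```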
Hi sir, I have one question: I have a file with 20 columns and a Snowflake table with 21 columns, where the last column is created_date (timestamp). We need to copy all the file columns and also put the current timestamp into the created_date column. How do we do it? If we had fewer columns we could do this:
Copy into table (col1,col2,col3) from (select $1,$2,to_timestamp(current_timestamp) from @stage/emp.txt) File_format = (type=csv)
But listing all 20 $-sign columns in the select clause will be difficult. Is there any way to handle these situations?
yes.. there is no other alternative; you have to list them manually. (I am working on a tool that can automate generating this code using Snowpark, and it will be available on Udemy. Connect with me over Instagram or Facebook and you will be notified.) instagram.com/learn_dataengineering/ facebook.com/groups/627874916138090/
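For reference, a sketch of how that manual listing looks, with hypothetical stage/table/column names; the elided middle columns still have to be typed out one by one:

```sql
-- Sketch only: emp_stage, emp_target and the column names are hypothetical.
-- Every file column still has to be written out as $1 .. $20 in the SELECT;
-- only the extra created_date column gets CURRENT_TIMESTAMP.
COPY INTO emp_target (col1, col2, col3, /* ... col4 through col19 ... */ col20, created_date)
FROM (
    SELECT $1, $2, $3, /* ... $4 through $19 ... */ $20, CURRENT_TIMESTAMP
    FROM @emp_stage/emp.txt
)
FILE_FORMAT = (TYPE = CSV);
```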
My question may be too basic: to copy data to a target table stage in Snowflake, the syntax seems to be table name then schema name, whereas in other DBs we generally follow schema name then table name. Please clarify.
Hi, I have a query. I have a requirement where a new file arrives on my local system every day at a specific location. I have to create an internal stage on Snowflake and execute a PUT command in SnowSQL to upload this file's data into a Snowflake table. I want to orchestrate this daily. How do I achieve this?
you can write one SQL script and run it using the snowsql -f option, then schedule it using the Windows Task Scheduler or a Linux cron job from your local machine
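A minimal sketch of that setup, with hypothetical file, stage, table, and connection names:

```sql
-- daily_load.sql (hypothetical name): runs via SnowSQL, not the web worksheet.
PUT file://C:/data/daily_feed.csv @my_int_stage AUTO_COMPRESS=TRUE OVERWRITE=TRUE;

COPY INTO my_table
FROM @my_int_stage/daily_feed.csv.gz
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
PURGE = TRUE;  -- optionally delete the staged file after a successful load

-- Scheduled from the local machine, e.g. a Linux cron entry:
--   0 6 * * * snowsql -c my_conn -f /path/to/daily_load.sql
```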
Yes... it is not necessary to have a file format when you create a stage; the default is CSV. So if it is a CSV file, you can query the stage files without a file format.
To load data from a legacy system, it first has to be moved to an S3 (or other cloud storage) location and then the COPY command executed. You can also use tools like Fivetran, StreamSets, Matillion, Qlik etc... they are data-ingestion-as-a-service tools with CDC features enabled
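As a hedged sketch of the cloud-storage-then-COPY route (the bucket, integration, stage, and table names are all hypothetical, and the storage integration must already have been created by an admin):

```sql
-- Sketch: external stage over an S3 export of the legacy database.
CREATE OR REPLACE STAGE legacy_ext_stg
  URL = 's3://my-bucket/legacy-export/'
  STORAGE_INTEGRATION = my_s3_int
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Bulk-load every CSV file found under that location.
COPY INTO legacy_table
FROM @legacy_ext_stg
PATTERN = '.*[.]csv';
```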
Thank you so much for creating this Snowflake tutorial. It's extremely useful and stands out from the rest of the content on YouTube. Could you please advise on how to troubleshoot errors that arise from a format mismatch while loading data? For example, in the case of numeric data types, instead of 98765 we have 987lm. You could also suggest any tutorial videos that have addressed this issue.
How do I find the scripts and data on your website? I'm looking for the SQL scripts and data used in this tutorial (Fast Data Loading & Bulk Ingestion in Snowflake | Chapter-9) @DataEngineering
Yes, you can run them in the free trial edition. For a few chapters you may need S3 as well as a Python install, but those are just a few. ----- You can download this summary sheet in PDF version that has details for all the chapters in one single file. Snowflake End To End Guide Cheat Sheet - bit.ly/43t5IPB
If we have multiple files in one of our stages and we want to load a particular file into a table, or we want to query that specific file, how do we do that?
yes, you can do it.. you can point directly to that file:
copy into landing_item_cpy from @s3_location/folder-1/folder-2/my-file.csv file_format = (type=csv COMPRESSION=none);
I would also suggest watching the External Table chapter ua-cam.com/video/w9BQsOlJc5s/v-deo.html
Yes, it can be done using the external table approach: run the COPY command via a task, and do a full load from the external table (as a select statement) or from an external stage using the COPY command.
I want a solution for partial data loads. For example, suppose we have 10 records for the target and 3 of them have bad data (data type mismatch etc.). In that case the COPY INTO statement fails for all records, but I want to continue the load with the good data. Please help me with a solution for this type of scenario.
there is an option you can set when you define the COPY command... bad data will be skipped and only good data will be loaded. You can watch ch-19 (ua-cam.com/video/9FejjGVZrPg/v-deo.html) which shows how bad data is tracked.
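The option referred to is ON_ERROR; a sketch with hypothetical stage and table names:

```sql
-- Sketch: ON_ERROR = 'CONTINUE' skips the bad rows and loads the rest,
-- instead of failing the whole COPY.
COPY INTO my_table
FROM @my_stage/data.csv
FILE_FORMAT = (TYPE = CSV)
ON_ERROR = 'CONTINUE';

-- The rows rejected by the last load on this table can then be inspected:
SELECT * FROM TABLE(VALIDATE(my_table, JOB_ID => '_last'));
```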
Thank you 🙏 for watching my video and your word of appreciation really means a lot to me. Let me check and will soon update about the link. ⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡ I have already published other knowledge series and snowflake certification videos and if you are interested, you can refer them. 🌐 Snowflake Complete Guide Playlist ➥ bit.ly/3iNTVGI 🌐 SnowPro Guide ➥ bit.ly/35S7Rcb 🌐 Snowflake SQL Series Playlist ➥ bit.ly/3AH6kCq 🌐 SnowPro Question Dump (300 questions) ➥ bit.ly/2ZLQm9E ⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
I have a question about the SNOWFLAKE.ACCOUNT_USAGE history tables. I followed the process described in the video, but I can't see any trace in the copy or load history tables. Video step: 31:30. Do you have any idea why? I tried using the UI and also using SnowSQL, same result: no copy or load history rows. Thank you 👀 a) create table stage (UI) b) put my file to the table stage (SnowSQL) c) copy the file to the table (UI or SnowSQL) d) check: the table is OK (1000 rows) but there is no trace in SNOWFLAKE.ACCOUNT_USAGE
If you check the account usage view right after the action, the load will not appear there; it may take a few minutes to a few hours. It is a view which gets loaded after some time. Try it again later and you will see the result.
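For quicker feedback while testing, one alternative (table name hypothetical) is the INFORMATION_SCHEMA table function, which reflects loads with much lower latency than the ACCOUNT_USAGE views:

```sql
-- COPY_HISTORY from INFORMATION_SCHEMA shows recent loads almost immediately,
-- unlike ACCOUNT_USAGE views which can lag. MY_TABLE is a placeholder name.
SELECT file_name, status, row_count, last_load_time
FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
    TABLE_NAME => 'MY_TABLE',
    START_TIME => DATEADD(hour, -1, CURRENT_TIMESTAMP())));
```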
I am trying to list the stages for USER and TABLE. I used the complete qualification like database.schema.table but it doesn't give me the result. And for listing the user stage, list @~ doesn't return any result. Need your feedback on this please. Thanks
@@DataEngineering Yes you are right, but I cannot see the table stage; it gives the error "MISSING STAGE NAME IN URL @%". For the user stage I got it, since I didn't have any data before, but now I can view the files. Thanks
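That error usually means the table name is missing after @%. A short sketch of the stage-listing syntax (table/database names hypothetical):

```sql
-- User stage: no qualifier needed.
LIST @~;

-- Table stage: the table name must follow @%
-- (this is what the "missing stage name" error points at).
LIST @%my_table;

-- From outside the table's schema, qualify the namespace before the %:
LIST @mydb.myschema.%my_table;
```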
I am a BI Engineer and trying to expand my portfolio to Data Engineering as well! Man, no paid courses give such realistic demos and explanations in real-time scenarios. Thank You for teaching us all free. This course is amazing and wonderfully curated.
Glad it was helpful!
I do have paid courses that has lot of content to practice.... you can try them
www.udemy.com/user/data-engineering-simplified/
One of the best videos in recent times with perfect voice over, screen presence and sharp one liners .... kudos
Glad you liked it!
I am learning so much from your videos. Thankyou so much for taking out time and making an effort for all these videos which are helping so many people like me. Thankyou Again
You are so welcome!
Very well explained. I will surely watch all the videos. I have joined a paid class for snowflake but this is far better. thank you
Hi Thanks for all your hardwork and I cleared my certification and I must say your videos helped a lot . Appreciate all the good work 👍
Great job, and congratulations on your success.
Thank you 🙏 for watching my video and your word of appreciation really means a lot to me.
@Lucas, share details if possible regarding your prep, and how the question coverage compares to the 300-question bank of this channel.
Very good performance demonstration at the end. Great knowledge. Lot of concepts explained in this video. no words to say!.... keep up your good work. Feeling happy that I found this playlist to learn Snowflake.
Thanks for all your knowledge sharing.
Thanks a ton
More interesting videos are on the way... and I am happy that you liked and paid attention to my content. Happy learning.
Quick and well explained in a way that you will get an overview of the stages.
Glad you liked it..
Thank you for preparing these videos, they are really very helpful. Well explained and you have covered all the topics in snowflake. Appreciate your work Sir. Thank you again
super clarity on every topic
@Ramu Vunnam , thank you 🙏 for watching my video and your word of appreciation really means a lot to me.
⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
I have already published other knowledge series and snowflake certification videos and if you are interested, you can refer them.
🌐 Snowflake Complete Guide Playlist ➥ bit.ly/3iNTVGI
🌐 SnowPro Guide ➥ bit.ly/35S7Rcb
🌐 Snowflake SQL Series Playlist ➥ bit.ly/3AH6kCq
🌐 SnowPro Question Dump (300 questions) ➥ bit.ly/2ZLQm9E
⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
thank you very much for sharing all this information and the best for free for us
It's my pleasure
Sir, I am very happy that these videos are available on YouTube for free to all. Your efforts are very much appreciated, and thank you so much for the very extensive knowledge sharing. 👏👏👏 My request is: please share the slides that you prepared for the demonstration. It would really help us keep notes for each chapter. Again, thank you so much for your effort and your way of explaining.
So nice of you.. will explore the best possible way to share the deck.. it is not a ppt, and the files are very heavy and big..
Thank you for putting all this valuable information in one place. I am a novice user and I am searching for Disaster Recovery options in Snowflake. It would be great if you could make a video about the same. Thanks again for your 'good samaritan' attitude.
Thank you 🙏 for watching my video and your word of appreciation really means a lot to me.
You would like to know how Disaster Recovery works in Snowflake? Or how DR is designed in Snowflake? DR comes under continuous data protection: Snowflake can replicate all your data to another cloud region, and that is done primarily to ensure data availability for business-critical applications; otherwise you never lose data in Snowflake, so technically you don't need DR. Please share your view and then I can design the content and publish it.
@@DataEngineering Thanks for the reply, and sorry for the delay in responding as I missed the notification :(. Well, you already answered part of my question in a few words :). If you could explain 'how Disaster Recovery works in Snowflake' it would be great. I am from the Oracle world, and we use switchover/failover procedures in DR scenarios to switch the DB roles.
Nice video and thank you for creating videos on snowflake
Glad you like them! and thank you again.
Really did a superb job for all tutorial..!!
Thanks a ton
What a great series of videos on Snowflake! Really hats-off to the way you are presenting without even wasting a single second and lots and lots of info. Probably we may not even get this much info on the paid courses. Great work.
One small thing: where can I find the queries and commands that you are showing in your worksheets? They are really helpful for us to practice and verify our practice. Again, they're awesome videos!
Glad you like them Jyothi.
Thank you 🙏 for watching my video and your word of appreciation really means a lot to me.
⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
I have already published other knowledge series and snowflake certification videos and if you are interested, you can refer them.
🌐 Snowflake Complete Guide Playlist ➥ bit.ly/3iNTVGI
🌐 SnowPro Guide ➥ bit.ly/35S7Rcb
🌐 Snowflake SQL Series Playlist ➥ bit.ly/3AH6kCq
⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
My website is having some issue, will fix it soon so the SQL contents will be available there.
I like your presentation; it makes the topic you introduce very clear and easy to follow.
Thank you @Fiona Fu 🙏 for watching my video and your word of appreciation really means a lot to me.
⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
I have already published other knowledge series and snowflake certification videos and if you are interested, you can refer them.
🌐 Snowflake Complete Guide Playlist ➥ bit.ly/3iNTVGI
🌐 SnowPro Guide ➥ bit.ly/35S7Rcb
🌐 Snowflake SQL Series Playlist ➥ bit.ly/3AH6kCq
🌐 SnowPro Question Dump (300 questions) ➥ bit.ly/2ZLQm9E
⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
Personally I feel the way of explanation is awesome!
Glad you liked my content.
Thank You So much for this information. Explained very nicely Brother!!
Glad it was helpful!
Thank you 🙏 for watching my video and your word of appreciation really means a lot to me.
⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
I have already published other knowledge series and snowflake certification videos and if you are interested, you can refer them.
🌐 Snowflake Complete Guide Playlist ➥ bit.ly/3iNTVGI
🌐 SnowPro Guide ➥ bit.ly/35S7Rcb
🌐 Snowflake SQL Series Playlist ➥ bit.ly/3AH6kCq
🌐 SnowPro Question Dump (300 questions) ➥ bit.ly/2ZLQm9E
⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
THANKS for the excellent study material. I don't see consolidated material like this anywhere on the web 👋👋👋
You are welcome!
Salute to your passion and hard work on such a lengthy video with no time wasted at all. Is your course on Udemy?
Thank you 🙏 for watching my video @Salman Sayyad and your word of appreciation really means a lot to me.
I have not yet thought about Udemy. I am able to reach my audience via YouTube and I will stick with it for some more time.. maybe in the future I'll publish some other way.. thanks for your feedback again..
Good one. One of the best on the web.
Thank you 🙏 for watching my video @Satchidananda Tripathy and your word of appreciation really means a lot to me.
⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
I have already published other knowledge series and snowflake certification videos and if you are interested, you can refer them.
🌐 Snowflake Complete Guide Playlist ➥ bit.ly/3iNTVGI
🌐 SnowPro Guide ➥ bit.ly/35S7Rcb
🌐 Snowflake SQL Series Playlist ➥ bit.ly/3AH6kCq
🌐 SnowPro Question Dump (300 questions) ➥ bit.ly/2ZLQm9E
⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
great video! Thanks
Glad you liked it!
---
and yes, I know many of us are not fully aware of the Snowpark Python API. If you want to manage Snowflake more programmatically, you can watch my paid content (data + code available). Many folks don't know the power of Snowpark; these 2 videos will help you broaden your knowledge.
This content is available on Udemy (one for JSON and one for CSV). It can automatically create DDL and DML and also run the COPY command:
1. www.udemy.com/course/snowpark-python-ingest-json-data-automatically-in-snowflake/
2. www.udemy.com/course/automatic-data-ingestion-using-snowflake-snowpark-python-api/
This is a remarkable compilation. Could you please point to the web link where we can get a copy of the SQL statements? The webpage shared does not have content sorted by the playlist videos. Thanks and much appreciated!!
Here is the link for the website www.toppertips.com/
Finished watching
thanks
Great videos. As a newbie to snowflake , I'm finding it extremely useful . Thanks 🤍
Glad it was helpful! Keep following.. many more interesting videos are on their way..
This is very good and informative. Thanks for putting in the effort, really appreciated.
I am still trying to understand how it is so fast, loading millions of rows from files into a table in seconds. What does the tool do internally to achieve that?
Also, if you increase the nodes it becomes faster, but in real life your data always has constraints, indexes, triggers, primary/composite keys, and partitions, so what happens in that case? In HDFS and Ab Initio MFS, the data in the file gets distributed and then each node (CPU + main memory) works only on its part of the data to go faster. What happens in Snowflake, and what types of loads does it support?
Maybe I have to go through all the videos, but I need to do it practically to understand.
Thanks again, it's really great content to start with
Thank you for the series of informative videos on various Snowflake topics. I could carry out a lot of important implementations in my project.
I have one question on bulk data load: here you have explained how we can copy or load data into an internal named or unnamed stage from a file stored on local disk, but in an actual scenario, bulk data has to be loaded from a database, either on premise or on another cloud. In this scenario, how do we copy data to an internal stage?
The way you present knowledge is highly appreciated, brother.
Thank you so much for this knowledge transfer.
One small request: could you please provide the SQL statements that were used to explain this chapter? It would be a great help
Sure I will
and yes, if you want to manage Snowflake more programmatically, you can watch my paid content. Many folks don't know the power of Snowpark; these 2 videos will help you broaden your knowledge.
This content is available at a discounted price for a limited time (one for JSON and one for CSV). It can automatically create DDL and DML and also run the COPY command:
1. www.udemy.com/course/snowpark-python-ingest-json-data-automatically-in-snowflake/?couponCode=DIWALI50
2. www.udemy.com/course/automatic-data-ingestion-using-snowflake-snowpark-python-api/?couponCode=DIPAWALI35
Hello. Always great work and really well done videos. Thank you! 🥇
Remark: isn't there an error in the PUT parquet file command?
Video time: 26:21
The table created is "customer_parquet_ff" but the table stage used in the PUT command is "customer_parquet". They don't match, am I right?
Thank you
🤟
you might be right.. sometimes I cut video parts that are not meaningful or end with errors. I try my best to keep everything in sync, but sometimes that kind of mistake happens.
Thanks for pointing it out.. will pay extra attention from next time on.
Great video! could you please share or add the SQL script used in the description
Really good explanation. It would be really helpful if you could provide slides and worksheets links as well.
What you see is not slides; it is rich content and not easy to share. If you need something more accessible, you can look into my Udemy courses (though limited at this stage, I will publish more content with complete reference material).
This content is available at a discounted price for a limited time (one for JSON and one for CSV). It can automatically create DDL and DML and also run the COPY command:
1. www.udemy.com/course/snowpark-python-ingest-json-data-automatically-in-snowflake/?couponCode=SPECIAL50
2. www.udemy.com/course/automatic-data-ingestion-using-snowflake-snowpark-python-api/?couponCode=SPECIAL35
Great videos. How can I imitate a production data migration in training scenarios? Do you have a video on common migration issues and how to resolve them?
if you really want a migration-related video.. I have only one
ua-cam.com/video/ahJWrD3FFx0/v-deo.html
At 12:54 & 25:15 you mentioned that a table stage cannot be associated with a file format like CSV/Parquet. But later at 25:48 you used STAGE_FILE_FORMAT = (TYPE = PARQUET) when creating a table. There you are indirectly associating a file format with a stage. Is that not contradicting? Thanks a lot for the videos!
You are right.. let me check this video once again.. thanks for your note.
While copying into a table from a stage, should the columns be in the same serial order in both the CSV file and the defined table structure, or not?
Thank you so much
glad you liked the content..
Thank you very much for the clear-cut session on data loading.
I have one question: can we update the data in the external stage before loading it into the snowflake table? How do we update parquet file data in an external table?
Yes, you can. You need to have parquet as the file format associated with the stage or external table. It is covered in ch-11 ua-cam.com/video/w9BQsOlJc5s/v-deo.html
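For reference, a minimal sketch of attaching a Parquet file format to a stage and querying it; the format, stage, table, and column names here are illustrative, not from the video:

```sql
-- create a named parquet file format (name is hypothetical)
create file format my_parquet_ff type = parquet;

-- attach it to an internal named stage
create stage my_parquet_stg file_format = my_parquet_ff;

-- staged parquet rows come back as one variant column ($1),
-- so individual fields are pulled out with path notation
select $1:customer_id::number, $1:customer_name::varchar
from @my_parquet_stg/customer.parquet;
```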
hey man! awesome tutorial! I see that in your previous video, when creating a table for a parquet file to be loaded, you used STAGE_FILE_FORMAT before creating the table, and in this tutorial you used only FILE_FORMAT. What's the difference between a STAGE_FILE_FORMAT and just a FILE_FORMAT?
great content! how do I get these files for practice, please?
Awesome Material, from where can we download the data file and code used in this chapter, please share the link.
Hi @data engineering simplified, will you please provide a link to access the SQL scripts and the files which you uploaded to the stages, and show how to create warehouses with nodes and clusters?
Hi, how can I get the datafiles? Especially those having semi structured data.
Thank you so much for the wonderful explanations about snowflake. I have a question on querying semi-structured data like parquet, which you mentioned in chapter 9. How do we query when there are more than 50 key-value pairs? In the video we have very few, so that is easy to query. Please answer my question.
Working on them and will soon make them part of the video description.
Snowflake uses an AWS S3 bucket for all its processing, so the User Stage is allocated in an S3 bucket, but the location is hidden, somewhere in the "real" cloud.
yes..you are right..
Amazing videos, very well explained! Can we get the scripts for all the commands executed during the videos?
Yes, it will be uploaded soon; a bit busy now.. will do it.
Thank you very much for such a detailed explanation. It's really good.
Please let us know where can we find the notes that you made under the worksheets? I couldn't find for Chapter-9?
Thanks Sunil,
I will upload them soon via my website..
great content!
Thank you 🙏 for watching my video and your word of appreciation really means a lot to me.
⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
I have already published other knowledge series and snowflake certification videos and if you are interested, you can refer them.
🌐 Snowflake Complete Guide Playlist ➥ bit.ly/3iNTVGI
🌐 SnowPro Guide ➥ bit.ly/35S7Rcb
🌐 Snowflake SQL Series Playlist ➥ bit.ly/3AH6kCq
🌐 SnowPro Question Dump (300 questions) ➥ bit.ly/2ZLQm9E
⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
Thank you for the wonderful tutorials. Can you please provide the "toppertips" link for the queries you executed in this video? I am not able to find them.
let me check and make it available for you; having some issues with my website.
@Data Engineering Simplified Extremely useful videos for snowflake. Thanks a lot for sharing your knowledge on this platform. Is it possible to share the deck and the worksheet which you are running on the videos? I tried to search those on toppertips, but did not find those..
My site had some issue.. will release it soon (probably later this week) and will let you know the URL
At 29:20 you said that if a file format is attached to the table, it will not work... then why did we attach a file format to the parquet table we created? Please can you re-explain 29:20?
Thank you for the great content. Can you please help me with what privilege we need from the admin to create integration and staging objects? Can you please provide the SQL statements for both? I tried ''grant create integration on account to role
Thank you 🙏 for watching my video @Dinavahi Kalyan and your word of appreciation really means a lot to me.
For the integration object, the user must have the accountadmin role. If you can share what error you are getting, that would help me understand the issue.
Could you please share resources that you have used in videos?
At 42:08 the stage that is created (my_stg) is an internal stage, right? Because if a URL is not provided, it is an internal stage... also, PUT cannot be used with external stages... I wanted to know why it is described as an external stage?
you are right... I need to check if I made a mistake stating that it is an external stage..
Thanks for your note..
Thank you very much, most awaited chapter... is there any way to talk to you?
Always welcome.
I am trying to find a way to be reachable; for now, comments are the only way to share your views.
Thank you 🙏 for watching my video @Dinesh B and your word of appreciation really means a lot to me.
is it possible to download the demo files which are used in demo?
Hi.. they are rich media created for the video content.. not a ppt or pdf which I can just share. I try to share the code via my website... this chapter is not yet published on my blog.. will do it soon.
Can't thank enough.
you are welcome Prasad.
Sir, these videos are awesome, please let me know where I can get the notes of this chapter.
will publish it soon...
Is there any option to limit the no. of records loaded into the table from fe
I have a question about that blue screen with codes, what is that?
thank you
I assume you must be talking about the SnowSQL CLI... it is a CLI that helps you load data from your local PC to snowflake.. you can also do that now using Snowsight, for limited file sizes...
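For anyone new to the SnowSQL CLI mentioned above, a minimal sketch of the PUT-then-COPY flow from a local PC; the file, stage, and table names are illustrative:

```sql
-- run from the SnowSQL CLI: upload a local file to an internal named stage
put file:///tmp/customer.csv @my_stg auto_compress=true;

-- then load the (auto-gzipped) staged file into the target table
copy into customer
from @my_stg/customer.csv.gz
file_format = (type = csv skip_header = 1);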
When should we go for user staging and when for table staging?
Which location does internal named stage point to?
Hi sir, I have one question - I have a file with 20 columns and a snowflake table with 21 columns, where the last column is created_date (timestamp). We need to copy all the file columns and also pass the current timestamp into the created_date column. How will we do it?
If we had fewer columns we could do this:
Copy into table (col1,col2,col3) from (select $1,$2,current_timestamp() from @stage/emp.txt)
File_format = (type=csv)
But listing all 20 $-sign columns in the select clause will be difficult.. is there any way to handle these situations?
yes.. there is no other alternative... you have to do it manually...
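For completeness, the manual pattern being discussed would look like this; the table, stage, and column names are illustrative, and the middle columns are abbreviated in comments:

```sql
-- list all 20 positional columns explicitly, then append the load timestamp
copy into emp (col1, col2, /* ... col3 through col19 ... */ col20, created_date)
from (
  select $1, $2, /* ... $3 through $19 ... */ $20, current_timestamp()
  from @my_stg/emp.txt
)
file_format = (type = csv);
```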
(I am working on a source that can automate this code using snowpark and that will be available in udemy.. connect me over instagram or facebook.. and you will be notified)
instagram.com/learn_dataengineering/
facebook.com/groups/627874916138090/
My question may be too basic. To copy data in snowflake, the syntax while copying data to the target table stage seems to be table_name\schema_name, while generally in other DBs we follow schema_name\table_name. Please clarify.
In snowflake, you have to follow db_name.schema_name.table_name.
I suggest watching my context function series ua-cam.com/video/VkIRKGHdg6c/v-deo.html
Hi
I've a query. I have a requirement where a new file arrives on my local system every day at a specific location. I have to create an internal stage in Snowflake and execute a PUT command in snowsql to upload this file's data into a snowflake table. I want to orchestrate this daily. How do I achieve this?
you can write one SQL script and run it using the snowsql -f option, then schedule it using Windows Task Scheduler or a Linux cron job from your local machine
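A minimal sketch of that setup, assuming snowsql is installed and a named connection has been configured; the file paths, stage, table, and connection name are all hypothetical:

```sql
-- daily_load.sql : run non-interactively with  snowsql -c my_conn -f daily_load.sql
put file:///data/incoming/daily_file.csv @my_stg overwrite=true;

copy into daily_table
from @my_stg/daily_file.csv.gz
file_format = (type = csv skip_header = 1);
```

A Linux cron entry such as `0 6 * * * snowsql -c my_conn -f /scripts/daily_load.sql` (or an equivalent Windows Task Scheduler job) would then run it daily.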
Very nice videos; can you please provide me the PDF file?
So if there is no file format associated with an internal stage, can we not store semi-structured data?
It is not necessary to have a file format when you create a stage.. the default is CSV... so if there is a CSV file, you can query the stage files even without a file format.
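As a quick illustration of the default-CSV behaviour described above; the stage and file names are illustrative:

```sql
-- no file format attached to the stage: CSV is assumed by default,
-- and columns are addressed positionally as $1, $2, ...
select $1, $2, $3
from @my_stg/customer.csv
limit 10;
```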
If my business scenario is to copy data from another data warehouse, like Teradata, to snowflake, what should the process be?
To load data from a legacy system, it first has to be moved to an S3 (or other cloud storage) location, and then the copy command is executed.
You can also use tools like Fivetran, StreamSets, Matillion, Qlik etc... they are data-ingestion-as-a-service offerings with CDC features enabled.
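A rough sketch of the two-step flow described above, assuming the Teradata export has already landed in S3; the bucket, credential placeholders, and object names are all hypothetical:

```sql
-- external stage pointing at the S3 drop location
create stage td_export_stg
  url = 's3://my-bucket/teradata-export/'
  credentials = (aws_key_id = '***' aws_secret_key = '***');

-- bulk load the exported files into the target table
copy into customer
from @td_export_stg
file_format = (type = csv field_optionally_enclosed_by = '"');
```

In practice a storage integration is preferred over inline credentials, which is part of what the integration-object discussion elsewhere in this thread covers.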
Why is the toppertips website showing "under maintenance"?
it is fixed now and you should be able to access it.
Hi Sir, How can i move data from snowflake in aws to snowflake in azure? Is this feature available in snowflake?
yes, you can move it.. it is done via replication (cross-cloud data sharing also relies on replication)...
@@DataEngineering Thank you sir
Hi, can you please provide these three datasets?
Thank you so much for creating this Snowflake tutorial. It's extremely useful and stands out from the rest of the content on UA-cam.
Could you please advise on how to troubleshoot errors that arise from a format mismatch while loading data?
For example, in the case of numeric data types, instead of 98765 we have 987lm.
You could also suggest any tutorial videos that have addressed this issue.
thanks
To copy the data, do we need to go through the SnowSQL CLI only? Can't we do it using the WebUI?
you can upload only a 100MB data file via the WebUI; if you have to load more than that, you have to use the stage process.
@@DataEngineering okay thanks for your response.
Hi, can we get these queries so that we can practice?
I have not yet published them.. will see if I can make them public via my website or by other means.
Can you show the bulk upload in the new snowflake web interface?
You can watch this playlist.. it covers the entire loading process in snowflake
ua-cam.com/play/PLba2xJ7yxHB6NPEv8pp7j3zWibLZzwjvO.html
How do I find the scripts and data on your website? I'm looking for the SQL scripts and data used in this tutorial (Fast Data Loading & Bulk Ingestion in Snowflake | Chapter-9)
@DataEngineering
Sir, can we cover all of this tutorial on the snowflake free trial?
Yes, you can run them in the free trial edition.. for a few chapters you may need S3 as well as a Python install.. but they are just a few..
-----
You can download this summary sheet in PDF version that has detail for all the chapters in one single file.
Snowflake End To End Guide Cheat Sheet - bit.ly/43t5IPB
please add the CSV and parquet files as well for practice
If we have multiple files in one of our stages and we want to load a particular file into a table, or query that specific file, how do we do that?
yes, you can do it.. you can point directly to that file:
copy into landing_item_cpy
from @s3_location/folder-1/folder-2/my-file.csv
file_format = (type=csv COMPRESSION=none);
I would also suggest to watch External Table chapter ua-cam.com/video/w9BQsOlJc5s/v-deo.html
@@DataEngineering Thanks for the reply
Thank you for the detailed information. We can automate continuous data loads with Snowpipe, but how can we automate bulk loads?
Thanks in advance.
Yes, it can be done using the external table approach: run the copy command via a task, or use a full-load external table (as a select statement), or an external stage with the copy command.
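A sketch of the task-driven option mentioned above; the warehouse, task, table, and stage names are hypothetical:

```sql
-- run the bulk copy every hour via a scheduled task
create task hourly_bulk_load
  warehouse = load_wh
  schedule = '60 minute'
as
  copy into landing_item
  from @my_ext_stg
  file_format = (type = csv);

-- tasks are created in a suspended state, so enable it explicitly
alter task hourly_bulk_load resume;
```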
I am unable to find the code for this series on your website.
hi, where can I find the parquet file shown in the video?
Let me check.. having some issues with my website.. will fix it and send you the link
How do we query, or check the data in, the worksheets that are available in the user stage?
The user stage starts with the @~ sign,
example: list @~;
Hi, where can I get the tree diagram? I am unable to find it in the link given.
I will try to publish it via my website.
I want a solution for partial data loads. For example, suppose we have 10 records for the target and 3 of them have bad data (e.g., a data type mismatch); in that case the COPY INTO statement fails for all records, but I want to continue the load with the good data. Please help me with a solution for this type of scenario.
there is an ON_ERROR option you can set when you define the copy command... bad data will be skipped and only the good data will be loaded.
You can watch chc-19 (ua-cam.com/video/9FejjGVZrPg/v-deo.html) which shows how bad data is tracked.
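To make the option above concrete, a minimal example; the table and stage names are illustrative:

```sql
-- skip bad rows and keep loading the good ones
copy into emp
from @my_stg/emp.csv
file_format = (type = csv)
on_error = 'CONTINUE';

-- afterwards, inspect the rows that were rejected by the last load
select * from table(validate(emp, job_id => '_last'));
```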
Does the name of the table stage have to be the same as the name of the table? Please answer.
yes, it is the same as the table name.
hey sir, nice content, but where can I find these codes, as your site is not working?
Thank you 🙏 for watching my video and your word of appreciation really means a lot to me. Let me check and will soon update about the link.
Where can we get all the files you have used?
will release the SQLs soon.. via my website..
How do I find it on your website? I'm looking for the SQL scripts and data used in this tutorial (Fast Data Loading & Bulk Ingestion in Snowflake | Chapter-9)
@DataEngineering
Where can we get the data files that you used?
Can you please provide the script file shown in this video?
Where can I get the codes of your videos?
It should be on the website; let me know if it is down and I will check if there is an issue.
@@DataEngineering Unable to find all the code explained in the videos on the site. Kindly update the site.
GitHub or toppertips? I didn’t find it
Where can I get the code from this video?
My site is having some issues, will fix it soon and then drop you a note.
@@DataEngineering okay, waiting for your response.
Brother, where can I get the CSV file?
Which CSV file do you need? Can you give me the time reference in the video?
I have a question on the SNOWFLAKE.ACCOUNT_USAGE history tables. I followed the process described in the video.. but I can't see any trace in the copy or load history tables. Video step: 31:30
Do you have any idea why?
I tried using the UI and also using snowsql, same result: no copy or load history line.
thank you
👀
a) create table stage (UI)
b) PUT my file to the table stage (snowsql)
c) copy file to table (UI or snowsql)
d) check: table is OK (1000 rows)
but no trace in SNOWFLAKE.ACCOUNT_USAGE
If you check the account usage views right after the action, the data will not appear there.. it may take a few minutes to a few hours. These are views which get loaded after some time.
Try again later and you will see the result.
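For reference, once the view has caught up, the load can be checked like this; the table name in the filter is illustrative:

```sql
-- ACCOUNT_USAGE views can lag by minutes to hours
select file_name, table_name, status, row_count
from snowflake.account_usage.copy_history
where table_name = 'CUSTOMER_PARQUET_FF'
order by last_load_time desc;
```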
Right: this morning the history is present! 👌
I am trying to list the stages for USER and TABLE. I used the fully qualified name like database.schema.table, but it doesn't give me that result. And for listing the user stage, list @~ doesn't provide any result. Need your feedback on this please. Thanks
do you have data inside the user stage and table stages? The user stage generally provides some output, and no special privileges are required.
@@DataEngineering Yes, you are right, but I cannot see the table stage; it gives the error "MISSING STAGE NAME IN URL @%". For the user stage, you were right: I didn't have any data before, but now I can view it. Thanks