Very good session again. Thanks for doing this. I was able to understand workloads, types of WH, etc. Great help!
Glad it was helpful!
I am amazed at the way you presented the virtual warehouse session. Thanks for the awesome presentation!
Excellent material and videos...Keep up the good work!!!
Crisp and clear explanation. Thanks for sharing
Glad it was helpful!
Thank you 🙏 for watching my video; your words of appreciation really mean a lot to me.
⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
I have already published other knowledge series and Snowflake certification videos; if you are interested, you can refer to them.
🌐 Snowflake Complete Guide Playlist ➥ bit.ly/3iNTVGI
🌐 SnowPro Guide ➥ bit.ly/35S7Rcb
🌐 Snowflake SQL Series Playlist ➥ bit.ly/3AH6kCq
🌐 SnowPro Question Dump (300 questions) ➥ bit.ly/2ZLQm9E
⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
Thanks for such a detailed explanation. In the multi-cluster VWH cost calculation example, we cannot consider Standard edition, as multi-cluster starts with Enterprise edition and above, I guess. Could you please clarify?
Great session. Every second of it is worth it. Do you have any plans to create tutorials for AWS? I am sure an AWS playlist would be much awaited by everyone.
this is just great!
thanks @Dirk...
Always been a fan of your videos. Can you please make a video on failover, if possible? The concept of failover is well explained in the Snowflake documentation, but there aren't any demonstrations available anywhere. I was trying to explore this feature, but it did not work out.
Thank you @Tanaya Mandal, it is on my list and I will create a video around it. Since it is a bit of a complex process and needs an organization setup, it may take some time for me.
However, I will write content around it and publish it on my website (www.toppertips.com). A quick sketch of the commands involved is below.
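Until then, here is a minimal sketch of the database replication and failover commands, assuming two hypothetical accounts in one organization (myorg.account1 as primary, myorg.account2 as secondary); failover needs Business Critical edition or higher.

-- On the primary account: allow replication and failover of the database
-- (account and database names are hypothetical).
ALTER DATABASE mydb ENABLE REPLICATION TO ACCOUNTS myorg.account2;
ALTER DATABASE mydb ENABLE FAILOVER TO ACCOUNTS myorg.account2;

-- On the secondary account: create a local replica and refresh it.
CREATE DATABASE mydb AS REPLICA OF myorg.account1.mydb;
ALTER DATABASE mydb REFRESH;

-- To fail over: promote the replica so it becomes the writable primary.
ALTER DATABASE mydb PRIMARY;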
Great session, neatly presented. Can you please tell me the maximum data volume or size that can be loaded into a Snowflake table using Snowpipe?
Thanks..
There is no specific limit for Snowpipe, as it runs a COPY command under the hood, but if you are doing incremental loads, keep your data file sizes around 50-100 MB; otherwise it will be slow. A minimal pipe definition is sketched below.
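For reference, a minimal Snowpipe sketch, assuming a hypothetical stage my_stage and target table my_table; the pipe simply wraps the COPY command that runs per file.

-- Hypothetical names; AUTO_INGEST additionally needs cloud event
-- notifications (e.g. SQS on AWS) configured for the external stage.
CREATE OR REPLACE PIPE my_pipe
  AUTO_INGEST = TRUE
AS
COPY INTO my_table
FROM @my_stage
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);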
SF Superman, high-quality content... keep going, bro. Also, can you please make one project? That would help a lot.
Thanks... will make one project... keep watching this space.
Hi Sir
I am keen to know about the scale-up and scale-out features in Snowflake.
Do you need an in-depth understanding of how it works, or only a conceptual understanding? Help me understand what is not clear from this tutorial.
@@DataEngineering I am amazed to see your reply. Thank you so much. I have dived into the details and am clear about the concepts. Can you help me understand the actual working in Snowflake ❄?
@@DataEngineering Let me tell you, these videos are a blessing. I am going through many of them. Highly appreciated; grateful to you.
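For anyone else following this thread, a minimal sketch of the two operations on a hypothetical warehouse my_wh: scaling up changes the size of the cluster, while scaling out changes the number of clusters (multi-cluster requires Enterprise edition or above).

-- Scale UP: a bigger cluster, which helps a single heavy query.
ALTER WAREHOUSE my_wh SET WAREHOUSE_SIZE = 'LARGE';

-- Scale OUT: more clusters of the same size, which helps many
-- concurrent queries rather than one big query.
ALTER WAREHOUSE my_wh SET MIN_CLUSTER_COUNT = 1 MAX_CLUSTER_COUNT = 3;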
We had around 1200 external tables, almost all of them using the same file_format. By mistake, the developer recreated the file_format using a CREATE OR REPLACE script. Is there a way we can recover all the external tables? We are not even able to run things like GET_DDL now. For sure we can't time travel. What shall we do, if you have a way? Thanks in advance.
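For context, a rough illustration of the failure mode described above, with hypothetical names; CREATE OR REPLACE drops the old file format object and creates a new one, so dependent external tables are left pointing at an object that no longer exists.

-- The kind of statement that caused the breakage (hypothetical names).
CREATE OR REPLACE FILE FORMAT my_db.my_schema.my_ff TYPE = 'PARQUET';

-- Afterwards, even metadata calls on dependent external tables fail:
SELECT GET_DDL('TABLE', 'my_db.my_schema.my_ext_table');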
Can I pass the warehouse while I am submitting a job in Snowflake?
There are two ways to do that.
First, the account (service principal) used to submit the job must have a default database, schema, and warehouse set; that way, without passing anything, that particular warehouse will be started.
Second, you can set the context, in which you specify the warehouse.
You can watch my SQL series
ua-cam.com/play/PLba2xJ7yxHB6LbOdzpqRB0WQE7IPWbbSy.html
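A minimal sketch of both options, assuming a hypothetical service user etl_user and warehouse etl_wh:

-- Option 1: a default warehouse on the user, used when the job
-- does not pass one explicitly.
ALTER USER etl_user SET DEFAULT_WAREHOUSE = 'ETL_WH';

-- Option 2: set the context explicitly at the start of the session/job.
USE WAREHOUSE etl_wh;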
Hi, any plans for uploading the remaining 13 chapters?
@Maheedhar K, it is planned for tonight (IST); if you have subscribed to my channel, you will get the notification.
Does Snowflake support triggers?
No, it does not... maybe in the future they will add it, but with the current feature list, it does not support triggers.
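The closest built-in substitute is a stream plus a task; a minimal sketch, assuming demo tables src and tgt and a hypothetical warehouse my_wh:

-- Assumed demo tables.
CREATE OR REPLACE TABLE src (id INT, payload STRING);
CREATE OR REPLACE TABLE tgt (id INT, payload STRING);

-- A stream records row-level changes on the source table.
CREATE OR REPLACE STREAM src_stream ON TABLE src;

-- A scheduled task consumes the stream, approximating an AFTER INSERT trigger.
CREATE OR REPLACE TASK react_to_inserts
  WAREHOUSE = my_wh
  SCHEDULE = '1 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('SRC_STREAM')
AS
  INSERT INTO tgt
  SELECT id, payload FROM src_stream WHERE METADATA$ACTION = 'INSERT';

ALTER TASK react_to_inserts RESUME;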
Suppose the account was created on GCP, and in the coming years there is a need for 5X and 6X size compute, which GCP does not have available. In that case, a new account under AWS or Azure would have to be created to meet the need. Am I right, or is there another feasible solution?
Yes, right... or, as a workaround, you can use a multi-cluster warehouse to get the same aggregate compute within the same account (see the sketch below).
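A rough sketch of that workaround, with a hypothetical warehouse name: two 4X-Large clusters give the same aggregate credits per hour as one 5X-Large (2 x 128 = 256), though each individual query still runs on a single cluster, so this helps concurrency rather than one huge query.

CREATE OR REPLACE WAREHOUSE big_wh
  WAREHOUSE_SIZE    = '4X-LARGE'
  MIN_CLUSTER_COUNT = 2
  MAX_CLUSTER_COUNT = 2;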
How do I create a warehouse for producing analysis reports using Snowflake and Power BI?
What is the pricing impact in Snowflake of the min/max cluster constraint for a particular warehouse size? E.g., is there any impact on an XS warehouse's cost if I set min cluster = 3 versus min cluster = 1?
Not sure if I understood your question correctly or not. The per-credit price has no correlation with min/max cluster; credit consumption is based on your usage.
So if you have Enterprise edition, your per-credit cost will be around $3. If you have a multi-cluster warehouse running at XS size (1 credit per hour per cluster), one hour with a single active cluster costs the same as a normal warehouse, but when it scales out, it may start costing you double or triple, as per the usage.
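To make the effect of the minimum cluster count concrete, a rough worked example, assuming an XS warehouse at 1 credit per hour per cluster (warehouse name hypothetical):

-- MIN_CLUSTER_COUNT = 1, MAX_CLUSTER_COUNT = 3 (auto-scale):
--   quiet hour -> 1 cluster running  -> 1 credit
--   busy hour  -> 3 clusters running -> 3 credits
-- MIN_CLUSTER_COUNT = 3 (maximized): every hour the warehouse is
--   running costs 3 credits, regardless of load.
ALTER WAREHOUSE my_wh SET MIN_CLUSTER_COUNT = 1 MAX_CLUSTER_COUNT = 3;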
How do I handle special characters like alpha and beta in Snowflake? When I load a Parquet file into a table, I see some special characters. I tried casting to UTF-8 but could not get it exactly right.
It is tricky sometimes, but it works.
If you share the text description, I can also try it and see what the behaviour looks like.
Excellent. Can you please give me the code/commands in a separate file?
Thanks for the videos and the great explanation, with examples for all scenarios. Can you please share the scripts used in the video as well?
Yes, sure
How can I contact you, e.g., by mail or phone?
Most folks drop a note via Instagram:
instagram.com/learn_dataengineering/