# Snowflake

  • Published 20 Dec 2024

COMMENTS • 44

  • @zohebshumaeel5804
    @zohebshumaeel5804 1 month ago +1

    Hello Anupam, great insight for us aspiring data engineers.
    Just one request: could you please create a video on auto-ingestion from Google Cloud Storage into Snowflake, just as you have done here for AWS?

  • @archanapattanaik8713
    @archanapattanaik8713 9 months ago +1

    Thank you for this video. I appreciate your effort in sharing knowledge. Can you please make a video on Acryl DataHub: ingesting data into DataHub and deriving metadata, glossary, domains, and data lineage from it?

  • @rajwedhikar2955
    @rajwedhikar2955 4 months ago +1

    Thanks a lot, very helpful.

  • @hirenparikh1364
    @hirenparikh1364 1 year ago

    Excellent resource!

  • @TexasCoffeeBeans
    @TexasCoffeeBeans 1 year ago

    Very informative video. Thank you.

  • @hienngo8761
    @hienngo8761 7 months ago

    Thank you for the video. It is short but has enough information. I have two questions: how does Snowflake detect duplicate files, and is there a query to identify the duplicate files processed in Snowflake?

    • @anupam.kushwah
      @anupam.kushwah  5 months ago

      Snowpipe keeps track of file names in internal metadata tables. Check this page if it helps: community.snowflake.com/s/article/COPY-INTO-completes-with-Copy-executed-with-0-files-processed
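As a sketch, that load metadata can be inspected through the COPY_HISTORY table function; the table name and time window below are placeholders:

```sql
-- Files loaded into a table over the last 24 hours. A file name that
-- appears here will be skipped by Snowpipe if staged again unchanged.
SELECT file_name, last_load_time, status, row_count
FROM TABLE(information_schema.copy_history(
    TABLE_NAME => 'MY_TABLE',                              -- placeholder table
    START_TIME => DATEADD(hours, -24, CURRENT_TIMESTAMP())));
```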

  • @pratz3198
    @pratz3198 10 months ago

    Hello Anupam, do we have to create the SQS queue in AWS first, or does creating the Snowpipe create the queue directly?

    • @anupam.kushwah
      @anupam.kushwah  10 months ago

      The SQS notification channel is created automatically when you create the Snowpipe.
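To confirm this, the queue ARN that Snowflake creates can be read from the pipe's notification_channel column (the pipe name below is a placeholder):

```sql
-- The notification_channel column holds the SQS ARN that you configure
-- in the S3 bucket's event notification.
SHOW PIPES;
DESC PIPE my_snowpipe;  -- placeholder pipe name
```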

  • @nvkishorekumar
    @nvkishorekumar 1 year ago

    Thanks for the video. I have a small doubt: let's say we are using an internal stage with a couple of different files, i.e. employee, salary, and dept, all of type CSV. If I have to load them into their respective tables in Snowflake (employee, salary, dept) using a pipe, is that possible? If yes, what will be the notification channel?

    • @anupam.kushwah
      @anupam.kushwah  1 year ago

      It is possible to load data from a stage with different file names using a pattern. Please refer to this document for more info: docs.snowflake.com/en/sql-reference/sql/copy-into-table#syntax
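A sketch of the pattern approach, with one pipe per target table over the same stage (stage, table, and pipe names are illustrative):

```sql
-- One stage, multiple pipes: each pipe filters the files it loads by PATTERN.
CREATE OR REPLACE PIPE employee_pipe AUTO_INGEST = TRUE AS
  COPY INTO employee FROM @my_stage
  PATTERN = '.*employee.*[.]csv'
  FILE_FORMAT = (TYPE = CSV);

CREATE OR REPLACE PIPE salary_pipe AUTO_INGEST = TRUE AS
  COPY INTO salary FROM @my_stage
  PATTERN = '.*salary.*[.]csv'
  FILE_FORMAT = (TYPE = CSV);
```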

  • @vaibhavverma1340
    @vaibhavverma1340 1 year ago

    Thank you for the video. I have a doubt: suppose the source file, i.e. the file in S3, is deleted. Are we still able to get the rows when querying the table with select *?

    • @anupam.kushwah
      @anupam.kushwah  1 year ago +1

      If the data is loaded into the Snowflake table and then the S3 file is deleted, we will still be able to see the data, since it was loaded into the table and can be used for further analysis. This is not like an external table, where you only define the table structure in Snowflake and the actual data resides in S3.

  • @mickyman753
    @mickyman753 7 months ago

    For that SNS, do we need to create a new SNS topic in our AWS account for it to work, or is that SNS already provided by AWS? Also, do we need to change the trust policy of the S3 bucket or add any policy?

  • @ashishchaurasiya106
    @ashishchaurasiya106 5 months ago +1

    Nice.

  • @maryam4071
    @maryam4071 9 months ago

    Hi, where can I find the previous video? You mentioned it in the last session.

  • @princynandan9389
    @princynandan9389 1 year ago

    Thanks for the video, that's very informative :). However, I have a doubt: let's say my CSV file has more columns than the table, and AUTO_INGEST = TRUE. In this case, will it load the data into the table, or will it not process the file? If it doesn't process the file, is there a log that keeps track of it?

    • @anupam.kushwah
      @anupam.kushwah  1 year ago

      Hello Princy,
      Please see the document below for more details:
      docs.snowflake.com/en/sql-reference/functions/pipe_usage_history
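As a sketch of what that lookup can look like, Snowflake exposes both pipe usage and per-file load errors through table functions; the pipe name and time windows below are placeholders:

```sql
-- Files processed by the pipe over the last day.
SELECT *
FROM TABLE(information_schema.pipe_usage_history(
    DATE_RANGE_START => DATEADD(day, -1, CURRENT_TIMESTAMP()),
    PIPE_NAME => 'my_snowpipe'));

-- Errors from recent Snowpipe loads (e.g. column-count mismatches).
SELECT *
FROM TABLE(validate_pipe_load(
    PIPE_NAME => 'my_snowpipe',
    START_TIME => DATEADD(hour, -1, CURRENT_TIMESTAMP())));
```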

  • @muralikrishna-gk4hx
    @muralikrishna-gk4hx 7 months ago

    Thanks, sir.
    Can I create a Snowpipe on an internal named stage?

  • @aniketkardile
    @aniketkardile 4 months ago

    For ingesting multiple different tables to multiple different stages?

  • @rameshram1117
    @rameshram1117 1 year ago

    Is it possible to load multiple files with different metadata using one single pipe and a single stage?

    • @anupam.kushwah
      @anupam.kushwah  1 year ago

      It's better to keep different types of files in different stages and load them with separate Snowpipes.

    • @anupam.kushwah
      @anupam.kushwah  1 year ago

      With one pipe it's not possible.

    • @rameshram1117
      @rameshram1117 1 year ago

      @anupam.kushwah OK sir, thank you.

  • @basavaraju1466
    @basavaraju1466 1 year ago

    Hi bro, I need to configure the CREATE PIPE connector, especially the COPY statement; please help me.

    • @anupam.kushwah
      @anupam.kushwah  1 year ago +1

      Please use this link to create a Snowpipe: anupamkushwah.medium.com/snowflake-snowpipe-automated-data-loading-from-s3-bucket-b395f8d508da
      If you have any further questions, please mention them.

    • @basavaraju1466
      @basavaraju1466 1 year ago

      @anupam.kushwah Thank you for your response. We don't want to use S3 or any other cloud; we just need direct integration between Mule and Snowflake, using CREATE PIPE, due to cost concerns.

    • @anupam.kushwah
      @anupam.kushwah  1 year ago +1

      Snowpipe only supports loading data from public cloud stages. Please refer to this documentation: docs.snowflake.com/en/user-guide/data-load-snowpipe-intro

  • @FurqaanShafi
    @FurqaanShafi 1 year ago

    Sir, how do I load data from internal storage, i.e. a CSV file on my laptop?

    • @anupam.kushwah
      @anupam.kushwah  1 year ago

      You can load CSV files from your laptop using the SnowSQL tool and the PUT command. Follow this link for step-by-step instructions:
      docs.snowflake.com/en/user-guide/data-load-internal-tutorial
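A minimal sketch of such a SnowSQL session, assuming a target table t1 and a local CSV file (the path, table name, and file format options are illustrative):

```sql
-- Run inside snowsql: upload the local file to the table's internal stage...
PUT file:///tmp/employees.csv @%t1;

-- ...then load it from the stage into the table.
COPY INTO t1
  FROM @%t1
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```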

  • @aakashyadav4062
    @aakashyadav4062 1 year ago

    I have done everything, but the data isn't showing in the table.

    • @anupam.kushwah
      @anupam.kushwah  1 year ago

      Could you please elaborate on the problem and share the scripts here?

    • @aakashyadav4062
      @aakashyadav4062 1 year ago

      @anupam.kushwah In the beginning, when I was using the COPY command to load data from the stage into my table, it was showing errors on integer-type columns. When I took another dataset, it worked with the COPY command. Now it is not working with the pipe.

    • @aakashyadav4062
      @aakashyadav4062 1 year ago

      @anupam.kushwah
      CREATE OR REPLACE TABLE HEALTHCARE (
        Patientid VARCHAR(15),
        gender CHAR(8),
        age VARCHAR(5),
        hypertension CHAR(20),
        heart_disease CHAR(20),
        ever_married CHAR(30),
        work_type VARCHAR(60),
        Residence_type CHAR(30),
        avg_glucose_level VARCHAR(20),
        bmi VARCHAR(20),
        smoking_status VARCHAR(20),
        stroke CHAR(20)
      );

      CREATE FILE FORMAT CSV
        TYPE = CSV;

      -- create storage integration
      CREATE OR REPLACE STORAGE INTEGRATION snowpipe_integration
        TYPE = external_stage
        STORAGE_PROVIDER = s3
        STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::922199053730:role/my-snowpipe-role'
        ENABLED = true
        STORAGE_ALLOWED_LOCATIONS = ('*');

      DESC INTEGRATION snowpipe_integration;

      -- create stage
      CREATE OR REPLACE STAGE patient_snow_stage
        url = 's3://my-snowflake-bucket-akas'
        file_format = CSV
        storage_integration = snowpipe_integration;

      LIST @patient_snow_stage;
      SHOW STAGES;

      -- pull data from stage directly
      SELECT $1, $2 FROM @patient_snow_stage/test1.csv;

      CREATE OR REPLACE PIPE patients_snowpipe AUTO_INGEST = TRUE AS
        COPY INTO DEMO_DATABASE.PUBLIC.patients3
        FROM @patient_snow_stage
        ON_ERROR = 'skip_file';

      SHOW PIPES;
      ALTER PIPE patients_snowpipe REFRESH;

      CREATE OR REPLACE TABLE patients3 LIKE HEALTHCARE;
      SELECT * FROM patients3;

    • @anupam.kushwah
      @anupam.kushwah  1 year ago

      @aakashyadav4062 Did you set up the SQS in AWS for the Snowflake pipe?

    • @aakashyadav4062
      @aakashyadav4062 1 year ago

      @anupam.kushwah Yes, I created the event notification and pasted the SQS key of the notification channel into it.