Data Journey EP-02: Batch Ingestion 📦 - 5 ways to ingest files into Google Cloud


COMMENTS • 24

  • @loufoua7640 · 3 months ago

    Thank you very much, very well explained.

  • @baldevsingh-b5h9j · 1 year ago +1

    Awesome content, thanks for such videos.

  • @jiyasingh664 · 10 months ago

    Amazing explanation for beginners, thanks a lot for this informative video!

  • @vis7681 · 3 years ago +1

    Great explanation, really to the point. Expecting more videos, please.

  • @karan7843 · 1 year ago

    Brilliant content. Would love it if more videos in the same series came out often.

  • @vijaysoni6517 · 7 months ago

    Amazing 👏

  • @yingchen8028 · 3 years ago +1

    Very clearly explained.

  • @dattatreyakomakula1650 · 1 year ago

    Excellent sir.

  • @jitrammaharjan9506 · 2 years ago

    Thank you for this video, very well explained.

  • @kaijunhe3339 · 1 year ago

    Great video, super helpful.

  • @NitinKumar65 · 2 years ago

    Very well explained

  • @tonyforbes3889 · 2 years ago

    Thank you, very useful.

  • @muzahmad2104 · 1 year ago

    Great content, keep it up. But please sort out the echo.

  • @mulshiwaters5312 · 4 months ago

    Is there any mechanism to move files from GCP (Cloud Storage) to another cloud provider like Azure?

  • @anandakumarsanthinathan4740 · 2 years ago

    Very helpful video.
    I just have a question. These techniques look like they are more for data upload than true ingestion. When we talk about "ingestion", we are usually referring to data pipelines: Dataflow, Dataproc, Cloud SQL, BigQuery, etc. And as far as streaming ingestion is concerned, Pub/Sub is probably the first thing that comes to mind.
    I fully agree that there are no hard and fast rules and it's perfectly OK to call copying data to Cloud Storage 'ingestion'.

    • @selvapalani9727 · 2 years ago +1

      He's referring to 1:1 data ingestion. If any transformation is necessary, then we use a data pipeline with Dataflow/Dataproc/BigQuery procedures, or any combination of them.

    • @anandakumarsanthinathan4740 · 2 years ago

      @@selvapalani9727 you are right. Given that S3 buckets can indeed act as data lakes and that Athena, BigQuery, Redshift and other such services can directly query data from such object storage, there is no harm in calling the process of pushing data into them 'ingestion'. They are not 'files' any more; they are 'data' from which insights can be derived.

  • @pandex8997 · 2 years ago

    Thanks!!! From Brazil.

  • @MrJohnloose · 3 years ago +1

    When will the next video on streaming come out? Very useful for me.

    • @elastiq-ai · 3 years ago +1

      I'm sorry this has taken time. Stay tuned... 👍🏼

  • @sukritichettri7576 · 3 years ago

    Video helped 👍

  • @punitmore · 2 years ago

    Sir, please make more videos on Google Cloud.

  • @jayantmathur6810 · 2 years ago

    Hi sir, I want to ingest data into GCP using an API from a third-party vendor. The data comes once a month. Can you suggest a way to ingest the data, which services I should use, and a scheduler or trigger that runs automatically every 30 days?

    • @elastiq-ai · 1 year ago

      Apologies for the late reply. You can check out Cloud Run to run your API calls, and you can schedule them using something like Cloud Scheduler or Cloud Composer on GCP.
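      The Cloud Run + Cloud Scheduler setup suggested above could be sketched roughly as follows. This is a hedged config sketch, not from the video: the service name `vendor-ingest`, the region, the `/pull` path, the service-account email, and the project are all hypothetical placeholders you would replace with your own.

      ```shell
      # Deploy the code that calls the vendor API as a Cloud Run service
      # (hypothetical service name and region).
      gcloud run deploy vendor-ingest \
        --source . \
        --region us-central1 \
        --no-allow-unauthenticated

      # Cloud Scheduler uses unix-cron syntax, so "every 30 days" maps most
      # naturally to "first of each month" ("0 3 1 * *" = 03:00 on day 1).
      # The OIDC service account lets Scheduler invoke the private service.
      gcloud scheduler jobs create http vendor-ingest-monthly \
        --location us-central1 \
        --schedule "0 3 1 * *" \
        --uri "https://vendor-ingest-HASH-uc.a.run.app/pull" \
        --http-method POST \
        --oidc-service-account-email scheduler-invoker@MY_PROJECT.iam.gserviceaccount.com
      ```

      Inside the service, the handler would typically fetch from the vendor API and write the payload to a Cloud Storage bucket, from where BigQuery or a pipeline can pick it up.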