Very helpful video. I just have a question. These techniques look like they are more for data upload, and not really ingestion. When we talk about "ingestion", we are usually referring to data pipelines, Dataflow, Dataproc, Cloud SQL, BigQuery, etc. And as far as "streaming ingestion" is concerned, Pub/Sub is probably the first thing that comes to mind. I fully agree that there are no hard and fast rules and it's perfectly OK to call copying data to Cloud Storage 'ingestion'.
He's referring to 1:1 data ingestion. If any transformation is necessary, then we'd use a data pipeline built with Dataflow/Dataproc/BigQuery procedures, or any combination of them.
@@selvapalani9727 You are right. Given that S3 buckets can indeed act as data lakes, and that Athena, BigQuery, Redshift and other such services can directly query data from such object storage, there is no harm in calling the process of pushing data into them 'ingestion'. They are not 'files' any more; they are 'data' from which insights can be derived.
Hi sir, I want to ingest data into GCP using an API from a third-party vendor. The data comes once a month. Can you suggest a way to ingest it, which services I should use, and a scheduler or trigger that runs automatically every 30 days?
Apologies for the late reply. You can check out Cloud Run to run your API calls, and you can schedule them using something like Cloud Scheduler or Cloud Composer on GCP.
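A minimal sketch of that Cloud Run + Cloud Scheduler setup, assuming you've already containerized the code that calls the vendor API (all names — project, image, service account, region — are placeholders, not from the video):

```shell
# Deploy the container that calls the vendor API and lands the data in
# Cloud Storage / BigQuery. Service name and image are hypothetical.
gcloud run deploy vendor-ingest \
  --image=gcr.io/MY_PROJECT/vendor-ingest \
  --region=us-central1 \
  --no-allow-unauthenticated

# Trigger it on the 1st of every month at midnight via an authenticated
# HTTP call ("0 0 1 * *" is standard cron for monthly).
gcloud scheduler jobs create http vendor-ingest-monthly \
  --location=us-central1 \
  --schedule="0 0 1 * *" \
  --uri="https://vendor-ingest-HASH-uc.a.run.app/" \
  --http-method=POST \
  --oidc-service-account-email=scheduler-sa@MY_PROJECT.iam.gserviceaccount.com
```

Note that cron can't express "every 30 days" exactly, so a once-a-month schedule like the above is the usual compromise.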
Thank you very much, very well explained.
awesome content..thanks for such videos
Amazing explanation for beginners, thanks a lot for this informative video!
Great explanation, really to the point. Looking forward to more videos, please!
Brilliant content. Would love it if more videos in the same series came out often.
Amazing 👏
very clearly explained
Excellent sir.
Thank you for this video.......very well explained
Great Video, super helpful
Very well explained
Thank you, very useful.
Great content, keep it up. But please sort out the echo.
Any mechanism to move files from GCP (Cloud Storage) to another cloud provider like Azure?
Thanks!!! From Brazil.
When is the next video on streaming coming? Very useful for me.
I'm sorry this has taken time. Stay tuned... 👍🏼
Video helped👍
Sir, please make more videos on Google Cloud.