Seamless Data Integration: ETL from Google Cloud Storage Bucket to BigQuery with Cloud Functions

COMMENTS • 26

  • @zzzmd11
    @zzzmd11 8 months ago +1

    Hi, thanks for the great informative video. Can you explain the flow if the data source is a REST API? Can we configure Dataflow to extract from a REST API into BigQuery without involving Cloud Functions or Apache Beam scripts? Thanks a lot in advance.

    • @cloudquicklabs
      @cloudquicklabs  8 months ago

      Thank you for watching my videos.
      Your requirement is custom: the data source is an API, and you have to query the API to get the data. I believe a Cloud Function is best suited in this case, since the API needs to be invoked.
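      For what it's worth, a minimal sketch of that Cloud Function approach. The endpoint URL, the flat {"items": [...]} payload shape, and the table ID below are all hypothetical placeholders, not from the video:

```python
import json
import urllib.request

API_URL = "https://example.com/api/records"   # hypothetical endpoint
TABLE_ID = "my-project.my_dataset.my_table"   # hypothetical table

def to_rows(payload):
    """Flatten the hypothetical {"items": [...]} payload into BigQuery-ready dicts."""
    return [{"id": item["id"], "value": item["value"]}
            for item in payload.get("items", [])]

def ingest_api(request=None):
    """HTTP-triggered Cloud Function body: query the API, stream rows into BigQuery."""
    from google.cloud import bigquery  # deferred so to_rows stays testable locally
    with urllib.request.urlopen(API_URL, timeout=30) as resp:
        payload = json.load(resp)
    rows = to_rows(payload)
    client = bigquery.Client()
    errors = client.insert_rows_json(TABLE_ID, rows)  # streaming insert
    if errors:
        raise RuntimeError(f"BigQuery insert errors: {errors}")
    return f"Inserted {len(rows)} rows"
```

      The function could be deployed with an HTTP trigger and invoked on a Cloud Scheduler cron, which keeps the pipeline serverless without Dataflow or Apache Beam.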

  • @ananbanerjee3111
    @ananbanerjee3111 15 days ago +1

    great video, thanks

    • @cloudquicklabs
      @cloudquicklabs  15 days ago

      Thank you for watching my videos.
      Glad that it helped you.

  • @nathaniasantanigels
    @nathaniasantanigels 1 month ago +2

    Is it possible for me to create a pipeline that scrapes data from Google Sheets, and whenever there's an update in the Google Sheet, the bot updates the data without overwriting the existing data? How can I ensure that when there's an update, it won't overwrite the previous data?

    • @cloudquicklabs
      @cloudquicklabs  1 month ago

      Thank you for watching my videos.
      It should be possible, considering the points below.
      1. The Google Sheets update event has to be captured in GCP (need to figure out whether this is possible).
      2. Overwriting can be prevented in the pipeline setup.
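      A sketch of point 2, appending instead of overwriting, using the BigQuery client. The table ID and the `id` de-duplication key are assumptions for illustration:

```python
def new_rows_only(existing_keys, rows, key="id"):
    """Drop rows whose key already exists, so previous data is never replaced."""
    seen = set(existing_keys)
    return [r for r in rows if r.get(key) not in seen]

def append_sheet_rows(rows, table_id="my-project.my_dataset.sheet_data"):
    """Load new rows with WRITE_APPEND so existing rows stay untouched."""
    from google.cloud import bigquery  # deferred import; only needed in the pipeline
    client = bigquery.Client()
    existing = [row["id"] for row in
                client.query(f"SELECT id FROM `{table_id}`").result()]
    job_config = bigquery.LoadJobConfig(
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,  # append, never truncate
    )
    client.load_table_from_json(
        new_rows_only(existing, rows), table_id, job_config=job_config
    ).result()
```

      The key choice is the WRITE_APPEND write disposition: unlike WRITE_TRUNCATE, it adds rows without clearing the table first.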

  • @andrewbateman2282
    @andrewbateman2282 11 months ago +1

    Useful informative video. Thanks.

    • @cloudquicklabs
      @cloudquicklabs  11 months ago

      Thank you for watching my videos.
      Glad that it helped you.

    • @varra19
      @varra19 10 months ago

      @@cloudquicklabs Informative video...
      The previous video (where the code is explained) has no audio from the 14-minute mark:
      ua-cam.com/video/bHD8aRaWZOY/v-deo.html

  • @sprinter5901
    @sprinter5901 11 months ago +1

    8:27 I don't have the entry point function like you have. It's just an empty function with some comments inside.

    • @sprinter5901
      @sprinter5901 11 months ago +1

      for those who want the code -

      import functions_framework

      # Triggered by a change in a storage bucket
      @functions_framework.cloud_event
      def hello_gcs(cloud_event):
          data = cloud_event.data
          event_id = cloud_event["id"]
          event_type = cloud_event["type"]
          bucket = data["bucket"]
          name = data["name"]
          metageneration = data["metageneration"]
          timeCreated = data["timeCreated"]
          updated = data["updated"]
          print(f"Event ID: {event_id}")
          print(f"Event type: {event_type}")
          print(f"Bucket: {bucket}")
          print(f"File: {name}")
          print(f"Metageneration: {metageneration}")
          print(f"Created: {timeCreated}")
          print(f"Updated: {updated}")

    • @cloudquicklabs
      @cloudquicklabs  11 months ago

      Thank you for watching my videos.
      The cloud is always evolving, so it might have changed. Please find the necessary files via the GitHub link shared in the video description.
      Note that GCP auto-populates the code skeleton as soon as you choose the runtime in the code editor while creating the Cloud Function.

  • @tejaspise4638
    @tejaspise4638 7 months ago +1

    Great video. Where can I learn to write a script like the one used in the video? (I want to learn how to use the Google Cloud library.)

    • @cloudquicklabs
      @cloudquicklabs  7 months ago

      Thank you for watching my videos.
      Glad that it helped you.
      To get started, you can check the GCP developer documentation for the client APIs. I shall create new videos in the future as well.
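      As a starting point, the heart of such a script is a single load job from GCS into BigQuery. A minimal sketch with the google-cloud-bigquery client; the bucket, file, and table names are placeholders:

```python
def gcs_uri(bucket, name):
    """Build the gs:// URI that BigQuery load jobs expect."""
    return f"gs://{bucket}/{name}"

def load_csv_to_bigquery(bucket, name, table_id="my-project.my_dataset.my_table"):
    """Run a BigQuery load job for a CSV file that landed in a GCS bucket."""
    from google.cloud import bigquery  # deferred import; not needed to build URIs
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # skip the header row
        autodetect=True,       # infer the schema from the file
    )
    job = client.load_table_from_uri(gcs_uri(bucket, name), table_id,
                                     job_config=job_config)
    job.result()  # wait for the load to finish
```

      Wired into the event-triggered function from the video, `bucket` and `name` would come from `cloud_event.data["bucket"]` and `cloud_event.data["name"]`.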

  • @prashantshankavaram
    @prashantshankavaram 11 months ago +1

    Hi Anjan, thank you for the nice video. But the code given does not work. Is it because the entry point has to be modified?

    • @cloudquicklabs
      @cloudquicklabs  11 months ago

      Thank you for watching my videos.
      And thank you for sharing your inputs here.

  • @theamithsingh
    @theamithsingh 9 months ago +1

    Do an entire series that shows how to engineer data on GCP :)

    • @cloudquicklabs
      @cloudquicklabs  9 months ago

      Thank you for watching my videos.
      Appreciate your valuable inputs here. I shall add this to my plan.

  • @CarlosMarin-lp9xe
    @CarlosMarin-lp9xe 1 year ago +1

    Hi!
    I got this error "NameError: name 'data' is not defined". Does anyone know how to fix it? Thanks in advance.

    • @cloudquicklabs
      @cloudquicklabs  1 year ago

      Thank you for watching my videos.
      It looks like a code syntax issue. Please check your code again; maybe you can re-use the file I shared in the description.

    • @hilo-coding-tutorials
      @hilo-coding-tutorials 1 year ago

      @@cloudquicklabs I had the exact same issue and copy/pasted your code directly into the Cloud Function. On what line in your code do you define this variable?

  • @iFunktion
    @iFunktion 10 months ago +1

    Not sure how you managed this at all; I just get an error saying "Container Failed to Start". Any tips on what might have failed? Google Cloud does not appear to give any help.

    • @cloudquicklabs
      @cloudquicklabs  10 months ago

      Thank you for watching my videos.
      While I understand the difficulty of using GCP services (the community is very small), it looks to me like an issue with the Cloud Function setup. Maybe try creating a new Cloud Function once again, following the video carefully. All the best.

  • @varra19
    @varra19 10 months ago +1

    The previous video (where the code is explained) has no audio.
    ua-cam.com/video/bHD8aRaWZOY/v-deo.html

    • @cloudquicklabs
      @cloudquicklabs  10 months ago

      Thank you for watching my videos.
      Yes, there was a miss in the recording, apologies.
      But the code the video relies on can be found in the video description, which covers the missing piece.