Building an Automated Data Pipeline for Sales Data in Google Cloud | GCP Data Engineering Project

  • Published 10 Sep 2024
    Welcome to our comprehensive tutorial on building an automated data pipeline for sales data using Google Cloud Platform (GCP). In this project, we'll guide you through the entire process of setting up a robust data pipeline that facilitates the seamless upload, storage, processing, and visualization of sales data.
    🔹 Project Overview
    This project demonstrates the integration of several GCP services to create an efficient and automated data pipeline for sales data. We'll show you how to:
    Develop a web portal using Python Flask for uploading sales data files (CSV, Excel).
    Store the uploaded files in a Google Cloud Storage (GCS) bucket.
    Trigger a Google Cloud Function to process and load the data into BigQuery.
    Use Looker Studio to create insightful dashboards and reports for data visualization.
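    The four steps above can be sketched in Python. This is a minimal illustration, not the project's actual source code: the extension check stands in for the Flask portal's upload validation, `load_to_bigquery` stands in for the Cloud Function body, and all bucket/dataset/table names are placeholders.

```python
import os

# File types the upload portal accepts (CSV and Excel, per the project).
ALLOWED_EXTENSIONS = {".csv", ".xlsx", ".xls"}

def is_allowed_upload(filename: str) -> bool:
    """Portal-side check: accept only the sales-data formats the pipeline expects."""
    return os.path.splitext(filename.lower())[1] in ALLOWED_EXTENSIONS

def load_to_bigquery(bucket: str, object_name: str, table_id: str) -> None:
    """Cloud Function core: load a newly finalized GCS object into BigQuery.

    The client library is imported lazily so the helper above stays
    dependency-free; deploying this requires google-cloud-bigquery.
    """
    from google.cloud import bigquery

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # skip the header row
        autodetect=True,       # infer the schema from the file
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    uri = f"gs://{bucket}/{object_name}"
    # Blocks until the load job completes (raises on failure).
    client.load_table_from_uri(uri, table_id, job_config=job_config).result()
```

    Deployed as a Cloud Function with a GCS "object finalized" trigger, the event payload carries the bucket and object name, so the handler reduces to a single `load_to_bigquery(event["bucket"], event["name"], "your_dataset.sales")` call; Looker Studio then reads the BigQuery table directly.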
    Source Code - github.com/vis...
    Looking to get in touch?
    Drop me a line at vishal.bulbule@gmail.com, or schedule a meeting using the provided link topmate.io/vis...
    Playlists
    Associate Cloud Engineer -Complete Free Course
    • Associate Cloud Engine...
    Google Cloud Data Engineer Certification Course
    • Google Cloud Data Engi...
    Google Cloud Platform(GCP) Tutorials
    • Google Cloud Platform(...
    Generative AI
    • Generative AI
    Getting Started with Duet AI
    • Getting started with D...
    Google Cloud Projects
    • Google Cloud Projects
    Python For GCP
    • Python for GCP
    Terraform Tutorials
    • Terraform Associate C...
    Linkedin
    / vishal-bulbule
    Medium Blog
    / vishalbulbule
    Github
    Source Code
    github.com/vis...
    #googlecloud #gcp

COMMENTS • 18

  • @mulshiwaters5312 3 months ago +4

    Excellent video! It gives an end-to-end view of the Google services, with the data flow and a real-time example.

  • @vigneshgiri4255 1 month ago +1

    Your videos on GCP data engineering are outstanding! You explain complex concepts with such clarity and ease, making them accessible for everyone. Your step-by-step approach from the basics to advanced topics is incredibly helpful. Thanks for sharing your knowledge and making learning GCP so enjoyable and effective. Keep up the great work and post more videos! I have already SUBSCRIBED :)

    • @techtrapture 1 month ago +1

      Thanks for the kind words 🎉

  • @bernasiakk 1 month ago +1

    How great is this stuff, thanks for your time!

  • @sarathysrm 1 day ago

    How do you load a parameter from the CSV into BigQuery, like a batch ID column or an insert time?

  • @abdulfasith7905 3 months ago +1

    Nice video!! Please add Dataproc in your next pipeline video.

  • @kuntalbanerjee6903 2 months ago +1

    Please make a video on Dataproc, and explain when to use which tool: Dataproc, Dataflow, Cloud Functions, Cloud Data Fusion, etc. Also, please add a video on how to automate a pipeline for streaming real-time, semi-structured, and unstructured data.

  • @malleshdatta8832 1 month ago

    Couldn't we create the same Cloud Function as a Python file, like the one you created in VS Code, to upload the file from the source to the bucket?

    • @techtrapture 1 month ago

      Yes, we could. But in this video we worked with the requirement that a non-technical front-office person will upload the file from the front office.

  • @rishiraj2548 3 months ago

    🙏👍 Good evening

  • @prashlovessamosa 3 months ago +1

    Please upload more data engineering content.

    • @techtrapture 3 months ago

      Definitely, I will add more videos.

  • @fernandoplak6925 1 month ago

    Thanks!

  • @mbuyimeech 1 month ago +1

    How do you create the 3 separate CSV files?

    • @techtrapture 1 month ago

      I first downloaded the CSV file from Kaggle and then copy-pasted a few desired records into another CSV.

  • @varunmedisetty 1 month ago

    Sir, while executing the code in VS Code it says the quota limit is exceeded.

  • @khananas15 1 month ago

    Is this tutorial also suitable for beginners?