Anjan GCP Data Engineering

Videos

Google Cloud Storage blazing fast uploads and downloads with Python client libraries
291 views · 2 months ago
Demo code is here: github.com/anjangcp/GCP-Data-Engineering-Demo-Codes/blob/1e710665c0d875330b486e5847c9a0d1b190a99e/GoogleCloudStorage/gcs_python_client_demo.py
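
The linked demo is not reproduced here, but a minimal sketch of parallel GCS transfers with the Python client library might look like the following. The bucket name, file names, directories and worker count are placeholders; the demo code in the repository above may take a different approach.

```python
# A minimal sketch (not the linked demo code): parallel GCS uploads and
# downloads via the transfer_manager helpers shipped with recent
# google-cloud-storage releases. All names and paths are placeholders.
from google.cloud import storage
from google.cloud.storage import transfer_manager


def main():
    client = storage.Client()
    bucket = client.bucket("my-demo-bucket")             # placeholder bucket
    filenames = ["file1.csv", "file2.csv", "file3.csv"]  # local files under data/

    # Upload many files concurrently instead of looping over
    # blob.upload_from_filename() one file at a time.
    results = transfer_manager.upload_many_from_filenames(
        bucket, filenames, source_directory="data/", max_workers=8
    )
    for name, result in zip(filenames, results):
        # Each result is None on success, or the exception raised for that file.
        print(name, "uploaded" if result is None else f"failed: {result}")

    # Download the same objects concurrently into a local directory.
    transfer_manager.download_many_to_path(
        bucket, filenames, destination_directory="downloads/", max_workers=8
    )


if __name__ == "__main__":
    # Guard needed because transfer_manager defaults to process-based workers.
    main()
```
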
Data Lineage | Track Big Query table data Lineage
535 views · 2 months ago
Write Big Query SQL queries with Gemini assistance
593 views · 4 months ago
Basics of Service Accounts and IAM Roles
2.4K views · 1 year ago
Big Query BigLake Tables
2.1K views · 1 year ago
Learn code free Data pipelines with Cloud Data Fusion | Packed with detailed Demos and explanations
10K views · 1 year ago
Serverless Change Data Capture (CDC), Replication, Data Synchronisation with Cloud Datastream
2.7K views · 1 year ago
ETL nested AVRO files into Cloud SQL, GCS using Dataflow Pipeline
1.5K views · 1 year ago
Code: github.com/anjangcp/GCP-Data-Engineering-Demo-Codes/blob/9b8b7b5c07129188237965b266f6d73f0350d5a0/Dataflow/batch_etl_avro_data_cloudsql.py. Watch these videos first: ua-cam.com/video/hWpNyGj5xVk/v-deo.htmlsi=is1OHok0P0XMDRsS ua-cam.com/video/dKGB3d2XXiE/v-deo.htmlsi=CvdRfpAeyl8NYaDd
Automate GCP Project’s IAM Roles Snapshot process using Cloud Functions, Bigquery, Cloud Scheduler
583 views · 1 year ago
Ref Code: github.com/anjangcp/GCP-Data-Engineering-Demo-Codes/blob/698552fb8352876303e3d41a2e3ea56ae552d103/Common_Realtime_Usecases/iam_snapshots.py
GCP Professional Data Engineer, re-certified: Exam experience and my suggestions to Aspirants
1.1K views · 1 year ago
Selecting GCP ETL/Data processing services | GCP Professional Data Engineer
1K views · 1 year ago
GCP Professional Data Engineer | Storage and Database selection process, Test your knowledge
617 views · 1 year ago
GCP Professional Data Engineer | Selecting suitable storage and database services based on use case
1.3K views · 1 year ago
Cloud Big Table - Introduction | Console, CBT - Basic Demo with Examples
2.2K views · 1 year ago
Import Bulk data from GCS into Spanner Table using Dataflow Jobs
1.1K views · 1 year ago
Basic Cloud Spanner database operations using Python Client Libraries
684 views · 1 year ago
Cloud Spanner Introduction demo with examples
1.7K views · 1 year ago
Automate Dataproc workloads using Cloud Composer
4.6K views · 1 year ago
Composer - Airflow https operators, Extract data from web API using https operators with examples
4.9K views · 1 year ago
Thanks for your support !!! and More to come
93 views · 1 year ago
Schedule secured data load from Google Sheets to Big Query | Cloud Functions, Cloud Scheduler and more
2.4K views · 1 year ago
Big Query Clustered Tables with Examples
6K views · 1 year ago
Big Query Table Partitions with Examples
8K views · 1 year ago
Google Cloud Storage Object Versioning and Auto Class
1.5K views · 1 year ago
Schedule Dataproc Workflows using Cloud Scheduler
2.2K views · 1 year ago
Cloud Scheduler Introduction with simple examples
6K views · 1 year ago
Export and Analyze GCP Billing data with BigQuery and Looker Studio
3.7K views · 1 year ago
Google Cloud Storage (GCS) Object Lifecycle Management with Examples
5K views · 1 year ago
Dataproc Workflow Templates | Execute Jobs DAG on Managed Dataproc Cluster
3.7K views · 1 year ago

COMMENTS

  • @VinodKumar-gv7cm · 3 days ago

    Really a great job, Anjan. It's very crystal clear and simple, and I learn a lot from your channel. I would like to hear your expertise about Python: reading a CSV with standalone Python seems slightly different from reading it inside GCP. How can I understand the GCP code? Hoping for a response once you are free.

  • @lug__aman · 8 days ago

    Hello Anjan sir, all your videos are really helpful, thanks. I have a problem: I want to capture CDC from PostgreSQL and transfer it into BigQuery, but instead of the final data I want to store what was changed. I tried to create a stream from Postgres to Pub/Sub to store the CDC events, but there is no Pub/Sub destination option available in Datastream. Do you have any idea how I could do PostgreSQL -> GCS -> Pub/Sub -> BigQuery? I already have a stream running from PostgreSQL to BigQuery that writes the Postgres changes to BigQuery.

  • @ShehneelAhmedKhan · 9 days ago

    Great stuff! Just one question: what would be the steps if we need to create a pipeline for multiple tables?

  • @mandeepmails · 18 days ago

    👍

  • @ayshwaryachavan7738 · 21 days ago

    Hi, this was a very good explanation. I am also looking into providing logging access using Terraform.

  • @Mahasti0707 · 21 days ago

    Very great video sir. You are helping us learn and explore. Thank You Sir 😊

  • @saurav0777 · 22 days ago

    How can we capture the log in case of any exception or failure and store it in BigQuery?

  • @srk0001-z3r · 27 days ago

    How is the billing charged? In the sense that once we create the cluster, does it keep running, or does it go to an idle state after the job is completed? Please let me know, sir.

    • @anjangcpdataengineering5209 · 26 days ago

      @srk0001-z3r There is an option to stop the Jupyter notebook instance if you are not using it for some time; you can turn it back on whenever you want.

  • @anirudhalluru8021 · 1 month ago

    👨‍🎓

  • @akilagopinathg · 1 month ago

    Superb explanation, Sir

  • @srikrithibhat1999 · 1 month ago

    Great explanation. Keep rocking Bro. Thank You so much for uploading.

  • @kinjal_suryavanshi · 1 month ago

    The voice in the video seems very low; without a handsfree I can't hear you properly 🥲

  • @sathyar7078 · 1 month ago

    All your videos are useful, but please remove the background music; it is really disgusting and irritating.

  • @sahasra3804 · 1 month ago

    Join button not visible 😊

  • @arunramanathan8214 · 1 month ago

    Can you explain how frequently you will post new members-only videos, sir?

  • @sureshraina321 · 1 month ago

    Sir, if I get a membership, can you tell me what skill set we can expect in the channel, and how many months it will take to upload all the members-only content? Please tell me.

    • @anjangcpdataengineering5209 · 1 month ago

      @sureshraina321 Hi, topics will be covered across the GCP Data Engineering tech stack. Uploading content is not a one-time activity; it will take time, as we will be focusing on real-time scenarios backed up by demos and code. It's a continuous process. Hope this answers your question.

    • @sureshraina321 · 1 month ago

      Yes, I got it. May I know, if I join today, what existing content is available to kickstart with, so that I don't have to wait for your next content to be uploaded?

  • @aravindsagar9624 · 2 months ago

    tq sir

  • @vinamrajain-b7e · 2 months ago

    Kindly Provide Access To The Python Code...

  • @Aravind-pw9in · 2 months ago

    How did you connect the Jupyter notebook to GCS?

  • @sebastianalegria6304 · 2 months ago

    Thanks Anjan, I am studying for an interview related to Apache Beam and Dataflow. I will let you know in this comment if I get the job. Cheers from Chile.

  • @shekharkashyap4752 · 2 months ago

    Excellent learning series.

  • @dcpandey89 · 2 months ago

    Any specific reason you are using a yield statement in the ParDo logic instead of a return statement? I see that return gives me a FlatMap-like result while yield gives me a Map-like result. Why is this happening?
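
For readers puzzled by the yield-versus-return question above: Beam iterates over whatever `DoFn.process()` produces, so a returned list is unpacked into multiple output elements, while yielding the list emits it as a single element. A minimal illustrative sketch (placeholder element values, not the video's pipeline):

```python
# Illustrative sketch of yield vs. return behaviour in a Beam DoFn.
import apache_beam as beam


class YieldWholeList(beam.DoFn):
    def process(self, element):
        # yield emits exactly what you yield as ONE output element,
        # so yielding the list keeps it intact (Map-like result).
        yield element.split(",")


class ReturnWholeList(beam.DoFn):
    def process(self, element):
        # Beam iterates over whatever process() returns, so a returned list
        # is unpacked into one output element per item (FlatMap-like result).
        return element.split(",")


with beam.Pipeline() as p:
    rows = p | beam.Create(["a,b,c"])
    (rows | "KeepList" >> beam.ParDo(YieldWholeList())
          | "PrintList" >> beam.Map(print))   # prints ['a', 'b', 'c'] once
    (rows | "Unpack" >> beam.ParDo(ReturnWholeList())
          | "PrintItems" >> beam.Map(print))  # prints a, b, c as separate elements
```
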

  • @shekharkashyap4752 · 2 months ago

    most underrated video, very good explanation 🙌🏼

  • @subramonyganesh2914 · 2 months ago

    Hi Anjan, can you please let me know where I can access the Python script? I couldn't find it in the demo codes repo. It would be helpful to learn from. Thanks.

  • @Humani11 · 2 months ago

    Why create a Compute Engine instance for that? Just run the publisher script locally and verify by running the subscriber script.
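
To illustrate the local-run suggestion above, a minimal publisher/subscriber sketch with the Pub/Sub Python client could look like the following. The project, topic and subscription IDs are placeholders; they are assumed to already exist and local credentials to be configured.

```python
# Minimal Pub/Sub publish-and-verify sketch runnable from a local machine.
from google.cloud import pubsub_v1

# Placeholder identifiers; the topic and subscription are assumed to exist.
project_id = "my-project"
topic_id = "demo-topic"
subscription_id = "demo-topic-sub"

# Publisher: push a few test messages.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(project_id, topic_id)
for i in range(3):
    future = publisher.publish(topic_path, f"message {i}".encode("utf-8"))
    print("Published message ID:", future.result())

# Subscriber: pull synchronously to verify the messages arrived.
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(project_id, subscription_id)
response = subscriber.pull(
    request={"subscription": subscription_path, "max_messages": 10}
)
for received in response.received_messages:
    print("Received:", received.message.data.decode("utf-8"))
if response.received_messages:
    subscriber.acknowledge(
        request={
            "subscription": subscription_path,
            "ack_ids": [m.ack_id for m in response.received_messages],
        }
    )
```
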

  • @nathaniasantanigels · 2 months ago

    I want to ask: in the process in your video above, which one is it, ETL or ELT?

  • @vignesh004 · 2 months ago

    Hi Anjan, Thanks for this. Kindly do more data engineering videos especially focusing on data mesh architecture. Also tips and tricks would be more beneficial.

  • @sjvr1628 · 2 months ago

    Please make and upload videos on GCP data platform topics from start to end

  • @sjvr1628 · 2 months ago

    Great, Anjan, thanks for upskilling everyone watching on GCP. Please create more videos and keep us learning more. I am joining my new job as a GCP data platform leader.

  • @shreyasshetty8554 · 2 months ago

    I'm getting the error below while querying data from the external connection. Please help. "Invalid table-valued function EXTERNAL_QUERY: Connect to PostgreSQL server failed: server closed the connection unexpectedly. This probably means the server terminated abnormally before or while processing the request."
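
For reference on the error above, a federated query is issued through BigQuery's EXTERNAL_QUERY function against an external connection. A hedged sketch of what a working call looks like (the connection ID and remote SQL are placeholders, not the video's setup):

```python
# Sketch of a BigQuery federated query via EXTERNAL_QUERY.
# 'my-project.us.postgres-conn' and the remote SQL are placeholders; the
# reported error usually means the connection cannot reach the PostgreSQL
# instance (check the connection's region, credentials and network settings).
from google.cloud import bigquery

client = bigquery.Client()
sql = """
SELECT *
FROM EXTERNAL_QUERY(
  'my-project.us.postgres-conn',
  'SELECT id, name FROM public.customers;')
"""
for row in client.query(sql).result():
    print(dict(row))
```
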

  • @lug__aman · 2 months ago

    I already liked this series so good

  • @lug__aman · 2 months ago

    first comment

  • @VenkatesanVenkat-fd4hg · 2 months ago

    Great video; please post periodically...

  • @vinayakapallakki247 · 2 months ago

    Simple and effective

  • @priyanka.mishraxWF · 3 months ago

    very well and in detail explained

  • @dodat12 · 3 months ago

    Can we save the DataFrame to HDFS instead of GCS?

  • @shreyasv5116 · 3 months ago

    What's the pricing of this table?

  • @SuryaChandraDamerla · 3 months ago

    Good Explanation

  • @sjvr1628 · 3 months ago

    Very nice, Anjan. Please keep doing more to help us upskill on the GCP, AWS, and Azure cloud data platforms. I want a full end-to-end data project. Your explanation is awesome, simple and clear to understand, with slides and labs with examples.

  • @yathagiribhaskar6937 · 3 months ago

    Please provide the JSON file to understand.