How To Export More Than 16000 Records To A CSV File Using Google BigQuery SQL

  • Published 10 Jul 2022
  • When you try to export a query result in Google BigQuery, the platform only lets you export the first 16,000 records. If you want to export the entire result set, there are a couple of options: the Google BigQuery API, or the EXPORT DATA statement (with its OPTIONS clause) in BigQuery SQL.
    In this tutorial, I will walk you through how to use the EXPORT DATA OPTIONS statement in BigQuery SQL, together with Google Cloud Storage, to export the entire result set to a CSV file (a minimal SQL sketch follows this description).
    PS: EXPORT DATA can only write up to 1GB of data per file. If your result set exceeds 1GB, EXPORT DATA will split the output into multiple files.
    📑 More information on BigQuery: cloud.google.com/bigquery/
    ► Buy Me a Coffee? Your support is much appreciated!
    -------------------------------------------------------------------------------------------
    ☕ Paypal: www.paypal.me/jiejenn/5
    ☕ Venmo: @Jie-Jenn
    💸 Join Robinhood with my link and we'll both get a free stock: bit.ly/3iWr7LC
    ► Support my channel so I can continue making free content
    ---------------------------------------------------------------------------------------------------------------
    🌳 Becoming a Patreon supporter: / jiejenn
    🛒 By shopping on Amazon → amzn.to/2JkGeMD
    📘 Facebook Page → / madeinpython
    📘 More tutorial videos on my website → LearnDataAnalysis.org
    👩‍💻 Follow me on LinkedIn: / jiejenn
    ✉️ Business inquiries: UA-cam@LearnDataAnalysis.org
    #BigQuery #GoogleBigQuery #GCP #GoogleStorage #GoogleCloud
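
    A minimal sketch of the EXPORT DATA statement described above. The project, dataset, table, and bucket names (my_project, my_dataset, my_table, my_bucket) are placeholders, not from the video; swap in your own.

      -- Hedged sketch: export a full query result to CSV files in Google Cloud Storage.
      -- All names below (my_project, my_dataset, my_table, my_bucket) are placeholders.
      EXPORT DATA OPTIONS(
        uri = 'gs://my_bucket/exports/result-*.csv',  -- the * wildcard lets BigQuery split output over 1GB into multiple files
        format = 'CSV',
        overwrite = true,
        header = true,
        field_delimiter = ','
      ) AS
      SELECT *
      FROM `my_project.my_dataset.my_table`;

    Run it from the BigQuery console or the bq CLI; the CSV files land in the bucket under the uri prefix.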

COMMENTS • 12

  • @deanovitamachely • A year ago

    Thanks, very helpful!

  • @percychoquevelarde4478 • A year ago

    Thank you, it helped me a lot

  • @vitto_af • A year ago

    Great!

  • @elhipster1 • A year ago

    Thank you for the video. I've copied your code and created a bucket. For some reason it keeps splitting up my query into separate tiny CSVs. It currently says, "Successfully exported 4071 rows into 7 files." These files only have 20 columns each. How do I change it so it exports as one file?
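
    A hedged note on the question above (not from the video): EXPORT DATA writes one file per worker, so even small results can come out as several files. One workaround that is often suggested is to force the final query stage onto a single worker, e.g. with a large LIMIT, which usually yields a single file as long as the result stays under the 1GB-per-file cap. All names below are placeholders.

      -- Hedged sketch: coax EXPORT DATA into producing a single CSV by adding a
      -- large LIMIT, which forces the final stage through one worker.
      -- Placeholder names: my_project, my_dataset, my_table, my_bucket.
      EXPORT DATA OPTIONS(
        uri = 'gs://my_bucket/exports/single-*.csv',
        format = 'CSV',
        overwrite = true,
        header = true
      ) AS
      SELECT *
      FROM `my_project.my_dataset.my_table`
      LIMIT 9223372036854775807;  -- effectively unlimited, but serializes output through one worker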

  • @digavalli123 • A year ago

    Thanks for the nice video. Have a question: is it possible to delete table data based on the ROW_NUMBER function to eliminate duplicate records?

    • @jiejenn • A year ago

      Will have to look into it.
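
    A hedged sketch for the deduplication question above (not covered in the video): a common BigQuery pattern is to rewrite the table keeping only the first row per key as ranked by ROW_NUMBER. The key and ordering columns (id, updated_at) and the table name are placeholders.

      -- Hedged sketch: remove duplicate rows by keeping only ROW_NUMBER() = 1 per key.
      -- Placeholder names: my_project, my_dataset, my_table, id, updated_at.
      CREATE OR REPLACE TABLE `my_project.my_dataset.my_table` AS
      SELECT * EXCEPT(rn)
      FROM (
        SELECT
          *,
          ROW_NUMBER() OVER (PARTITION BY id ORDER BY updated_at DESC) AS rn
        FROM `my_project.my_dataset.my_table`
      )
      WHERE rn = 1;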

  • @Shreya5646 • A year ago

    When I try this, it gives me an error saying export data cannot reference meta tables

  • @ReshmaDSouza-jj7mb • A year ago

    I would like to know if we can export the data when the variable type is an array?

    • @jiejenn • A year ago

      Can you be more specific?

    • @ReshmaDSouza-jj7mb • A year ago

      @jiejenn The table has an array of JSON which I couldn't export.
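
    A hedged sketch for the array/JSON question above (not from the video): CSV cannot represent ARRAY or STRUCT columns directly, so one option is to serialize them to a JSON string before exporting. The column and table names (id, json_array_col, my_table) are placeholders.

      -- Hedged sketch: serialize an ARRAY/JSON column to a string so it fits in a CSV column.
      -- Placeholder names: my_project, my_dataset, my_table, id, json_array_col, my_bucket.
      EXPORT DATA OPTIONS(
        uri = 'gs://my_bucket/exports/flattened-*.csv',
        format = 'CSV',
        overwrite = true,
        header = true
      ) AS
      SELECT
        id,
        TO_JSON_STRING(json_array_col) AS json_array_col  -- the array becomes a JSON-encoded string
      FROM `my_project.my_dataset.my_table`;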

  • @didyouknowamazingfacts2790 • 5 months ago

    In BigQuery, would this also work with a million records, and could I use an AWS bucket instead of Google?

    • @jiejenn • 5 months ago

      There is no limit if you export directly to GCS. Not sure about AWS, but I do think there may be a cap.