How to export all Google (Universal) Analytics data | Airbyte, BigQuery, Google Sheets, Looker Studio
- Published 4 Jul 2024
- UA - GA4 migration - Part I
★ TABLE OF CONTENTS ★
0:00 Intro
1:15 GA Sheets extension: Installation
2:22 GA Sheets extension: Create a new report
5:55 GA Sheets extension: Run/schedule a report
10:24 Airbyte
11:11 Airbyte: Getting started
14:25 Airbyte: Adding a source (Google Analytics)
20:55 Airbyte: Adding a destination (BigQuery)
26:40 Airbyte: Creating a connection
29:45 Airbyte: Adding a custom Google Analytics report
34:15 Looker Studio: Create a report
35:45 Looker Studio: Adding data from Google Sheets
37:48 Looker Studio: Adding data from BigQuery
40:04 Airbyte: Stopping a local instance
Great video!
Thank you very much!
I like how you intentionally ran into the PERMISSION-DENIED error, and showed us how to solve it.
It's probably one of the most common places people get stuck.
Glad you found it useful
Thank you! Very useful video; everything is described in detail.
Excellent video, thank you! I'm waiting for the dbt video 🙏🤙🤙🤙
Thanks a lot for the informative video 🤓🤓
Great video! I wonder if you have run into the issue of the Google Analytics API limiting the data to a 14-month window? When I run a Google Analytics Universal export with a start date of e.g. 2020-01-01, the data always begins exactly 14 months prior to the current date. This makes it impossible to export historic data older than the rolling 14-month cutoff. Is there a workaround for this?
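One workaround people sometimes try (no guarantee it gets around a hard API-side retention limit, but it does help with row limits and sampling) is to split the export into small date windows and stitch the results together. A minimal sketch of the chunking logic, where `fetch_report` is a hypothetical stand-in for whatever client call you actually use:

```python
from datetime import date, timedelta

def date_chunks(start: date, end: date, days: int = 30):
    """Yield (chunk_start, chunk_end) pairs covering [start, end] in ~monthly windows."""
    cur = start
    while cur <= end:
        chunk_end = min(cur + timedelta(days=days - 1), end)
        yield cur, chunk_end
        cur = chunk_end + timedelta(days=1)

def export_range(start: date, end: date, fetch_report):
    """fetch_report(chunk_start, chunk_end) is a hypothetical callable returning rows."""
    rows = []
    for s, e in date_chunks(start, end):
        rows.extend(fetch_report(s, e))
    return rows
```

Smaller windows also reduce the chance of each individual request hitting GA's sampling thresholds.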
Are there issues/challenges with sampled data? The Airbyte website mentions that could be a possibility, but that was written prior to your video, so I wonder if it's still an issue?
What about adding custom dimensions/metrics in the Google Sheet using this add-on?
Hi, I'm looking to export GA4 custom dimensions to BigQuery. Do you have a solution for that? Thanks
When I try to configure the destination as per the video, I get the error message below when I test the connection. Can anyone help with this error? When I choose the option to use INSERT statements, the connection establishes successfully; however, I then get a "don't have permission to create table" error, so I'm not sure what to do. Any help would be much appreciated.
Configuration check failed
State code: NoSuchKey; Message: The specified key does not exist. (Service: Amazon S3; Status Code: 404; Error Code: NoSuchKey; Request ID: null; S3 Extended Request ID: null; Proxy: null)
Great video! How can I estimate the BigQuery pricing? The Google Cloud calculator is not obvious.
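A rough back-of-the-envelope approach, assuming BigQuery's on-demand model where queries are billed per bytes scanned and storage per GiB-month. The rates below are illustrative placeholders only; check the current Google Cloud pricing page for real numbers:

```python
# Illustrative placeholder rates in USD -- verify against the GCP pricing page.
PRICE_PER_TIB_SCANNED = 6.25
PRICE_PER_GIB_STORED = 0.02

def estimate_monthly_cost(gib_stored: float, tib_scanned_per_month: float) -> float:
    """On-demand BigQuery estimate: storage cost + query scan cost, in USD."""
    return gib_stored * PRICE_PER_GIB_STORED + tib_scanned_per_month * PRICE_PER_TIB_SCANNED

# e.g. 50 GiB of UA exports, scanning ~0.1 TiB of it per month in dashboards
print(round(estimate_monthly_cost(50, 0.1), 2))
```

For a small UA backup like the one in the video, storage is usually the dominant (and still tiny) cost; note also that BigQuery typically has a free usage tier that may cover this entirely.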
Hi Stella, at 28:10 there is a setting "Full refresh | Overwrite". Is it better to select "Incremental | Append", so we don't reload all the data every day and just sync changes?
I see, you've already answered, thanks.
Great video, thanks Stella! Quick question…I’m pulling pretty large data sets (5+ years by day) so we can preserve our historical data and I’m constantly running into sampling issues. Are there any tools that can pull the Google Analytics data without using sampled data?
@aklein17, were you able to pull the 5 years of data? I have a similar requirement and am trying to see whether you accomplished it.
Great explanation! So how can we back up old data from 2019? Is it possible? Thanks in advance
I followed your video and it's working fine. Can you guide me on how to back up UA ecommerce data? Thanks in advance
Any reason why you don't just use the native UA connector in Data/Looker Studio, build up what you need there, and export from there? Or let me ask in a different way: why would you need to build up the data in Google Sheets first instead of just using the connector between Looker Studio and UA?
From July 1st, the data will no longer be available within UA.
Great explanation, thank you very much! Question: you are using "Full refresh - Overwrite". That would sync all data from scratch every time the sync runs, right? What would be the ideal sync mode for this case? Do we need deduped history for daily data?
Yes, "Full refresh - Overwrite" will delete and reload all data. You can look into "Incremental Sync - Deduped History" or other variants for your use case. Here are the docs: docs.airbyte.com/understanding-airbyte/connections/. Hope that helps!
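To illustrate what an incremental, deduped sync does conceptually (this is a toy model, not Airbyte's actual implementation): each sync only merges in rows whose cursor field advanced, and the destination keeps one current row per primary key:

```python
def incremental_dedup(existing: dict, new_rows: list, pk: str, cursor: str) -> dict:
    """Toy model: keep the latest row per primary key, judged by the cursor field."""
    for row in new_rows:
        key = row[pk]
        if key not in existing or row[cursor] > existing[key][cursor]:
            existing[key] = row
    return existing

state = {}
incremental_dedup(state, [{"id": 1, "date": "2024-01-01", "sessions": 10}], "id", "date")
incremental_dedup(state, [{"id": 1, "date": "2024-01-02", "sessions": 12}], "id", "date")
# state now holds one row for id=1: the 2024-01-02 version
```

The practical upshot is the same as the docs describe: incremental modes avoid re-reading the whole source on every run, at the cost of needing a reliable cursor field (e.g. a date column) in the source data.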