You cannot imagine how valuable and insightful this video is, Mihir. Very detailed and a great resource for all the SEOs out there looking to improve their Python knowledge using GSC! Thank you!
I don't have words to thank you enough for this tutorial.
I've been trying to access the GSC API for ages, and I finally succeeded. Thank you!
Glad it helped!
Mihir, this was by far one of the best tutorials I have ever gone through. It was exactly what I needed to understand how to successfully call the GSC API, which is not at all easy to understand via the GSC documentation.
@@robertodelgado1234 glad you liked it and I very much appreciate you leaving this comment. 🥳
This is just fantastic!
You asked for feedback, so here it is: just keep going! It's perfect for practical use, something I can implement on the go.
Noted. I will focus more on those going forward.
I come back here to learn more about SEO with Python and to refresh some ideas. It's good to see your content, explained in a simple way. Thanks!
Mihir, thanks a lot!!! It works! It really works! This tutorial is hands down one of the best I've come across. It gave me the clear guidance I needed to successfully work with the GSC API!
Great video, Mihir. Helped a lot, thanks!
Hey @Mihir Naik, this is an excellent video and it was worth the wait. Thanks for making it easy to follow. One suggestion is to improve the audio; sometimes it was clear and sometimes it was low (maybe it's just me). I always struggle to know which pages have issues with featured snippets and what we can address from a technical SEO standpoint. Could you make something about this? It would be helpful. Great video again, and I look forward to the next one. (Subscribed, BTW :) )
Thanks, Sreevathsa! I got this feedback from multiple people. I will work on it in the next video.
@@TheMihirNaik Am sure! thanks!
We need real SEO knowledge and gems like Mihir. Thank you for sharing!
Thanks 😇
In depth! You literally spoon-fed us! It was fantastic!
Your English is very good, brother! And the video is insightful.
Glad to hear that!
Thanks Mihir, this was a really easy-to-follow video. I am always intimidated by the idea of using Python in SEO for data analysis. It would be great if you could also show the basics of this tool for data analysis at the beginning.
Glad it was helpful! Sure, I will try to cover.
Great! Thank you for this tutorial..... very helpful🙂
Hey Mihir, great tutorial. How do I do the auth part without opening the popup in the browser? I want to get the credentials programmatically, as the code will be running on a server where I can't open the Google popup to generate the authorization code.
The Google Search Console API documentation would be helpful. They give an example Flask application where you can see how to do OAuth for this.
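For anyone looking for a concrete starting point, here is a minimal sketch of that server-side flow, assuming google-auth-oauthlib with Flask and a client_secret.json downloaded from Google Cloud; the routes and redirect URI are placeholders, not the video's code.

```python
# Minimal sketch: server-side OAuth for the GSC API with Flask.
# client_secret.json, the routes and the redirect URI are placeholders.
from flask import Flask, redirect, request, session
from google_auth_oauthlib.flow import Flow

app = Flask(__name__)
app.secret_key = "replace-me"  # needed for the session that stores the state

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
CLIENT_SECRETS_FILE = "client_secret.json"  # downloaded from Google Cloud Console
REDIRECT_URI = "https://your-app.example.com/oauth2callback"  # placeholder

@app.route("/authorize")
def authorize():
    flow = Flow.from_client_secrets_file(
        CLIENT_SECRETS_FILE, scopes=SCOPES, redirect_uri=REDIRECT_URI
    )
    # access_type="offline" asks Google for a refresh token as well
    auth_url, state = flow.authorization_url(access_type="offline", prompt="consent")
    session["state"] = state
    return redirect(auth_url)

@app.route("/oauth2callback")
def oauth2callback():
    flow = Flow.from_client_secrets_file(
        CLIENT_SECRETS_FILE, scopes=SCOPES,
        state=session["state"], redirect_uri=REDIRECT_URI,
    )
    # Google redirects back with ?code=...; exchange it for tokens server-side
    flow.fetch_token(authorization_response=request.url)
    credentials = flow.credentials  # holds the access token and refresh token
    return "Authorized"
```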
Hi Mihir, great tutorial and thank you for the effort you are putting in. I got an error while inspecting a URL with the Search Console API; upon digging further, I came to learn that URL inspection needs the Indexing API. So my question is: can we not inspect a URL using the Search Console API? Thank you and keep up the great work!
@@SEOWizard-x9n there is a video for checking indexing status. I have explained it there.
@@SEOWizard-x9n ua-cam.com/video/TxmLOu_-lkQ/v-deo.htmlsi=GO_u9e6o0lGUrBnx
Here it is.
@@TheMihirNaik Thank you so much will check it out!
Hi, very nice video. I'd say it's the best introductory video on the GSC API in Python. I have one question: can we filter using metrics like clicks, impressions, etc. in Search Analytics? I understand that filtering using dimensions like page and query is possible; I wanted to know about metric filtering. Thanks in advance!
Thanks, Sankar. I think this is a great question; it shows you are thinking in a valuable direction. It's not possible to filter by metrics in the API calls, but you can do that using Pandas. Pandas gives us a lot of flexibility that we will explore going forward.
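A minimal sketch of what that Pandas filtering could look like, assuming `response` is the dictionary returned by `searchanalytics().query()` with query and page as the requested dimensions; the thresholds are arbitrary examples.

```python
# Minimal sketch: metric filtering with Pandas after the API call.
# Assumes `response` is the dict from searchanalytics().query() with
# dimensions ["query", "page"]; the thresholds are arbitrary examples.
import pandas as pd

df = pd.DataFrame(response.get("rows", []))
# 'keys' holds the dimension values in the order they were requested
df[["query", "page"]] = pd.DataFrame(df["keys"].tolist(), index=df.index)
df = df.drop(columns=["keys"])

# The filtering the API cannot do happens here
high_traffic = df[(df["clicks"] >= 100) & (df["impressions"] >= 1000)]
print(high_traffic.sort_values("clicks", ascending=False).head())
```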
@@TheMihirNaik Thank you for your clarification. Looking forward to more videos. Great start and keep rocking
thanks for sharing this content!
Excellent tutorial! It's very helpful. I want to learn Python; please suggest a good course, as I don't have any coding skills. What's the best Python course for digital marketers?
Thank you! CS50 is a great course to start with. Then you can go deeper into Python. cs50.harvard.edu/x/2023/
Excellent tutorial!
I want to deploy the code in an AWS Lambda function to fetch incremental data every day. Do I need to generate the auth_code every time I run the code?
In that case, you should refer to the code the GSC API docs provide for server-side web apps. That code works with the OAuth 2.0 authentication architecture and will give you an access token and a refresh token; you can keep using the refresh token to generate a new access token. The code from this video won't be the best fit for that.
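A minimal sketch of that refresh-token step, assuming you have already stored the refresh token plus your client ID and secret from the web-app flow; the values are placeholders you would load from secure storage.

```python
# Minimal sketch: turning a stored refresh token into a fresh access token.
# The token and client values are placeholders.
from google.oauth2.credentials import Credentials
from google.auth.transport.requests import Request

creds = Credentials(
    token=None,                                  # no valid access token yet
    refresh_token="YOUR_SAVED_REFRESH_TOKEN",
    token_uri="https://oauth2.googleapis.com/token",
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
creds.refresh(Request())   # exchanges the refresh token for a new access token
print(creds.token)
```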
Thank you, @TheMihirNaik, for your response. This can be achieved by granting full owner access in Google Search Console to the email address of a service account from Google Cloud Platform (GCP).
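And a minimal sketch of that service-account route, assuming the service account's email has been added as a user (or owner) of the property in Search Console; the key file name is a placeholder.

```python
# Minimal sketch: service-account auth, with no browser popup at all.
# Assumes the service account email was added to the GSC property.
from google.oauth2 import service_account
from googleapiclient.discovery import build

credentials = service_account.Credentials.from_service_account_file(
    "service-account-key.json",  # placeholder key file from GCP
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=credentials)
print(service.sites().list().execute())  # lists the properties the account can see
```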
Great video, sir, keep it up. Kindly make a video on Search Console auto-alerts using Python and Screaming Frog.
Thanks! Can you please explain more clearly what you are suggesting?
@@TheMihirNaik Sorry, I mixed them up; the two tasks are different. First: I receive hundreds of indexing errors daily, and I don't want to click "Start Validation" on them one by one, so I need to automate that with the Search Console API and Python. Second: using the Search Console and GA4 APIs together with Screaming Frog for SEO audits.
Sir, I have been in SEO for 1.7 years. Should I go for Python SEO, or should I learn GSC and GA completely first?
Learning GSC and GA4 first would be helpful.
Superb video, Mihir! Can you also explain bulk indexing requests from Google Colab?
Yes, I will cover that in upcoming videos.
Thanks a lot for this video. Great work!
So nice of you
Hey @mihir naik, is there any API through which the Search Console API can be enabled, without doing it manually?
Sorry Lavanya, I'm not sure I'm getting your question.
Great job, Mihir! Do you know if it's possible to use the GSC API to extract the number of indexed pages per day for a domain?
Yes, I think you could use the Sitemaps API and the URL Inspection API to arrive at that number. The only limit is 2,000 inspections per day per site with the Inspection API. This is just a guess; I will have to confirm.
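For reference, a rough sketch of a single URL Inspection API call, assuming `service` was built against the `searchconsole` v1 API as in the tutorial; the URLs are placeholders.

```python
# Rough sketch: one URL Inspection API call (limited to 2,000 per day per property).
# Assumes `service` was built with build("searchconsole", "v1", ...).
body = {
    "inspectionUrl": "https://www.example.com/some-page/",  # placeholder
    "siteUrl": "https://www.example.com/",                  # placeholder
}
result = service.urlInspection().index().inspect(body=body).execute()
print(result["inspectionResult"]["indexStatusResult"]["coverageState"])
# e.g. "Submitted and indexed"
```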
@@TheMihirNaik Got it. I've tried to use the API, but the only information I got was clicks, CTR, and impressions. If you find a way to do that, I would really appreciate it.
Thanks Man, You are super!
Thank you for sharing this amazing and valuable content. It was very hard for me to understand from the Google API documentation, but you made it possible. May I know how we can export 404 pages via the Search Console API?
Thank you! Unfortunately, the GSC API doesn't allow exporting 404 pages.
Thank You. It's going to be really helpful for many of us.
Glad! Please share your feedback or suggestions so I can improve!
@@TheMihirNaik Yes sure :)
Man, Thank you very much, it helped a lot 💗
Glad it helped!
Amazing thank you!
You're very welcome!
Well done!
@TheMihirNaik Hi, I've done some keyword tracking through the PyTrends library for free so far, but now I want to track more, around 5,000 keywords daily. That might exceed the free limit; do I have to buy the API, or does it still work for free? Please let me know what it costs to buy the API, where to buy it, and how it works.
Hi Salman, when you say keyword tracking, do you want to check where your website ranks for a specific keyword? I mean to ask: are you talking about rank tracking?
@@TheMihirNaik Hello bro, I want to track 5,000 keywords, so it could be as simple as rankings. I need to know their performance. Please help me with how to do this. I looked into the Semrush API, but it is very expensive.
@@TheMihirNaik Hello sir, I really need your help. Please tell me how to contact you in person for a detailed discussion.
When I re-run this script, I am asked to enter the auth code again. I must be a bit confused about how to set this up so that it's a one-time thing.
Google Colab loses its state once the runtime is disconnected. To make it a set-and-forget thing, you will have to turn this into a web application and save your credentials in a database. On every API request, the web app checks whether it has an active credential in the database; if not, it uses the refresh token to create a new access token.
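A minimal sketch of persisting and refreshing credentials outside Colab, assuming you store the authorized-user JSON on disk; `run_one_time_auth_flow()` is a hypothetical stand-in for the manual copy-paste flow shown in the video.

```python
# Minimal sketch: persist credentials to a file and refresh them silently.
# run_one_time_auth_flow() is a hypothetical stand-in for the manual flow.
import os
from google.oauth2.credentials import Credentials
from google.auth.transport.requests import Request

TOKEN_FILE = "token.json"  # could just as well be a database record
SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

if os.path.exists(TOKEN_FILE):
    creds = Credentials.from_authorized_user_file(TOKEN_FILE, SCOPES)
    if creds.expired and creds.refresh_token:
        creds.refresh(Request())          # new access token, no auth code needed
else:
    creds = run_one_time_auth_flow()      # the copy-paste flow, done once

with open(TOKEN_FILE, "w") as f:
    f.write(creds.to_json())              # save the updated tokens for next run
```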
I am making a Postman call, but when I click on the incoming URL I get a "404. That’s an error. The requested URL was not found on this server. That’s all we know." error. I checked everything, especially the redirect_uri, but it's not solved :(
@@g_ddd I am not sure I understand why you are making a Postman call.
@ Oh, I solved my problem after sending this comment. Thank you for helping and answering my question 🙏 (The reason I use Postman is that I'm not using Python right now; I have to code the backend in C#. I wondered if your video would help me, which is why I watched it, and it helped. Thanks again!)
Hello Mihir, I get the following error when going to my authorize_url: "You can't sign in to this app because it doesn't comply with Google's OAuth 2.0 policy for keeping apps secure." What can I do? Thanks
You will have to use your own credentials, and then add your email as a test user; then you should be fine.
Hey Mihir! Can you help me? I've been trying to run the "Generate Authorization URL" code, but I just can't get it to work.
Can you share the error you are getting?
Wow wtf... Lovely tutorial
Thanks!
Thanks Mihir for the great videos. I'm following along with the queries but getting the following error: "SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1002)". Can you help me?
I'm not sure why this issue is coming up. Try putting the code into ChatGPT along with this error; it might be able to solve it.
Hi Mihir, where can I find my redirect URI?
The redirect URI is the same for everyone; it's the one I have given in the code.
What is the default daily quota of the Google Indexing API?
I think 200 a day.
@@TheMihirNaik You're back after a long time 😅
Sir, where did this redirect URL come from???
Hey, I tried to get the data as a Pandas DataFrame, but I couldn't.
Were you able to connect to the GSC API? Where are you stuck? What errors are you seeing?
Hi everyone, I have one question; can anyone please explain why the data in Google Search Console doesn't match the data coming from the Google Search Console API? Why does the result coming from the API have higher numbers than the web UI?
The GSC web UI only shows 1,000 rows of data, while the API allows you to extract everything they have except anonymized data.
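A minimal sketch of paging through the full dataset with `startRow`, assuming `service` and `site_url` are set up as in the video; dimensions and dates are examples.

```python
# Minimal sketch: paging through everything the API will return using startRow.
# Assumes `service` and `site_url` are set up as in the video.
def fetch_all_rows(service, site_url, start_date, end_date):
    all_rows, start_row = [], 0
    while True:
        body = {
            "startDate": start_date,
            "endDate": end_date,
            "dimensions": ["query", "page"],
            "rowLimit": 25000,       # the API maximum per request
            "startRow": start_row,   # offset for the next page
        }
        response = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
        rows = response.get("rows", [])
        if not rows:
            break
        all_rows.extend(rows)
        start_row += len(rows)
    return all_rows
```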
@@TheMihirNaik Hi, thanks for the video, really helpful. In my case, the data from the API is less than the data shown directly in GSC. Do you have any clue why this can happen?
@@lautarogomezdunaevsky5552 You might not be selecting the correct dimensions.
I have an issue at the Generate Authorization URL step. I see this error: "You can’t sign in because GSC API sent an invalid request. You can try again later, or contact the developer about this issue. Learn more about this error. If you are a developer of GSC API, see error details. Error 400: invalid_request". How can I solve this?
You are not putting in your own credentials; maybe you are using the ones I have given?
@@TheMihirNaik I used my own client ID and client secret, but I copied your redirect URI (I don't know where it's from).
OK, it works. Do you plan to continue this series? It's very interesting.
How do I get data from many GSC properties in one script?
It's not possible in a single request because of how the GSC API is structured. You could fetch the data from each property separately and then join it in one DataFrame.
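A minimal sketch of that loop-and-concatenate approach, assuming the authenticated account has access to each property; the site URLs and dates are placeholders.

```python
# Minimal sketch: query each property separately and concatenate the results.
# Site URLs and dates are placeholders; `service` is the authenticated client.
import pandas as pd

site_urls = ["https://www.example.com/", "sc-domain:another-example.com"]
frames = []
for site_url in site_urls:
    body = {
        "startDate": "2024-01-01",
        "endDate": "2024-01-31",
        "dimensions": ["query"],
        "rowLimit": 25000,
    }
    response = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
    df = pd.DataFrame(response.get("rows", []))
    df["property"] = site_url   # remember which property each row came from
    frames.append(df)

combined = pd.concat(frames, ignore_index=True)
```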
Thank you, brother!
Thanks a lot!
Mihir sir, please, I need your mentorship. This is really important for me. Kindly reply. Your guidance is important for my career.
Hi there, how can I help?
@@TheMihirNaik Sir, I need to learn about SEO. What should I learn at this point to get a job? I need your mentorship.
@@Canadaswing I don’t do 1:1 mentoring. I think learningseo.io is a great way to learn SEO.
Very interesting, but I can barely hear you.
Sorry about that. I tried to do better with other videos.
@@TheMihirNaik I got myself some good-quality headphones and I don't have a problem anymore. Great content; I will find it very useful in my work. It's good that you are here. Sending thanks from Poland.
thx :)
Thanks
Welcome
Voice is too low.
Yes, that's right. I have corrected it in the second video.
Bro, your voice is too low.
Yes, I have corrected that in the following videos.
Please respond quickly, sir??
Dear Mihir, thank you very much for this video! This data helps me a lot in understanding customer behaviour. Question: is it possible to automate these steps (i.e., run this script every week automatically)? Right now, a manual step is still necessary: copying and pasting the auth code.
I think you could automate it for sure; it just needs a different way of authenticating. You could check the documentation here: developers.google.com/webmaster-tools/v1/how-tos/authorizing
I don't mean to sell myself unnecessarily here, but I could help you with a custom script: www.mihirnaik.com/google-search-console-api-consultant/
The custom script would probably be a Google Cloud Function. It would auto-run, or run on a ping, pull the data, transform it to a CSV, and email it to you. You can create as many Cloud Functions as you would like.
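As a rough illustration only (not the consulting script), here is what such a scheduled export might look like, assuming credentials are already handled and leaving out the email step; the dates, dimensions, and file names are examples.

```python
# Rough illustration: a weekly pull written to CSV, the kind of job a Cloud
# Function or cron could run. Credential handling and emailing are left out.
import datetime
import pandas as pd

def weekly_export(service, site_url):
    end = datetime.date.today() - datetime.timedelta(days=3)    # allow for GSC data lag
    start = end - datetime.timedelta(days=7)
    body = {
        "startDate": start.isoformat(),
        "endDate": end.isoformat(),
        "dimensions": ["date", "query", "page"],
        "rowLimit": 25000,
    }
    response = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
    df = pd.DataFrame(response.get("rows", []))
    df.to_csv(f"gsc_{start}_{end}.csv", index=False)
    return df
```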