How to Identify Rankable Content Topics in Minutes? SEO Automation
- Published Nov 4, 2023
- I use the Google Autocomplete API to gather all the search queries for a keyword, then the ValueSERP API to scrape search results, and my custom SERP-based Clustering API to receive the clustered keywords, which I then visualize with Plotly Express.
Finally, I use the OpenAI API with GPT-4 to generate a Content Brief that relies on the clusters you just created. - Science & Technology
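The SERP-based clustering step groups keywords whose top search results overlap. The Clustering API in the video is custom and not shown, but the underlying heuristic can be sketched in plain Python (the function name, overlap threshold, and data shapes here are illustrative assumptions, not the author's implementation):

```python
def cluster_keywords(serps, min_overlap=3):
    """Group keywords whose top result URLs overlap.

    serps: dict mapping keyword -> list of ranking URLs (e.g. the top 10
    from a SERP API). A keyword joins a cluster when it shares at least
    `min_overlap` URLs with that cluster's seed keyword.
    """
    clusters = []
    for keyword, urls in serps.items():
        for cluster in clusters:
            seed_urls = serps[cluster[0]]  # compare against the cluster's seed
            if len(set(urls) & set(seed_urls)) >= min_overlap:
                cluster.append(keyword)
                break
        else:
            clusters.append([keyword])  # no match: start a new cluster
    return clusters
```

Keywords that land in the same cluster can usually be targeted by a single page, and each cluster is what would feed the GPT-4 content-brief prompt.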
@mihir This is pure gold. Thanks a lot for publishing this. It is super useful. Keep doing such brilliant work. Sending lots of positive vibes from Mumbai, India.
Thanks Sameer. Appreciate the comment.
Also, please make a video on click losses vs. click gains, because SEO audits consume a lot of time and this kind of automation helps a lot of technical SEOs.
Hey, great video! Is it possible to access this colab? I couldn't find it on your site. Thanks
Thanks Mihir,
Really want more content like this. Please stay motivated and never stop, man! :)
Thanks for sharing Python SEO related tutorials 🚀
Thanks Kamal. I will try my best to put out more content. What do you think would be most helpful? Any clues/ideas on what you'd like to see more of?
@@TheMihirNaik Yesterday I tried your GSC +25k rows export code.
Found it really helpful. I recommend adding a progress bar while the code is exporting (for me the runtime expired and I waited a long time).
In other scripts I use "from tqdm import tqdm" and wrap the loop with it (for example, for url in tqdm(site_urls...)). It shows a progress percentage and a bar to track.
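The tqdm suggestion above can be sketched as follows; the `site_urls` list and the loop body are placeholders standing in for the real export script:

```python
from tqdm import tqdm

# Placeholder URL list; the real script would pull these from GSC
site_urls = [f"https://example.com/page-{i}" for i in range(50)]

exported = []
for url in tqdm(site_urls, desc="Exporting GSC rows"):
    # The real loop body would call the GSC API for this URL;
    # tqdm prints a live progress bar and percentage around it.
    exported.append(url)
```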
Additionally, we should find a parallel-requesting method to make it faster. Maybe Colab is not suited for it, but perhaps there is a solution to run it on a local PC. Even getting just the page and URL dimensions and metrics yields millions of rows for a 16-month period. My purpose in using the API is to export and store the data before it gets deleted by Google, so getting all the relevant information results in billions of rows. We need to find something capable of handling that.
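One way to parallelize the paginated export is a thread pool that fetches several 25k-row pages at once. A minimal stdlib sketch, where `fetch_page` is a stand-in for the real Search Console API call (the actual request code is not shown here):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

PAGE_SIZE = 25_000  # GSC's maximum rowLimit per request

def fetch_page(start_row):
    # Placeholder for one searchanalytics.query call with
    # startRow=start_row; here it just fabricates row IDs.
    return [start_row + i for i in range(PAGE_SIZE)]

offsets = range(0, 4 * PAGE_SIZE, PAGE_SIZE)  # four pages in flight
all_rows = []
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(fetch_page, off) for off in offsets]
    for future in as_completed(futures):
        all_rows.extend(future.result())
```

Running locally avoids Colab runtime expiry, but the API's own rate limits still bound throughput, so keep `max_workers` modest.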
One more thing. I want to get the same graph as in Search Console: how can I export the site-level impression (CTR, clicks) data? As far as I can see, this tutorial only gets URL-level data.
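To reproduce the site-wide Search Console graph, the trick is to query with only the "date" dimension and omit "page", so the API returns property-level daily totals. A sketch of the request body (the dates are placeholders, and `service`/`site_url` are assumed to come from the usual googleapiclient setup):

```python
# With only the "date" dimension, the Search Console API aggregates
# clicks/impressions/CTR/position at the property level per day,
# matching the overview graph in the Search Console UI.
request_body = {
    "startDate": "2023-01-01",   # placeholder dates
    "endDate": "2023-12-31",
    "dimensions": ["date"],      # omit "page" to avoid per-URL rows
    "rowLimit": 25000,
}
# response = service.searchanalytics().query(
#     siteUrl=site_url, body=request_body).execute()
```

Note that property-level aggregation counts an impression once per property rather than once per page, so these totals will not simply equal the sum of the URL-level rows.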
I will watch all the videos and will also share my findings along the way.
Appreciated
Kindly make a video on click gap analysis at the page level and query level using Python.
Can you expand a little bit on this? What do you mean when you say click gap analysis?
@@TheMihirNaik Click gap analysis is segmenting URLs by performance changes:
1. URLs losing clicks
2. URLs losing clicks and impressions
3. URLs losing clicks, impressions, and average positions
See which URLs that lost clicks are no longer indexed.
Blend with internal links and indexing data.
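The segmentation described above is a straightforward two-period comparison in pandas. A minimal sketch with made-up data (in practice the two frames would be GSC exports for the comparison periods):

```python
import pandas as pd

# Hypothetical GSC exports for two comparison periods
prev = pd.DataFrame({"page": ["/a", "/b", "/c"],
                     "clicks": [100, 50, 10],
                     "impressions": [1000, 800, 300]})
curr = pd.DataFrame({"page": ["/a", "/b", "/c"],
                     "clicks": [120, 20, 0],
                     "impressions": [1100, 400, 350]})

df = prev.merge(curr, on="page", suffixes=("_prev", "_curr"))
df["click_delta"] = df["clicks_curr"] - df["clicks_prev"]
df["impr_delta"] = df["impressions_curr"] - df["impressions_prev"]

# Segment 1: URLs losing clicks
losing_clicks = df[df["click_delta"] < 0]
# Segment 2: URLs losing both clicks and impressions
losing_both = df[(df["click_delta"] < 0) & (df["impr_delta"] < 0)]
```

Joining the result with indexing status and internal-link counts (from a crawl or the URL Inspection API) then flags pages that lost clicks because they dropped out of the index.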
@@connectdigital.official Okay, sure. I can do that. I'm thinking of making it available in a tool itself, so it can be self-serve instead of relying on a script.
Why aren't you using DataForSEO instead of this API?
I plan to. It's on my to-do list to try it out.
This was a really wonderful learning experience. Thank you @mihir
Glad it was helpful!