How to Identify Rankable Content Topics in minutes? SEO Automation

  • Published 4 Nov 2023
  • I use the Google Autocomplete API to gather search queries for a keyword, the ValueSERP API to scrape the search results, and my custom SERP-based Clustering API to get the clustered keywords, then use Plotly Express to visualize them.
    Finally, I use the OpenAI API (GPT-4) to generate a content brief based on the clusters you just created. A rough sketch of the first steps of this pipeline is included below.
  • Science & Technology
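
  A minimal sketch of the first two steps of that pipeline (gathering autocomplete suggestions and scraping SERPs), assuming the public Google Autocomplete endpoint and ValueSERP's /search endpoint; the API key and seed keyword are placeholders, and the clustering, Plotly, and GPT-4 steps are left out.

    import requests

    def get_autocomplete_suggestions(seed_keyword):
        # Google's (unofficial) autocomplete endpoint returns a JSON array:
        # [query, [suggestion1, suggestion2, ...], ...]
        resp = requests.get(
            "https://suggestqueries.google.com/complete/search",
            params={"client": "firefox", "q": seed_keyword},
        )
        resp.raise_for_status()
        return resp.json()[1]

    def get_serp_results(keyword, api_key):
        # ValueSERP search endpoint; parameter names follow its documented API.
        resp = requests.get(
            "https://api.valueserp.com/search",
            params={"api_key": api_key, "q": keyword},
        )
        resp.raise_for_status()
        return resp.json()

    seed = "standing desk"  # placeholder seed keyword
    suggestions = get_autocomplete_suggestions(seed)
    # Limit to a few suggestions here just to keep the example cheap.
    serps = {kw: get_serp_results(kw, "YOUR_VALUESERP_KEY") for kw in suggestions[:5]}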

COMMENTS • 16

  • @Wise.Webmaster
    @Wise.Webmaster 9 months ago

    @mihir This is pure gold. Thanks a lot for publishing this. It is super useful. Keep doing such brilliant work. Sending lots of positive vibes from Mumbai, India.

    • @TheMihirNaik
      @TheMihirNaik 9 months ago

      Thanks Sameer. Appreciate the comment.

  • @connectdigital.official
    @connectdigital.official 7 months ago +1

    Also, please make a video on click loss vs. click gains, because SEO audits consume a lot of time and this kind of automation helps a lot of technical SEOs.

  • @richardoravkin6199
    @richardoravkin6199 2 months ago

    Hey, great video! Is it possible to access this Colab? I couldn't find it on your site. Thanks

  • @kamal-allazov
    @kamal-allazov 9 months ago

    Thanks Mihir,
    I really want more content like this. Stay motivated and never stop, man! :)
    Thanks for sharing Python SEO-related tutorials 🚀

    • @TheMihirNaik
      @TheMihirNaik 9 months ago

      Thanks Kamal. I will try my best to put out more content. What do you think would be most helpful? Any topics or ideas you'd like to see more of?

    • @kamal-allazov
      @kamal-allazov 9 months ago

      @@TheMihirNaik Yesterday I tried your GSC +25k rows export code.
      I found it really helpful. I recommend adding a progress bar while the code is exporting (my runtime expired and I waited a long time).
      In other scripts I used "from tqdm import tqdm" and wrapped the loop with it (for example, for url in tqdm(site_urls)...). It gives a progress percentage and a bar to track; a sketch of this is included below.
      Additionally, we should find a parallel-requesting method to make it faster. Maybe Colab isn't suited for it, but any solution to run it on a local PC would help. Even when fetching just the page and URL dimensions and metrics, it returns millions of rows for a 16-month period. My purpose in using the API is to export and store the data before Google deletes it, so getting all the relevant information results in billions of rows, and we need to find something capable of handling that.
      One more thing: I want to reproduce the same graph as in Search Console. How can I export the site-level impression (CTR, clicks) data? As far as I can see, this tutorial only fetches URL-level data.
      I will watch all the videos and share my findings along the way.
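
      A rough sketch of that progress bar around the paginated export, assuming the service object is an authorized Search Console client built with googleapiclient (as in the original Colab) and site_url is your verified property; the function and variable names here are illustrative, not the author's code.

        from tqdm import tqdm

        def export_all_rows(service, site_url, start_date, end_date,
                            dimensions=("page", "query"), row_limit=25000):
            all_rows = []
            start_row = 0
            # Total row count is unknown up front, so tqdm just counts batches.
            with tqdm(desc="Exporting GSC rows", unit="batch") as pbar:
                while True:
                    body = {
                        "startDate": start_date,
                        "endDate": end_date,
                        "dimensions": list(dimensions),
                        "rowLimit": row_limit,   # 25,000 is the per-request maximum
                        "startRow": start_row,
                    }
                    response = service.searchanalytics().query(
                        siteUrl=site_url, body=body).execute()
                    rows = response.get("rows", [])
                    if not rows:
                        break
                    all_rows.extend(rows)
                    start_row += len(rows)
                    pbar.update(1)
            return all_rows

      On the site-level graph: dropping the "page" dimension and querying only the "date" dimension returns property-level clicks, impressions, and CTR per day, which matches the numbers behind the Search Console performance chart.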

  • @connectdigital.official
    @connectdigital.official 7 months ago

    Appreciated.
    Kindly make a video on click gap analysis at the page level and query level using Python.

    • @TheMihirNaik
      @TheMihirNaik 5 months ago +1

      Can you expand a little bit on this? What do you mean when you say click gap analysis?

    • @connectdigital.official
      @connectdigital.official 5 months ago +1

      @@TheMihirNaik Click gap is segmenting URLs by performance changes:
      1 - URLs losing clicks
      2 - URLs losing clicks and impressions
      3 - URLs losing clicks, impressions and average position
      See which URLs lost clicks because they are no longer indexed, and blend that with internal links and indexing data (a sketch of this segmentation follows below).
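
      A hedged pandas sketch of that segmentation, assuming two DataFrames exported from GSC for the previous and current period with columns "page", "clicks", "impressions", and "position"; all names here are illustrative, not the author's method.

        import pandas as pd

        def click_gap_segments(prev, curr):
            # Outer-join the two periods; pages absent in one period count as zero.
            merged = prev.merge(curr, on="page", how="outer",
                                suffixes=("_prev", "_curr")).fillna(0)
            merged["click_delta"] = merged["clicks_curr"] - merged["clicks_prev"]
            merged["impr_delta"] = merged["impressions_curr"] - merged["impressions_prev"]
            # Average position grows when ranking worsens, so a positive delta is a drop.
            merged["pos_delta"] = merged["position_curr"] - merged["position_prev"]

            def segment(row):
                if row["click_delta"] < 0 and row["impr_delta"] < 0 and row["pos_delta"] > 0:
                    return "losing clicks, impressions and position"
                if row["click_delta"] < 0 and row["impr_delta"] < 0:
                    return "losing clicks and impressions"
                if row["click_delta"] < 0:
                    return "losing clicks"
                return "stable or gaining"

            merged["segment"] = merged.apply(segment, axis=1)
            return merged

      The resulting segments can then be joined against indexing and internal-link data (for example, a crawl export) to spot URLs that lost clicks because they dropped out of the index.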

    • @TheMihirNaik
      @TheMihirNaik 5 months ago +1

      @@connectdigital.official Okay, sure, I can do that. I'm thinking of making it available in a tool itself, so it can be self-served instead of relying on a script.

  • @black-pebble
    @black-pebble 4 months ago +1

    Why aren't you using DataForSEO instead of this API?

    • @TheMihirNaik
      @TheMihirNaik 4 months ago

      I plan to. It's on my to-do list to try it out.

  • @SreevathsaBV
    @SreevathsaBV 9 months ago

    This was a really wonderful learning experience. Thank you @mihir