Edit: Ok so it appears that a lot of you have smarter ways to automate data enrichment, which is awesome. My email and twitter/X DMs are open for anyone who is willing to chat about automating this step of the process! Would love to learn from you.
Before you start building your directory website, making a logo, choosing fonts in WordPress...here's the most important part of the whole directory building process.
If anyone discovers a better way to clean data, please share it below!
Subscribe to the directory newsletter: shipyourdirectory.kit.com/
P.S. sorry for the bird that started chirping mid-video.
Please make an updated video if you get some good info! This was amazing, and I'd love to learn more!
YOOOO WHATEVER YOU DO PLEASE PLEASE PLEASE FINISH OUT THIS SERIES from validating to building the website to programmatic SEO to taking the #1 Google Search result. YOU ARE THE MAN
Good job! Thanks for taking the time to share and teach us what you know. Looking forward to seeing how your directory turns out.
Man, you came again with gold mine 🎉 thanks ❤
Thanks man, great video.
A couple thoughts:
1. The more data you have in your directory, the bigger the moat. While parsing through a massive number of listings is a lot of work, it also makes the data more valuable.
2. For the handful of directories I have, APIs have been incredible. Sometimes people have already collected the data and offered it publicly. Your job is then to present the data through a new lens. Yesterday I scraped data for 615 movies, and am building a directory in a movie niche. I used Claude to build a script that scraped from multiple data sources and combined it. I just told Claude in plain language what I wanted. Worked well.
3. My hypothesis is part of the added value of a directory is someone knowing that a real person created it. It isn’t some programmatic BOT spinning up a website. This perception adds value to the data.
I am looking forward to your video on getting traffic
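For anyone wanting to try point 2, here is a minimal Python sketch of the combine step: merging records from two scraped sources keyed by title. The field names and rows are made up for illustration, and real scraped data will need more normalization than this.

```python
def merge_sources(primary, secondary, key="title"):
    """Merge two lists of scraped records: start from the secondary
    source's fields and let non-empty primary fields win."""
    by_key = {row[key]: dict(row) for row in secondary}
    merged = []
    for row in primary:
        combined = dict(by_key.get(row[key], {}))
        # only overwrite with primary values that are actually filled in
        combined.update({k: v for k, v in row.items() if v})
        merged.append(combined)
    return merged

# Made-up rows standing in for two scrapers' output
source_a = [{"title": "Alien", "year": "1979", "rating": ""}]
source_b = [{"title": "Alien", "rating": "8.5", "director": "Ridley Scott"}]
rows = merge_sources(source_a, source_b)
print(rows[0])  # one record with fields from both sources
```

The same idea scales to however many sources you have: pick one as the source of truth and fill gaps from the rest.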
Solid points.
1. Yes, I agree, but I noticed a larger data set leads me down the programmatic directory path, where data is dynamically pulled via a WordPress plugin using shortcodes. It's way easier to publish (and arguably offers better UX), but I noticed it doesn't rank as well on Google compared to fully static pages.
2. That’s dope! I need to play around with Claude more. Did you need to clean the data still after this? Or was the data from Claude already looking cleaned?
3. 100% agree, there are some things I do to generate human-sounding descriptions for each directory listing so it's valuable + avoids AI-sounding text and plagiarism. Will cover this in future vids.
Your number 1 response is cap bro. Purely your experience - no data to back it up. Others actually have the opposite view.
@@nathann6482 oh yeah, I'm 100% just talking from my personal experience. I'm not saying pSEO doesn't work at all. It's worked for me and I'm a fan. Just pointing out that those dynamic pages, compared to my static pillar page content, haven't ranked as well for my target keywords.
honestly just depends on the niche. this case study is aiming for high traffic/display ad monetization. Open to hearing your experience with pSEO projects!
@@FreyChu yeah I had to clean the data up, especially since it was from multiple sources. It's a non-linear process. I refined some of it myself, passed it to ChatGPT, etc. I went through a couple dozen iterations of my CSV before I was happy. Now I will pay my daughter to generate 600 Amazon affiliate links to fill in one of the columns. Tedious work, but tedious is another moat 😎
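A single cleaning pass like the one described can be sketched with Python's stdlib csv module: trim whitespace, drop incomplete rows, and dedupe. The column names and rules here are illustrative assumptions, not anyone's actual pipeline.

```python
import csv
from io import StringIO

def clean_rows(reader, required=("name", "city")):
    """One cleaning pass: trim whitespace, drop rows missing
    required fields, and dedupe on (name, city)."""
    seen, out = set(), []
    for row in reader:
        row = {k.strip(): (v or "").strip() for k, v in row.items()}
        if not all(row.get(f) for f in required):
            continue  # skip incomplete rows
        key = (row["name"].lower(), row["city"].lower())
        if key in seen:
            continue  # skip duplicates
        seen.add(key)
        out.append(row)
    return out

# Tiny made-up CSV with a padded name, a duplicate, and a blank name
raw = "name,city\n Sunny Dog Park ,Austin\nSunny Dog Park,Austin\n,Austin\n"
rows = clean_rows(csv.DictReader(StringIO(raw)))
print(len(rows))  # only the one clean row survives
```

In practice you would run several passes like this, eyeballing the CSV between iterations, which matches the "couple dozen iterations" experience above.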
Thank you for sharing your process!
I just did a directory that started with 25,000 rows. Refined to roughly 1800. I was able to automate the data enrichment with google sheets addons. I think you could automate your review scraping pretty easily and clean the data with clever AI prompting. If you want some help I’d be happy to give some pointers.
I would love this as well!
I did something similar as well, just loading the csv into Claude AI, and having it clean the data up for me. Much better for large datasets
Would love pointers on this! I tried to do this with the ChatGPT plugin for google sheets and failed lol
Do you mind messaging me on X or via email at shipyourdirectory@gmail.com?
Please make a video and share! You'll be a great tutor 💪💪
@@hans6304 will do! I've actually been planning on doing some videos around some of my projects involving this stuff on my other channel: www.youtube.com/@joeyready
Will let you know when I get something up on the matter :)
Frey with another classic 🔥
I think your videos are great!!!
Fascinating video. I know even less about spreadsheets than you do, but I still want to learn how to make a directory with recent technology like AI. Approx 20 something years ago I made a local directory for doctors by manually copying and pasting the data. I enjoyed building it, but I eventually let it go because I knew nothing about getting traffic to it. Without traffic, it was a useless project.
It's the best time to start with all this cool new technology. I'm still learning too as you can see. Also, so many great (and lucrative) directory opportunities lie in the healthcare field still.
thanks a lot for the video.
I'm a web dev. I don't know much about WordPress, but there is a way to fill the data into your website easily using Puppeteer or Cypress.
Basically, it will act as a robot that controls your browser, reads your CSV file, fills the form, then submits the data row by row.
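The row-by-row idea above is language-agnostic; this Python sketch only builds the (selector, value) fill steps for each CSV row, with the actual browser driving (Puppeteer, Cypress, or Playwright) left as comments. The selectors and columns are hypothetical.

```python
import csv
from io import StringIO

# Hypothetical CSS selectors for the directory's submission form
FIELD_SELECTORS = {"name": "#listing-name", "city": "#listing-city"}

def fill_actions(row):
    """Turn one CSV row into the (selector, value) fill steps a
    browser bot would perform, ignoring unmapped columns."""
    return [(FIELD_SELECTORS[col], val)
            for col, val in row.items() if col in FIELD_SELECTORS]

raw = "name,city\nSunny Dog Park,Austin\n"
for row in csv.DictReader(StringIO(raw)):
    for selector, value in fill_actions(row):
        # with a real driver: page.type(selector, value) in Puppeteer,
        # or cy.get(selector).type(value) in Cypress
        print(f"fill {selector} with {value!r}")
    # then: click the submit button and wait before the next row
```

This is where the Chrome dev tools tip below comes in: each selector has to be found by inspecting the form.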
Dude thanks! Puppeteer looks super interesting. I'd love to save the extra costs when it comes to creating the static directory on WordPress.
Would you say it’s relatively easy to learn as a non coder?
Started learning a little JavaScript recently but literally just started lol
It's not hard, especially if you use Claude or ChatGPT to write the code for you.
@@FreyChu Yes, I think you can succeed at it with the help of Claude or ChatGPT.
Note: learn about selectors in the Chrome dev tools, you will need them.
Yes, you can do it if you use some AI to help you write the code. You need to learn about selectors in the Chrome dev tools.
Hey Frey thanks for the video! I think outscraper might be skirting around the Places API TOS by not "caching or storing" the data they send to you, and not selling the data, instead selling the service of pulling the data. However if you use the data you might violate Google's TOS, should probably be careful. Anyways GL!
amazing video! :) looking forward to more videos like this! Very interesting.
If you manage to find a good way to automate this step more, please share it here on youtube! Would be nice to see! Thanks man for the value.
Great video thanks for sharing all this behind the scenes details. This was very helpful. Question: Do you also get the photos when you scrape the data? If you do, how do you go about adding that to the directory? is that a manual or automated process?
Yeah I get them from scraping so I don't have to grab them manually. Adding them to the directory can be automated for sure! For a non-coder like me, I hire a WordPress dev to resize the photos, compress the image size, and then add them to my static directory pages.
How do you upload the CSV to the website?
Hey @FreyChu, since you mentioned copy-pasting the data from Google Maps listings into the directory: is it allowed to copy that data, like reviews and images, and store it on the site, or is that against Google's terms? There is no clear answer for this anywhere.
Hi Frey, what do you think about using a CMS like Wordpress compared to using dedicated platforms like edirectory, brilliant directories, etc for building a directory?
Thanks for the video. How many listings do you recommend launching with to validate the niche? Thanks in advance.
There's not really a set rule for me when it comes to number of listings. It's all based on the location your directory is targeting and the keyword search volume/difficulty.
Like, if your keyword research within your niche shows 1000 monthly searches for "[keyword] los angeles", then I'm going to make sure to add as many quality listings for Los Angeles as I can to avoid thin content and increase my chances of ranking for that keyword.
Alternatively, if "[keyword] Long Beach" is only getting 50 monthly searches, then I'd probably not spend as much time creating listings for that city.
Hope that makes sense!
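The volume-based rule of thumb above could be sketched as a tiny helper that skips low-volume cities and scales listing count with searches. Every number and threshold here is made up purely for illustration.

```python
# Hypothetical monthly search volumes from keyword research
volumes = {"los angeles": 1000, "long beach": 50, "san diego": 400}

def listing_targets(volumes, min_volume=100, per_100_searches=2):
    """Rough rule of thumb: skip cities under min_volume, scale the
    listing count with search volume, and always build at least 10
    listings so a page isn't thin."""
    return {
        city: max(10, round(v / 100) * per_100_searches)
        for city, v in volumes.items()
        if v >= min_volume
    }

print(listing_targets(volumes))  # long beach drops out entirely
```

The real decision also weighs keyword difficulty, which a one-liner like this obviously doesn't capture.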
Great video man! Is it okay to use those photos from google maps?
Very nice video! Would be great to have some information regarding Ahrefs alternatives (free or low-cost) since it's $130/month.
There's an Ahrefs Lite that came out kinda recently. I probably would recommend that if anything.
I've tried an alternative called Keysearch but it wasn't nearly as good in my opinion. The data was also completely off compared to Ahrefs (I used it to look up my own videos).
Unfortunately, you get what you pay for when it comes to these research tools imo.
Is 122,916 the max it can spit out? I get the same number for every pull
I came here after you did a reaction video of a directory video that Gregg Isenberg did with another person, and he stressed not spending so much time on it until you justify the traffic. Do you put in all this time and money just to test it?
Thanks, Frey
Hi there! Question, is it legal to scrape? I wanna make sure it is. Ty and thanks for your help!
I believe it's technically a gray area (depends on what data you're scraping, right?), but all this data is public so I lean towards it being totally okay :)
There is a scraper I bought for about $30USD which also scrapes all the review keywords also
Drop the link mate!
I too would love the link
why not focus on the major cities first? And then expand… or maybe 10-20 parks in a city
Frey.. great video 😊 Can't the tags be scraped? Couldn't you scrape comments and then use AI to read both and suggest tags and even write a summary based on the comments?
Yeah definitely! Originally I was going to do it this way. I chose “review tags” under parameters when scraping and Outscraper failed to give me the tags ☹️
@FreyChu oh, cool. Also, have you tried Apify?
Hey Frey, that’s cool. But I wanted to make sure I do understand what you are trying to do here.
When you build this directory, you basically compete against the google maps directory?
And you try to build your directory as a WordPress website that's better and more user-friendly than Google Maps by providing a more refined or accurate search?
Is this the idea and your business model?
I think you’re right on the money, and this is likely a doomed model. He mentions wanting to list the whole country's dog parks…. But Google is already doing that better than a one-man team with limited tech ability could ever do. Niching down is the way to provide actual value.
Yeah you've got it, that's the big picture. Identify where google maps' shortcomings are for high search volume keywords, then build out a directory where the core strength is filling that need.
A lot of it is also formatting the information on a website better than Google maps can too.
It's a simple idea, but not always easy. The success hinges on the competitiveness of the space. In the last video, the #1 ranked directory for "dog park near me" was getting 20k monthly searches. That's a good amount of people looking beyond Google Maps for what they're looking for!
@@jrgzz Niching down is always solid. The issue is how you plan to monetize it! If you niche down too much, and your game plan is monetizing through display ads, that can be rough.
I agree that dog parks is a really big niche...maybe not niched down enough, too.
But I disagree that Google is doing a better job mapping out dog parks. Based on the search volume other dog park directories are getting + social validation from Reddit, I still think there's an opportunity here.
So when building a directory, I will need permission from the businesses to put on the directory?
There shouldn't be any legal issues with including businesses on your directory
Nope, you don't need permission. They're only benefiting from an SEO perspective because you're creating a citation for these local businesses.
@@FreyChu looking forward to your course! ty
Thanks a lot for sharing all this information. I do the same thing you do for building directories: an automated WordPress website, using Python to scrape the data. If you want to collaborate, I will help. With Python you can make the process easier and faster (you can also clean the data to your requirements and upload it directly to WordPress, a fully automated site) 🤝
do you use geodirectories for you sites?
I'd love this, can you send me a message on X or email me at shipyourdirectory@gmail.com? Would love to learn how you're automating this step
Bro, please teach from scratch because I can't understand which panel you're using, it doesn't look like cPanel.
Waste of money, Apify is free.
... for $10 worth of scrapes, yeah, it's free. Try getting all the rows of data you want with that budget.
Dumb question… when you are getting info.. why are you paying?
It costs money for the scraper to run. Someone has to set up the software, pay for the servers, APIs, etc etc. You can dramatically reduce those costs if you know how to code and work with the Google Maps API yourself.
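As a rough illustration of working with the Google Maps Platform directly, here is a stdlib-only sketch that just builds a Places API Text Search request URL. The API key is a placeholder; real use requires a billing-enabled key and staying within the Places API terms of service.

```python
from urllib.parse import urlencode
# from urllib.request import urlopen  # for actually sending the request

BASE = "https://maps.googleapis.com/maps/api/place/textsearch/json"

def places_search_url(query, api_key):
    """Build a Places API Text Search request URL; doing this
    yourself is what replaces the markup a scraping service
    charges on top of Google's own per-request pricing."""
    return f"{BASE}?{urlencode({'query': query, 'key': api_key})}"

url = places_search_url("dog park in Long Beach", "YOUR_API_KEY")
print(url)
# fetching this URL returns JSON with a "results" list of places
```

Each result still costs money per Google's API pricing, so "cheaper" here means cutting out the middleman, not free.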