How To Get High Quality Data For Your Website Directory

  • Published Dec 28, 2024

COMMENTS • 73

  • @FreyChu
    @FreyChu  1 day ago +7

    Edit: Ok so it appears that a lot of you have smarter ways to automate data enrichment, which is awesome. My email and twitter/X DMs are open for anyone who is willing to chat about automating this step of the process! Would love to learn from you.
    Before you start building your directory website, making a logo, or choosing fonts in WordPress... here's the most important part of the whole directory-building process.
    If anyone discovers a better way to clean data, please share it below!
    Subscribe to the directory newsletter: shipyourdirectory.kit.com/
    P.S. sorry for the bird that started chirping mid-video.

    • @joshuadav10
      @joshuadav10 33 minutes ago

      Please make an updated video if you get some good info! This was amazing, and I'd love to learn more!

  • @MinhNguyen-wd2wc
    @MinhNguyen-wd2wc 12 hours ago +2

    YOOOO WHATEVER YOU DO PLEASE PLEASE PLEASE FINISH OUT THIS SERIES, from validating to building the website to programmatic SEO to taking the #1 Google Search result. YOU ARE THE MAN

  • @camiloaquino4466
    @camiloaquino4466 1 day ago +3

    Good job! Thanks for taking the time to share and teach us what you know. Looking forward to seeing how your directory turns out.

  • @hashim.pakara
    @hashim.pakara 11 hours ago

    Man, you came again with gold mine 🎉 thanks ❤

  • @marcusedvalson
    @marcusedvalson 1 day ago +2

    Thanks man, great video.
    A couple thoughts:
    1. The more data you have in your directory, the bigger the moat. While parsing through a massive number of listings is a lot of work, it also makes the data more valuable.
    2. For the handful of directories I have, APIs have been incredible. Sometimes people have already collected the data and offered it publicly. Your job is then to present the data through a new lens. Yesterday I scraped data for 615 movies, and am building a directory in a movie niche. I used Claude to build a script that scraped from multiple data sources and combined them. I just told Claude in plain language what I wanted. Worked well (rough sketch of the idea below).
    3. My hypothesis is that part of the added value of a directory is someone knowing that a real person created it, not some programmatic bot spinning up a website. This perception adds value to the data.
    I am looking forward to your video on getting traffic.
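    For illustration, here is a minimal pandas sketch of the combining step; the file names, columns, and join key are made up for the example, not the actual script Claude produced:

    # Illustrative only: merging two hypothetical scraped CSVs on a shared title key.
    import pandas as pd

    source_a = pd.read_csv("movies_source_a.csv")    # e.g. title, year, genre
    source_b = pd.read_csv("movies_source_b.csv")    # e.g. title, rating, review_count

    # Normalize the join key so "The Matrix " and "the matrix" line up.
    for df in (source_a, source_b):
        df["title_key"] = df["title"].str.strip().str.lower()

    combined = source_a.merge(
        source_b.drop(columns=["title"]),
        on="title_key",
        how="left",    # keep every row from the primary source
    )
    combined = combined.drop_duplicates(subset="title_key").drop(columns=["title_key"])
    combined.to_csv("directory_listings.csv", index=False)
    print(f"Wrote {len(combined)} combined rows")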

    • @FreyChu
      @FreyChu  23 hours ago +1

      Solid points.
      1. Yes, I agree, but I noticed a larger data set leads me down the programmatic directory path, where data is dynamically pulled via a WordPress plugin using shortcodes. It's way easier to publish (and arguably offers better UX), but I've noticed it doesn't rank as well on Google compared to fully static pages.
      2. That's dope! I need to play around with Claude more. Did you still need to clean the data after this? Or was the data from Claude already clean?
      3. 100% agree, there are some things I do to generate human-sounding descriptions for each directory listing so it's valuable and avoids AI-sounding text and plagiarism. Will cover this in future vids.

    • @nathann6482
      @nathann6482 23 hours ago

      Your number 1 response is cap bro. Purely your experience - no data to back it up. Others actually have the opposite view.

    • @FreyChu
      @FreyChu  22 hours ago +1

      @@nathann6482 Oh yeah, I'm 100% just talking from my personal experience. I'm not saying pSEO doesn't work at all. It's worked for me and I'm a fan. Just pointing out that those dynamic pages, compared to my static pillar page content, haven't ranked as well for my target keywords.
      Honestly it just depends on the niche. This case study is aiming for high traffic/display ad monetization. Open to hearing your experience with pSEO projects!

    • @marcusedvalson
      @marcusedvalson 20 hours ago

      @@FreyChu yeah I had to clean the data up, especially since it was from multiple sources. It is a non-linear process. I refined some of it myself, passed it to ChatGPT, etc. I went through a couple dozen iterations of my CSV before I was happy. Now I will pay my daughter to generate 600 Amazon affiliate links to fill in one of the columns. Tedious work, but tedious is another moat 😎

    • @chrisb.9856
      @chrisb.9856 3 hours ago

      Thank you for sharing your process!

  • @BWBGarage
    @BWBGarage 1 day ago +6

    I just did a directory that started with 25,000 rows and refined it to roughly 1,800. I was able to automate the data enrichment with Google Sheets add-ons. I think you could automate your review scraping pretty easily and clean the data with clever AI prompting. If you want some help, I'd be happy to give some pointers.
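    If you're comfortable outside Sheets, the same refinement step can be sketched in pandas; the column names and thresholds below are just placeholders for whatever your scrape returns:

    # Rough sketch of the raw-scrape -> refined-list step; columns and thresholds are hypothetical.
    import pandas as pd

    raw = pd.read_csv("raw_scrape.csv")    # e.g. 25,000 rows straight from the scraper

    cleaned = (
        raw.dropna(subset=["name", "address"])              # drop rows missing core fields
           .drop_duplicates(subset=["name", "address"])     # collapse duplicate listings
           .query("review_count >= 5 and rating >= 3.5")    # keep listings with real signal
           .assign(phone=lambda df: df["phone"].str.replace(r"\D", "", regex=True))  # digits only
    )

    cleaned.to_csv("cleaned_listings.csv", index=False)
    print(f"{len(raw)} raw rows -> {len(cleaned)} cleaned rows")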

    • @yoyoma4424
      @yoyoma4424 1 day ago +3

      I would love this as well!

    • @AwesomeCameras
      @AwesomeCameras 1 day ago +3

      I did something similar as well, just loading the CSV into Claude AI and having it clean the data up for me. Much better for large datasets.
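      If the file is too big for the chat UI, the same idea can also be sketched through the Anthropic API; this is only an illustration, and the model name, prompt, and file names are placeholders:

      # Sketch of cleaning a CSV chunk through the Anthropic API instead of the chat UI.
      import anthropic

      client = anthropic.Anthropic()    # reads ANTHROPIC_API_KEY from the environment

      with open("raw_chunk.csv", encoding="utf-8") as f:
          chunk = f.read()    # send the CSV in chunks that fit the context window

      message = client.messages.create(
          model="claude-3-5-sonnet-latest",    # placeholder: use whichever current model
          max_tokens=4096,
          messages=[{
              "role": "user",
              "content": "Clean this CSV: dedupe rows, normalize phone numbers, "
                         "drop rows missing a name, and return only the CSV.\n\n" + chunk,
          }],
      )

      with open("cleaned_chunk.csv", "w", encoding="utf-8") as f:
          f.write(message.content[0].text)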

    • @FreyChu
      @FreyChu  1 day ago +4

      Would love pointers on this! I tried to do this with the ChatGPT plugin for Google Sheets and failed lol
      Do you mind messaging me on X or via email at shipyourdirectory@gmail.com?

    • @hans6304
      @hans6304 23 hours ago +1

      Please make a video and share! You'll be a great tutor 💪💪

    • @AwesomeCameras
      @AwesomeCameras 23 hours ago

      @@hans6304 will do! I've actually been planning on doing some videos around some of my projects involving this stuff on my other channel: www.youtube.com/@joeyready
      Will let you know when I get something up on the matter :)

  • @fiftyseventh
    @fiftyseventh 19 hours ago

    Frey with another classic 🔥

  • @Newsinrealestate
    @Newsinrealestate 9 hours ago

    I think your videos are great!!!

  • @luxurycardstore
    @luxurycardstore 1 day ago +1

    Fascinating video. I know even less about spreadsheets than you do, but I still want to learn how to make a directory with recent technology like AI. Approx 20 something years ago I made a local directory for doctors by manually copying and pasting the data. I enjoyed building it, but I eventually let it go because I knew nothing about getting traffic to it. Without traffic, it was a useless project.

    • @FreyChu
      @FreyChu  23 hours ago +1

      It's the best time to start with all this cool new technology. I'm still learning too as you can see. Also, so many great (and lucrative) directory opportunities lie in the healthcare field still.

  • @myjamal89
    @myjamal89 1 day ago +2

    Thanks a lot for the video.
    I'm a web dev. I don't know much about WordPress, but there is a way to fill the data into your website easily using Puppeteer or Cypress.
    Basically, it acts as a robot that controls your browser: it reads your CSV file, fills in the form, and submits the data row by row.
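    For example, here is the same idea sketched with Playwright's Python API instead of Puppeteer; the URL and CSS selectors are placeholders you'd pull from Chrome DevTools, and logging in is left out:

    # Sketch only: a browser robot that reads a CSV and submits one listing per row.
    import csv
    from playwright.sync_api import sync_playwright

    with sync_playwright() as p:
        browser = p.chromium.launch(headless=False)
        page = browser.new_page()
        # ...log in to wp-admin here before looping...

        with open("listings.csv", newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                page.goto("https://example.com/wp-admin/post-new.php?post_type=listing")
                page.fill("#title", row["name"])                # placeholder selector
                page.fill("#listing-address", row["address"])   # placeholder selector
                page.click("#publish")                          # placeholder selector
                page.wait_for_load_state("networkidle")

        browser.close()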

    • @FreyChu
      @FreyChu  1 day ago

      Dude, thanks! Puppeteer looks super interesting. I'd love to save the extra costs when it comes to creating the static directory on WordPress.
      Would you say it's relatively easy to learn as a non-coder?
      Started learning a little JavaScript recently, but literally just started lol

    • @myjamal89
      @myjamal89 23 hours ago

      It's not hard, especially if you use Claude or ChatGPT to write the code for you.

    • @myjamal89
      @myjamal89 8 hours ago

      @@FreyChu Yes, I think you can succeed in doing it with the help of Claude or ChatGPT.
      Note: learn about selectors in Chrome DevTools, you will need them.

    • @myjamal89
      @myjamal89 6 hours ago

      Yes, you can do it if you use some AI to help you write the code. You need to learn about selectors in Chrome DevTools.

  • @mmxcrono
    @mmxcrono 2 hours ago

    Hey Frey, thanks for the video! I think Outscraper might be skirting around the Places API TOS by not "caching or storing" the data they send to you and not selling the data, instead selling the service of pulling the data. However, if you use the data you might violate Google's TOS, so you should probably be careful. Anyways, GL!

  • @PinkKoala-k4s
    @PinkKoala-k4s 19 hours ago

    amazing video! :) looking forward to more videos like this! Very interesting.
    If you manage to find a good way to automate this step more, please share it here on youtube! Would be nice to see! Thanks man for the value.

  • @jwalkoviak
    @jwalkoviak 1 day ago +1

    Great video, thanks for sharing all these behind-the-scenes details. This was very helpful. Question: do you also get the photos when you scrape the data? If you do, how do you go about adding them to the directory? Is that a manual or automated process?

    • @FreyChu
      @FreyChu  23 hours ago

      Yeah, I get them from scraping so I don't have to grab them manually. Adding them to the directory can be automated for sure! For a non-coder like me, I hire a WordPress dev to resize the images, compress them, and then add them to my static directory pages.
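      For reference, the resize/compress step the dev handles looks roughly like this with Pillow; the folder names here are just placeholders:

      # Rough sketch of the resize/compress step with Pillow; folder names are placeholders.
      from pathlib import Path
      from PIL import Image

      SRC = Path("scraped_photos")
      DST = Path("optimized_photos")
      DST.mkdir(exist_ok=True)

      for photo in SRC.glob("*.jpg"):
          img = Image.open(photo)
          img.thumbnail((1200, 1200))    # cap the longest side at 1200px, keep aspect ratio
          img.convert("RGB").save(DST / photo.name, "JPEG", quality=80, optimize=True)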

  • @imwilliamzhang2
    @imwilliamzhang2 13 hours ago

    How do you upload the CSV to the website?

  • @triwebdigital8436
    @triwebdigital8436 22 hours ago +1

    Hey @FreyChu, since you mentioned copying and pasting the data from Google Maps listings into the directory: is it allowed to copy that data, like reviews and images, and store it on the site, or is that against Google's terms? There is no clear answer for this anywhere.

  • @zakhir123
    @zakhir123 9 hours ago

    Hi Frey, what do you think about using a CMS like WordPress compared to dedicated platforms like eDirectory, Brilliant Directories, etc. for building a directory?

  • @Christopher-N7
    @Christopher-N7 1 day ago

    Thanks for the video. How many listings do you recommend launching with to validate the niche? Thanks in advance.

    • @FreyChu
      @FreyChu  23 hours ago +1

      There's not really a set rule for me when it comes to the number of listings. It's all based on the location your directory is targeting and the keyword search volume/difficulty.
      Like, if your keyword research within your niche shows 1,000 monthly searches for "[keyword] los angeles", then I'm going to make sure to add as many quality listings for Los Angeles as I can, to avoid thin content and increase my chances of ranking for that keyword.
      Alternatively, if "[keyword] Long Beach" is only getting 50 monthly searches, then I'd probably not spend as much time creating listings for that city.
      Hope that makes sense!

  • @wacalu
    @wacalu 20 hours ago

    Great video man! Is it okay to use those photos from Google Maps?

  • @byziad
    @byziad 1 day ago +1

    Very nice video! Would be great to have some information regarding Ahrefs alternatives (free or low-cost) since it's $130/month.

    • @FreyChu
      @FreyChu  23 hours ago

      There's an Ahrefs Lite that came out kinda recently. I would probably recommend that if anything.
      I've tried an alternative called KeySearch, but it wasn't nearly as good in my opinion. The data was also completely off compared to Ahrefs (I used it to look up my own videos).
      Unfortunately, you get what you pay for when it comes to these research tools imo.

  • @eNVy100
    @eNVy100 17 hours ago

    Is 122,916 the max it can spit out? I get the same number for every pull

  • @irhapsody2010
    @irhapsody2010 15 hours ago

    I came here after you did a reaction video of a directory video that Greg Isenberg did with another person, and he stressed not spending so much time on it until you justify the traffic. Do you put in all this time and money just to test it?

  • @qasimux
    @qasimux 1 day ago

    Thanks, Frey

  • @yoyoma4424
    @yoyoma4424 1 day ago

    Hi there! Question, is it legal to scrape? I wanna make sure it is. Ty and thanks for your help!

    • @FreyChu
      @FreyChu  23 hours ago

      I believe it's technically a gray area thing (depends on what data you're scraping right), but all this data is public so I lean towards it being totally okay :)

  • @jamestucker4800
    @jamestucker4800 20 hours ago +1

    There is a scraper I bought for about $30 USD which also scrapes all the review keywords.

    • @julianm4500
      @julianm4500 17 hours ago +1

      Drop the link mate!

    • @DougieW
      @DougieW 6 hours ago

      I too would love the link

  • @fiftyseventh
    @fiftyseventh 19 hours ago +1

    why not focus on the major cities first? And then expand… or maybe 10-20 parks in a city

  • @SimonStJohn
    @SimonStJohn 1 day ago

    Frey.. great video 😊 Can't the tags be scraped? Couldn't you scrape comments and then use AI to read both and suggest tags and even write a summary based on the comments?

    • @FreyChu
      @FreyChu  1 day ago +1

      Yeah definitely! Originally I was going to do it this way. I chose "review tags" under parameters when scraping and Outscraper failed to give me the tags ☹️

    • @SimonStJohn
      @SimonStJohn 22 hours ago

      @FreyChu oh, cool. Also, have you tried Apify?

  • @Hobnockers
    @Hobnockers 1 day ago

    Hey Frey, that's cool. But I wanted to make sure I understand what you are trying to do here.
    When you build this directory, you basically compete against the Google Maps directory?
    And you try to build your directory as a WordPress website that's better and more user-friendly than Google Maps by providing a more refined or accurate search?
    Is this the idea and your business model?

    • @jrgzz
      @jrgzz 1 day ago +1

      I think you're right on the money, and this is likely a doomed model. He mentions wanting to list the whole country's dog parks… But Google is already doing that better than a one-man team with limited tech ability ever could. Niching down is the way to provide actual value.

    • @FreyChu
      @FreyChu  23 hours ago

      Yeah, you've got it, that's the big picture. Identify where Google Maps' shortcomings are for high-search-volume keywords, then build out a directory whose core strength is filling that need.
      A lot of it is also formatting the information on a website better than Google Maps can.
      It's a simple idea, but not always easy. The success hinges on the competitiveness of the space. In the last video, the #1 ranking directory for "dog park near me" was getting 20k monthly searches. That's a good amount of people looking beyond Google Maps for what they need!

    • @FreyChu
      @FreyChu  23 hours ago +1

      @@jrgzz Niching down is always solid. The issue is how you plan to monetize it! If you niche down too much and your game plan is monetizing through display ads, that can be rough.
      I agree that dog parks is a really big niche... maybe not niched down enough, too.
      But I disagree that Google is doing a better job mapping out dog parks. Based on the search volume other dog park directories are getting, plus the social validation from Reddit, I still think there's an opportunity here.

  • @EnergiesLives
    @EnergiesLives 1 day ago +1

    So when building a directory, will I need permission from the businesses to put them on the directory?

    • @AwesomeCameras
      @AwesomeCameras 1 day ago +1

      There shouldn't be any legal issues with including businesses on your directory

    • @FreyChu
      @FreyChu  23 hours ago +1

      Nope, you don't need permission. They're only benefiting from an SEO perspective, because you're creating a citation for these local businesses.

    • @EnergiesLives
      @EnergiesLives 21 hours ago +1

      @@FreyChu looking forward to your course! ty

  • @gameplay6751
    @gameplay6751 1 day ago +1

    Thanks a lot for sharing all this information. I do the same thing you do for building directories: automated WordPress websites built with Python and scraped data. If you want to collaborate, I will help. With Python you can make the process much easier and faster (you can also clean the data to your requirements and upload it directly to WordPress, for a fully automated site) 🤝
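    The upload step can go through the WordPress REST API; this is only a sketch, assuming the standard /wp-json/wp/v2/posts endpoint, an application password, and placeholder column names:

    # Sketch of pushing cleaned CSV rows into WordPress through the REST API.
    # A real directory would likely target a custom post type instead of plain posts.
    import csv
    import requests

    WP_URL = "https://example.com/wp-json/wp/v2/posts"
    AUTH = ("api_user", "application-password-here")    # placeholder credentials

    with open("cleaned_listings.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            payload = {
                "title": row["name"],
                "content": f"{row['address']}<br>{row['description']}",
                "status": "draft",    # review before publishing
            }
            resp = requests.post(WP_URL, json=payload, auth=AUTH, timeout=30)
            resp.raise_for_status()
            print("Created post", resp.json()["id"])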

    • @jwalkoviak
      @jwalkoviak 1 day ago

      Do you use geodirectories for your sites?

    • @FreyChu
      @FreyChu  23 hours ago

      I'd love this, can you send me a message on X or email me at shipyourdirectory@gmail.com? Would love to learn how you're automating this step

  • @shivanshdubey1189
    @shivanshdubey1189 15 hours ago

    Bro, please teach from scratch because I can't understand which panel you are using; it doesn't look like cPanel.

  • @xnegusx
    @xnegusx 21 hours ago +2

    Waste of money, Apify is free.

    • @ChetanRao
      @ChetanRao 1 hour ago

      ... for $10 worth of scrapes, yeah, it's free. Try getting all the rows of data you want with that budget.

  • @Newsinrealestate
    @Newsinrealestate 9 hours ago

    Dumb question… when you are getting info.. why are you paying?

    • @ChetanRao
      @ChetanRao 1 hour ago

      It costs money for the scraper to run. Someone has to set up the software, pay for the servers, APIs, etc etc. You can dramatically reduce those costs if you know how to code and work with the Google Maps API yourself.
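      A minimal sketch of what "working with the API yourself" can look like, using the Places Text Search endpoint; you bring your own API key, and requests are billed past the free tier:

      # Minimal sketch of pulling listings straight from the Places Text Search API.
      import requests

      API_KEY = "YOUR_GOOGLE_MAPS_API_KEY"    # placeholder
      url = "https://maps.googleapis.com/maps/api/place/textsearch/json"

      resp = requests.get(url, params={"query": "dog park in Los Angeles", "key": API_KEY}, timeout=30)
      resp.raise_for_status()

      for place in resp.json().get("results", []):
          print(place.get("name"), "|", place.get("formatted_address"), "|", place.get("rating"))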