How I Use Data Pipelines in my Web Scrapers

  • Published 15 Oct 2024
  • Check Out ProxyScrape here: proxyscrape.co...
    ➡ WORK WITH ME
    johnwr.com
    ➡ COMMUNITY
    / discord
    / johnwatsonrooney
    ➡ PROXIES
    proxyscrape.co...
    ➡ HOSTING
    m.do.co/c/c7c9...
    If you are new, welcome. I'm John, a self-taught Python developer working in the web and data space. I specialize in data extraction and automation. If you like programming and web content as much as I do, you can subscribe for weekly content.
    ⚠ DISCLAIMER
    Some or all of the links above are affiliate links. If you click one of these links, I receive a small commission should you choose to purchase any services or items.
    This video was sponsored by ProxyScrape.

COMMENTS • 12

  • @alexdin1565
    @alexdin1565 2 months ago +3

    Hi John, I have a question: can we use Scrapy with Django? I mean, make the web scraper an online tool?

    • @RicardoPorteladaSilva
      @RicardoPorteladaSilva 2 months ago +3

      I think you could create a script that scrapes separately and loads the result into the Django database, so the scraping and the processing happen at separate times (see the sketch after this thread). I hope you understand my English, I'm from Brazil, learning English. If you need anything more specific, please feel free to get in touch. It's a great pleasure to help you.

    • @JohnWatsonRooney
      @JohnWatsonRooney  2 months ago +1

      this is pretty much it!

    • @HitAndMissLab
      @HitAndMissLab 2 months ago

      @RicardoPorteladaSilva what is the advantage of using the Django DB?
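
For the scrape-separately approach described in this thread, here is a minimal sketch, assuming a SQLite-backed Django project and a hypothetical products_product table that Django's migrations created; the scraper is just a separate process that inserts rows into the same database file the site reads.

```python
# Standalone loader: runs outside Django, writes into the database Django reads.
import sqlite3

def save_items(items, db_path="db.sqlite3"):
    """Insert scraped items into the SQLite file the Django app uses."""
    conn = sqlite3.connect(db_path)
    with conn:  # commits on success, rolls back on error
        conn.executemany(
            "INSERT INTO products_product (name, price) VALUES (?, ?)",
            [(item["name"], item["price"]) for item in items],
        )
    conn.close()

if __name__ == "__main__":
    # In practice these rows would come from the spider's output (JSON, CSV, etc.).
    save_items([{"name": "Example Widget", "price": 9.99}])
```

Run on a schedule (cron, for example), this keeps the scraper independent of the web app, while the Django views simply read whatever is in the table.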

  • @personofnote1571
    @personofnote1571 2 months ago +1

    Great point about separation of concerns. As you stated, the scraper should only be concerned with getting data and saving data.
    I am curious what other use cases would be compatible with Scrapy's pipelines. Would pipelines be a good place for things like “save to this OTHER database”, or “upload to S3”, or “ping this API”?
    Will be diving into this myself soon, but curious about your thoughts here.

    • @JohnWatsonRooney
      @JohnWatsonRooney  2 months ago +1

      Yes, absolutely. You could use an item field to decide whether to upload to X DB or Y DB, and uploading to S3 would certainly fit here too. Pinging an API, you mean like notifying another system? I think that would be a great use case for pipelines (I hadn't thought of that before).
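
A rough sketch of what that could look like as a Scrapy item pipeline; the destination field, the bucket name, and the id key are invented for illustration, and boto3 credentials are assumed to be configured already.

```python
# Item pipeline that routes items on a field and pushes the S3-bound ones to a bucket.
# Enable it via ITEM_PIPELINES in settings.py like any other pipeline.
import json
import boto3

class RoutingUploadPipeline:
    def open_spider(self, spider):
        # One S3 client for the whole crawl.
        self.s3 = boto3.client("s3")

    def process_item(self, item, spider):
        # An item field decides where this item should end up.
        if item.get("destination") == "s3":
            self.s3.put_object(
                Bucket="my-scraped-data",                # hypothetical bucket
                Key=f"{spider.name}/{item['id']}.json",  # hypothetical key scheme
                Body=json.dumps(dict(item)),
            )
        # A webhook/API notification for another system could go here too.
        return item
```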

  • @piercenorton1544
    @piercenorton1544 2 months ago +1

    What if we want to take the full page so we can give it to an LLM to parse? For example, say we were parsing financial filings or contracts and wanted chunks or pages to pass to an LLM for structured outputs.
    I think splitting the text on a tag and then joining the items together would be best, but maybe there is a better way.
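
One way to try the split-on-a-tag idea, using parsel (the selector library Scrapy is built on); the paragraph tag and chunk size are assumptions you would tune for filings or contracts.

```python
# Pull the text out of <p> tags and group it into roughly max_chars-sized chunks
# that can be passed to an LLM one at a time.
from parsel import Selector

def chunk_page(html: str, max_chars: int = 4000) -> list[str]:
    paragraphs = Selector(text=html).css("p::text").getall()
    chunks, current = [], ""
    for para in paragraphs:
        if current and len(current) + len(para) > max_chars:
            chunks.append(current.strip())
            current = ""
        current += para + "\n"
    if current.strip():
        chunks.append(current.strip())
    return chunks
```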

  • @HitAndMissLab
    @HitAndMissLab 2 months ago +1

    Do you have any videos on how to use proxies in Python?

    • @JohnWatsonRooney
      @JohnWatsonRooney  2 months ago +1

      I don't specifically, but that's a good idea. I will create a video on proxies, including how to use them.
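
In the meantime, a minimal example of routing a request through a proxy with the requests library; the endpoint and credentials are placeholders for whatever your provider (ProxyScrape or otherwise) gives you.

```python
import requests

# Placeholder endpoint and credentials - substitute your provider's details.
PROXY = "http://username:password@proxy.example.com:8000"
proxies = {"http": PROXY, "https": PROXY}

resp = requests.get("https://httpbin.org/ip", proxies=proxies, timeout=10)
print(resp.json())  # should report the proxy's IP address, not yours
```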

  • @jjeffery129
    @jjeffery129 2 months ago

    What's wrong with scraping them as strings and converting them at the end in your output file?
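
Nothing in principle; one argument for converting in a pipeline instead is that the cleaning sits next to the scrape and bad values surface per item rather than in a big post-processing pass. A sketch with hypothetical field names:

```python
from scrapy.exceptions import DropItem

class CleanTypesPipeline:
    def process_item(self, item, spider):
        try:
            # Strip currency symbols/commas and cast to proper numeric types.
            item["price"] = float(str(item["price"]).replace("£", "").replace(",", ""))
            item["stock"] = int(item["stock"])
        except (KeyError, ValueError):
            # Unusable values are dropped here instead of reaching the output file.
            raise DropItem(f"Could not clean item: {item!r}")
        return item
```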

  • @CeratiGilmour
    @CeratiGilmour 2 months ago

    Would it work together with Selenium?

  • @elmzlan
    @elmzlan 2 months ago

    I hope you have a course