Scaling our Laravel app after a flash sale took down our MySQL database

  • Published May 18, 2024
  • #webdevelopment #laravel #developer #devops #mysql #redis #caching
    In this video I talk about how a sudden traffic spike (i.e. a flash sale) took down our database and Laravel app. Learn how we scaled our app with an effective Redis caching strategy.
    Shopify pod-based architecture: shopify.engineering/a-pods-ar...
    00:00 - Introduction
    00:25 - The camping site
    01:08 - The day of the flash sale
    02:05 - How we got here
    02:47 - Our caching strategy
    04:16 - Final takeaways

COMMENTS • 33

  • @OnlinePseudonym · 1 month ago +4

    Great video. Covered the topic wonderfully.

  • @binaryfire · 1 month ago +1

    This is great content! Please share more insights like this. It'd be great to get more technical details and code examples next time.

  • @phpannotated · 1 month ago +1

    Very nice video! I subscribed 💯

  • @targetx1733 · 1 month ago +4

    Show us how to set it up.

  • @SXsoft99 · 1 month ago +1

    ahhhh, the "MySQL has gone away" problem :))
    add more load balancers

    • @badpussycat · 1 month ago

      How would this help? Also, you can't just load-balance across different MySQL servers. If you really want this you need a cluster (or a master-slave setup, but then you have to configure Laravel to send inserts and updates to the master and selects to the slaves - so not exactly a simple load balancer). A smart caching strategy is way cheaper.
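
The master-slave read/write split described above maps onto Laravel's built-in database configuration. A minimal sketch, with hypothetical hostnames (not from the video); Laravel routes SELECTs to the `read` hosts and writes to the `write` host:

```php
<?php
// config/database.php (excerpt) - hostnames are placeholders for illustration.
return [
    'connections' => [
        'mysql' => [
            'driver' => 'mysql',
            // SELECT queries are spread across the replicas...
            'read' => [
                'host' => ['replica-1.example.internal', 'replica-2.example.internal'],
            ],
            // ...while INSERT/UPDATE/DELETE always hit the primary.
            'write' => [
                'host' => ['primary.example.internal'],
            ],
            // 'sticky' lets a request read records it wrote earlier in the same request.
            'sticky' => true,
            'database' => env('DB_DATABASE', 'app'),
            'username' => env('DB_USERNAME', 'app'),
            'password' => env('DB_PASSWORD'),
        ],
    ],
];
```

Note this only spreads read load on the application side; replication itself still has to be set up on the MySQL side (or delegated to something like RDS read replicas), which is why a caching layer is often the cheaper first step.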

    • @sabatino_masala · 25 days ago

      Agreed! Even though Amazon RDS makes read/write replicas easy nowadays, good caching strategies are often way cheaper.

  • @shadyarbzharothman8689 · 1 month ago

    Thanks for the video. By the way, I'm working on a Laravel multi-tenancy database-per-user application; where's the best place to host it and keep things easy for me?

    • @sabatino_masala · 1 month ago +1

      I'm a big fan of Laravel Forge! For a database-per-user application I'd look into a managed AWS RDS instance. We recently migrated off a self-managed database to AWS RDS and never looked back. Our setup is as follows:
      - We have a load balancer + 3 application servers on Laravel Forge (with Linode/DigitalOcean as the provider)
      - Our 3 application servers connect to a managed RDS MySQL database
      The latency between Linode & AWS is negligible, but we could opt to migrate everything to AWS EC2 so it lives 'closer' to our RDS database.

    • @shadyarbzharothman8689 · 1 month ago

      @@sabatino_masala Thank you very much

  • @Jelle-DV · 1 month ago

    Thanks a lot for making these videos! They're really useful for understanding the thought process and the insights you've gained from the challenges you faced.
    Do you have experience with autoscaling Laravel apps, e.g. Laravel Vapor/AWS Lambda or Kubernetes?

    • @sabatino_masala · 1 month ago

      Thanks! I've dabbled with Laravel Vapor, but for most of my projects at scale I use AWS EKS.

  • @dsmncihagt8187 · 1 month ago

    What resources did the VPS have? CPU, RAM and storage?

    • @sabatino_masala · 1 month ago

      It was a Linode VPS: 16 GB of RAM, 6 CPUs and 320 GB of SSD storage (~$100/month).

  • @macctosh · 21 days ago +1

    How are you different from Shopify?

    • @sabatino_masala · 21 days ago +1

      We're focused on restaurants & snack bars (fast food). Our system is built specifically for these use cases. I'll give you an example: in Belgium it's required to have a registered POS. We have such a license (and had to jump through a massive number of hoops to get it); Shopify does not.
      If we were to expand into retail, we would get into Shopify territory. Currently we're in a different niche.

    • @macctosh · 21 days ago

      @@sabatino_masala Okay cool!

  • @1234matthewjohnson · 1 month ago

    What's the site's domain?

  • @DailyTechShot · 1 month ago

    Foodticket? :P

  • @mohamednaser4265 · 1 month ago

    40 queries per page? You should start from there

    • @sabatino_masala · 1 month ago +1

      You’d be surprised how fast you get there 😅

    • @mohamednaser4265 · 1 month ago

      @@sabatino_masala I worked on a multi-tenant ecommerce app before with PostgreSQL (DB schema per merchant); it's 5-7 queries max.
      The max query count I hit was on creating orders: it was 11, then I minimized it to 7 as well, but I had to rewrite my forms without using the built-in solution. Maybe you've got something more advanced.

    • @sabatino_masala · 24 days ago

      These things drive our queries up:
      - domain resolving
      - loading products
      - loading product availability
      - open timeslots
      - taken timeslots (for disabled state)
      - delivery regions & configuration
      - previous orders (‘ordered before’ functionality)
      - most popular products
      - upselling products
      - active promotions
      … the list goes on
      A decent caching strategy was so important for us to be able to scale.
      Impressive you got the job done in 5-7 queries! Our use case simply didn’t allow low query counts without caching mechanisms & async calls.
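
A strategy like the one described above is typically built on Laravel's cache facade backed by Redis. A minimal sketch; the key name, TTL, and the `Product` model are illustrative assumptions, not the author's actual code:

```php
<?php

use App\Models\Product; // hypothetical Eloquent model
use Illuminate\Support\Facades\Cache;

// Cache a shop's most popular products for 10 minutes so repeated page
// loads during a flash sale don't each hit MySQL.
function popularProducts(int $shopId): array
{
    return Cache::remember("shop:{$shopId}:popular-products", 600, function () use ($shopId) {
        // Runs only on a cache miss; with CACHE_DRIVER=redis the result
        // is stored in Redis and shared across all application servers.
        return Product::where('shop_id', $shopId)
            ->orderByDesc('orders_count')
            ->limit(10)
            ->get()
            ->toArray();
    });
}
```

Each item on the list above (timeslots, delivery regions, promotions, ...) can get its own key and TTL, so hot data is served from Redis and MySQL only sees the misses.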

    • @mohamednaser4265 · 24 days ago

      @@sabatino_masala As I thought, you've got something more advanced.
      I was using Redis to resolve the domain to a schema name,
      and was caching best-selling products, categories and shipment regions.
      Product listing was just one query - one big query with subqueries - but it worked fine.
      Guess I'll need to check your ideas to see what I'm missing.
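
The domain-to-schema lookup mentioned here could be sketched like this; the `tenants` table and its column names are assumptions for illustration:

```php
<?php

use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\DB;

// Resolve an incoming request's domain to a tenant schema name,
// with Redis (via the cache facade) in front of the tenants table.
function schemaForDomain(string $domain): ?string
{
    return Cache::remember("tenant:domain:{$domain}", 3600, function () use ($domain) {
        return DB::table('tenants')
            ->where('domain', $domain)
            ->value('schema_name');
    });
}
```

One caveat: `Cache::remember` treats a `null` result as a miss, so unknown domains get looked up again on every request; caching a sentinel value avoids that if it matters.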

  • @HideBuz · 1 month ago

    User ID and user data caching is reaaaaally standard; if you didn't do that, then you are doing it wrong.

    • @sabatino_masala · 1 month ago +2

      Not really, because we were building features fast instead of worrying about caching 👍

    • @HideBuz · 1 month ago +2

      @@sabatino_masala There's a difference between lean and bad design. You mentioned the app was 10 years old and worked fine, which means you shipped it to production, not as a proof of concept. I didn't mean this as harsh criticism; we all learn. Live and learn.