This Simple Trick Makes My API a LOT Faster

  • Published Dec 10, 2024

COMMENTS • 60

  • @BeyondLegendary
    @BeyondLegendary 1 year ago +41

    Impressive, very nice. Let's see Paul Allen's optimization.

    • @joshtriedcoding
      @joshtriedcoding  1 year ago +7

      dude I really appreciate you for commenting so often. Mhmm very nice, it has such an impressive THICCNESS to it

    • @BeyondLegendary
      @BeyondLegendary 1 year ago +6

      Your compliment was sufficient, Josh.

    • @semyaza555
      @semyaza555 1 year ago

      @@BeyondLegendary Lmfao

    • @joshtriedcoding
      @joshtriedcoding  1 year ago

      @@BeyondLegendary hahaha

  • @filipkovac767
    @filipkovac767 1 year ago +11

    nearly half -> nearly 40% -> in the end, 34%
    just tell us the truth even if it doesn't sound as flashy 📸

  • @DontFollowZim
    @DontFollowZim 1 year ago +10

    Edge servers tend to be weaker virtual machines. Their advantage is proximity, but the weaker hardware can be a significant factor. If your requests are taking 200ms+, it sounds like there's either a decent amount of compute happening, which would make the weaker machines more noticeable, or a decent amount of network I/O, which could be related to what you were showing with the distance between servers and the DB.

    • @joshtriedcoding
      @joshtriedcoding  1 year ago +4

      oh yeah there's definitely some compute, this is nowhere near an empty api route. Just the relative differences between edge and non-edge were a bit surprising to me

  • @kavindesivalli
    @kavindesivalli 1 year ago +5

    Oh damn... another day, another new thing I'm learning from you 👍💪

  • @MikeNugget
    @MikeNugget 1 year ago +4

    Edge adds an additional layer: extra routing, plus the stack and network specifics of the provider. They also do analytics, collect telemetry, and a bunch of other things that can slow down the request.

  • @chiblitheone
    @chiblitheone 1 year ago +3

    Maybe the extra delay is because Edge Functions run on Cloudflare, compared to regular Serverless Functions on AWS. And with both Upstash and PlanetScale running primarily on AWS, the connection inside AWS might be faster.

  • @mateja176
    @mateja176 1 year ago +1

    Conceptually, it acts like a DB transaction. Alternatively, in some cases it would be possible to conditionally merge the operations.

  • @OryginTech
    @OryginTech 1 year ago +4

    I’m confused, isn’t the downside of this that you’d be making unnecessary calls if the 2nd func only needs to be executed conditionally? Yes the route takes less time now, but if you have a very expensive func, you’d be running that constantly.

    • @joshtriedcoding
      @joshtriedcoding  1 year ago +1

      if the condition doesn't run, the command will not be added to the pipeline and will not be executed

    • @OryginTech
      @OryginTech 1 year ago +1

      @@joshtriedcoding maybe I’m missing something, but according to your diagram doesn’t this mean that it needs to anyway wait for the first command to return to then run the second? So how is it different from awaiting the command?

  • @TheTmLev
    @TheTmLev 1 year ago +3

    What you're calling "blocking" requests are not actually blocking, since you use async/await. The correct term is "sequential".

    • @joshtriedcoding
      @joshtriedcoding  1 year ago

      blocking means the client usually waits and doesn't do anything else until it receives the server response, doesn't that block the process/thread reading the response? Sequential sounds like a good term to describe this either way

    • @TheTmLev
      @TheTmLev 1 year ago +1

      @@joshtriedcoding `await` doesn't block the thread, quite the opposite: it allows the thread to process other Promises in the meantime.

    • @11r3start11
      @11r3start11 1 year ago

      @@joshtriedcoding In multithreading, "blocking" means something that blocks a thread and makes it unusable entirely.
      `await` waits for the result but doesn't block the thread, which is the main difference.
      I'd avoid "blocking" terminology in this case, as none of the threads were blocked.
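
The terminology point in this thread can be demonstrated: `await` suspends only the current async function, not the thread, so two awaited calls run sequentially while `Promise.all` lets them overlap. A minimal sketch (the `delay` helper is illustrative, standing in for a network request):

```typescript
// Record the order in which "requests" complete.
const order: string[] = [];
const delay = (ms: number, tag: string) =>
  new Promise<string>((resolve) => setTimeout(() => { order.push(tag); resolve(tag); }, ms));

async function sequential() {
  await delay(30, "a"); // the function suspends here, but the thread stays free
  await delay(10, "b"); // starts only after "a" finished: total ≈ 40ms, order a, b
}

async function concurrent() {
  // both timers run at once: total ≈ 30ms, and the shorter one ("b") finishes first
  await Promise.all([delay(30, "a"), delay(10, "b")]);
}
```

Neither version blocks the thread; the difference is whether the second operation waits for the first to finish.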

  • @akhilscamp1905
    @akhilscamp1905 1 year ago

    Any such feature in axios? I guess we could only opt for SWR for such kinds of optimisations

  • @outroddet
    @outroddet 1 year ago

    Hey, is programming your daily job, or what do you do for a living?

  • @sjain07
    @sjain07 1 year ago +4

    You can configure the Vercel edge locations; then they will definitely be faster than serverless

  • @Gerrilicious
    @Gerrilicious 1 year ago

    Very informative video, thanks for it!

  • @arnhazra
    @arnhazra 1 year ago +1

    Hey Josh, I am using MongoDB with Next.js and it's super slow, 15-20 seconds per request. The same API takes only 500ms in Express or NestJS

    • @joshtriedcoding
      @joshtriedcoding  1 year ago +1

      oh wooow it should not take 15-20 seconds

    • @eliaswennerlund7581
      @eliaswennerlund7581 1 year ago +5

      I don't know the specifics of your code, but something I encountered when using MongoDB and Next.js was that a new connection to the database was initialized on every incoming request. I fixed this by caching the connection. I also believe the type of runtime may cause the same thing to happen, since some runtimes don't support long-lived connections.
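
The connection-caching fix described above can be sketched generically. The `connect` and `getDb` names below are hypothetical stand-ins for an expensive database handshake; with MongoDB you would cache the promise returned by the driver's connect call in module scope the same way, since module scope survives between invocations on a warm serverless instance:

```typescript
// Count how often the expensive handshake actually runs.
let connectCalls = 0;
async function connect(): Promise<{ db: string }> {
  connectCalls++; // stands in for the expensive TCP/TLS + auth handshake
  return { db: "connected" };
}

// Module-scoped cache: the first request connects, later requests reuse the promise.
let cached: Promise<{ db: string }> | undefined;

function getDb(): Promise<{ db: string }> {
  cached ??= connect(); // caching the PROMISE also dedupes concurrent first requests
  return cached;
}
```

Caching the promise rather than the resolved connection means two requests arriving at the same time still trigger only one handshake.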

    • @miguderp
      @miguderp 1 year ago

      Check your functions' location; I believe by default it's set to Washington. You can find that under Settings > Functions on your Vercel project page

    • @PwrXenon
      @PwrXenon 1 year ago

      Standard nextjs user

    • @arnhazra
      @arnhazra 1 year ago

      @@miguderp No, I have set it to nearest; also, I'm talking about my local environment.

  • @shivanshubisht
    @shivanshubisht 1 year ago

    use Vercel's regional edge, which would only use edge workers near your database region

  • @obinnaee868
    @obinnaee868 5 months ago

    How do I do this for Spring Boot?

  • @dogfrogfog
    @dogfrogfog 1 year ago

    why did you decide to use Redis for this project?

    • @hafidselbi2497
      @hafidselbi2497 1 year ago

      great question 👍

    • @joshtriedcoding
      @joshtriedcoding  1 year ago

      cause its fast

    • @11r3start11
      @11r3start11 1 year ago

      for this scenario it seems scalable, fast, and popular. But I'd say it's quite a misuse, and something event-driven and/or actor-based would be more suitable)

    • @joshtriedcoding
      @joshtriedcoding  1 year ago

      @@11r3start11 The built-in TTL is super handy, it's fast because it's in-memory, and beyond key-value pairs and some simple hashes there are no complex relations. Not sure what you mean by misuse
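
The built-in TTL mentioned above can be illustrated with a toy in-memory store. The `TtlStore` class is a hypothetical simplification of Redis-style expiry (roughly `SET key value EX seconds` plus lazy expiration on read); the `now` parameter exists only to make the sketch deterministic:

```typescript
// Toy key-value store with Redis-style TTL: expired keys vanish without any cron job.
class TtlStore {
  private data = new Map<string, { value: string; expiresAt: number }>();

  // Roughly `SET key value EX exSeconds`: store the value with an expiry timestamp.
  set(key: string, value: string, exSeconds: number, now = Date.now()): void {
    this.data.set(key, { value, expiresAt: now + exSeconds * 1000 });
  }

  // Lazy expiry on read, similar in spirit to how Redis reaps expired keys.
  get(key: string, now = Date.now()): string | null {
    const entry = this.data.get(key);
    if (!entry) return null;
    if (now >= entry.expiresAt) { this.data.delete(key); return null; }
    return entry.value;
  }
}
```

For caching, this is the appeal: stale entries clean themselves up, so there is no separate invalidation job to run.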

  • @TheIpicon
    @TheIpicon 1 year ago +1

    actually Theo has a video answering my question on stream about the exact same topic (when edge was just introduced).
    Theo said that when he recommends using the edge, he's talking about the RUNTIME, not the location, because of the same issue you figured out yourself.
    you can configure Vercel's edge to only run in a specific region (which you'll want next to your DB), but still use the good, fast edge runtime.
    here's the video I'm referencing: ua-cam.com/video/UPo_Xahee1g/v-deo.html
    (I'm so hyped about it because it was the first time he noticed me on stream 😆)

    • @TheIpicon
      @TheIpicon 1 year ago

      great job btw figuring it out on your own

  • @koustavmaity-fh3gx
    @koustavmaity-fh3gx 1 year ago

    can you make a complete next-auth tutorial video, from basic to advanced level?

  • @babayaga6172
    @babayaga6172 1 year ago

    Nice! Can you please make a video on how to handle caching and invalidate the cache in a large relational database, and how to set up keys with Prisma and Redis?

  • @developer_hadi
    @developer_hadi 1 year ago

    How can I do that in mongoose🤓

  • @benji9325
    @benji9325 1 year ago +2

    But 200+ms is still slow tho..

  • @Chris-zt4ol
    @Chris-zt4ol 1 year ago

    Imagine Prisma had that

  • @breakinggood-r2v
    @breakinggood-r2v 1 year ago

    is this a course, or are you building your own website/project?

  • @wasd3108
    @wasd3108 1 year ago

    wait, you're gonna tell me that parallel requests will be faster than sequential ones? NO WAAAAAAAAAAAAY

  • @breakinggood-r2v
    @breakinggood-r2v 1 year ago

    You look like Foden, the football player

  • @berniko4954
    @berniko4954 1 year ago +2

    I'm too lazy to watch the full video, but I want to increase API speed

  • @CallousCoder
    @CallousCoder 1 year ago

    Stop using a baby language and use Rust or C++ and you'll get a 100-200% speed increase. We systems developers see 250ms and go pffff, kill off the API, guys; this adds too much overhead.

    • @joshtriedcoding
      @joshtriedcoding  1 year ago +1

      🤡

    • @CallousCoder
      @CallousCoder 1 year ago

      @@joshtriedcoding Yeah, it always amuses me when I hear JavaScript and Python devs talk about performance, when their initial choice to use those languages for a backend should be at the very least eyebrow-raising.
      And they are ugly, large, bloated languages. I like small, lean, and mean languages; they are also more robust

  • @StingSting844
    @StingSting844 1 year ago

    This could have been a short. You just batched requests together using a Redis pipeline. It's not a trick but a common practice in all products. Disappointed with the clickbait 😞

  • @Omery-od6vu
    @Omery-od6vu 1 year ago

    no