Latency vs Throughput | System Design Essentials

  • Published Nov 15, 2024

COMMENTS • 43

  • @krishanmadushanka9521
    @krishanmadushanka9521 1 year ago +4

    Clearest explanation of latency vs throughput on the internet. Thanks!

  • @jordanb9363
    @jordanb9363 1 year ago +2

    I'm going through the System design playlist and just wanna say these are so valuable. Appreciate all the effort that went into this, and for keeping it free!

  • @samartajshaikh2601
    @samartajshaikh2601 8 months ago

    Have never seen someone explain latency and throughput in this much depth

  • @adrienesquerre5790
    @adrienesquerre5790 1 year ago +1

    Such an elegant demonstration! I am sincerely grateful to you, Master.

  • @charlesopuoro5295
    @charlesopuoro5295 1 year ago +1

    Thank you very much for this clear, pragmatic explanation.

  • @stephenmccallion5886
    @stephenmccallion5886 3 years ago +7

    This is great, I really enjoy your content. I'd love to see any videos on CI/CD tools deploying to AWS.

    • @BeABetterDev
      @BeABetterDev 3 years ago +1

      Thanks Stephen! I hear you loud and clear - I've added a CI / deployment item to my backlog of videos. Hopefully I can have something out soon.
      Cheers

  • @funkykong9001
    @funkykong9001 3 years ago +4

    I would have liked to hear more about the relationship between latency and throughput, and how latency can limit potential throughput.

    • @carlosflor1766
      @carlosflor1766 2 years ago +3

      From my perspective, latency cannot limit throughput directly, but it can delay the data we are trying to send or receive. For example, if we have a server in Asia and we make a request from South America, even if the server in Asia has high throughput (e.g. up to 500k requests per second in a client-server model), the time our request takes to reach the server will not affect the number of requests the server can handle (still 500k requests per second).
      For us as clients, high latency means waiting longer for the server to react; low throughput, on the other hand, means requests start being denied at some point, or the server crashes (if it is not prepared for that).
      There are cases where throughput can affect latency: for example, when the server reaches its limits and starts scaling up (vertically or horizontally), our requests take longer to be processed since they have to wait for the scaling to finish (in modern systems, usually some milliseconds, unless we are talking about scaling across regions).
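
The distinction in this thread can be made quantitative with Little's Law, which ties the three quantities together: throughput = concurrency / latency. A minimal Python sketch of how a single client's latency caps its achievable throughput even though server capacity is unchanged (function name and numbers are illustrative, not from the video):

```python
# Little's Law ties the three quantities in this thread together:
#   throughput (req/s) = concurrency (requests in flight) / latency (s)
# Function name and numbers are illustrative.

def max_throughput(concurrency: int, latency_s: float) -> float:
    """Upper bound on requests/second for a client that keeps
    `concurrency` requests in flight at `latency_s` round-trip latency."""
    return concurrency / latency_s

# One sequential client with 300 ms of cross-continent latency can
# push at most ~3.3 req/s, no matter how fast the server is:
print(max_throughput(1, 0.300))    # ~3.33 req/s

# Raising concurrency restores throughput without changing
# per-request latency:
print(max_throughput(100, 0.300))  # ~333 req/s
```

This is the sense in which latency "limits" throughput: it bounds what any one sequential client can extract, while the server's aggregate capacity stays the same.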

  • @guilhesas
    @guilhesas 3 months ago

    Gold nugget.

  • @sakethtadimeti9008
    @sakethtadimeti9008 3 years ago

    Thanks for taking the time to explain this so well.

  • @adarshrana8243
    @adarshrana8243 3 years ago +1

    Sir, you are too good. Love from India!

  • @BelarusianInUk
    @BelarusianInUk 6 months ago

    Regarding the latency/counts chart: the axes should be swapped. Latency should be on the vertical axis, since it is a function of counts.

  • @joshbriton2859
    @joshbriton2859 3 years ago +1

    Thank you for this clear explanation

  • @uchennaonyia9377
    @uchennaonyia9377 2 years ago +1

    Another great video. Thanks very much. They are always very insightful. I was wondering if you would do a video on instance sizing: picking the right instance type for your application and the things to consider.

    • @BeABetterDev
      @BeABetterDev 2 years ago

      Great idea for a video, thank you!

  • @ypucandeleteit
    @ypucandeleteit 3 years ago +1

    Thank you, this was a really good source of information.

  • @abdulmoizsheikh8031
    @abdulmoizsheikh8031 3 years ago +1

    Thank you so much. Really enjoyed this talk!

  • @MyRajeshwar
    @MyRajeshwar 2 years ago +1

    Thank you Sir.

  • @Neelu2023
    @Neelu2023 3 years ago +1

    Thank you. This is really helpful

  • @rohitmishraet
    @rohitmishraet 3 years ago

    Thanks, very well explained!!

  • @AIThoughts_MA
    @AIThoughts_MA 2 years ago +1

    Really liked this video, thank you :)

  • @Teh-Penguin
    @Teh-Penguin 11 months ago

    Great videos!

  • @bhatsachin
    @bhatsachin 2 years ago +1

    This is a very nice explanation; I was searching for a better understanding for a while now. Thank you.
    Also, I have a question about network bandwidth: how much network bandwidth does a cloud provider provide? If I can increase the number of servers and each request is 1 MB, how many concurrent requests can my network handle? There should be some limit on the network that connects the client to the load balancer.
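
A back-of-the-envelope way to reason about this question: divide link capacity by per-request size. The 10 Gbps figure below is a hypothetical link capacity, not a number from any provider's documentation (providers publish per-instance-type bandwidth limits that vary widely):

```python
# Back-of-the-envelope: a network link's bandwidth caps request
# throughput regardless of how many servers sit behind it.
# The 10 Gbps figure is hypothetical, not a provider quote.

def max_requests_per_second(bandwidth_gbps: float, request_mb: float) -> float:
    """Requests/second a link can carry when each request moves
    `request_mb` megabytes over a `bandwidth_gbps` link."""
    bits_per_second = bandwidth_gbps * 1e9
    bits_per_request = request_mb * 8e6  # 1 MB = 8e6 bits (decimal MB)
    return bits_per_second / bits_per_request

# A 10 Gbps link carrying 1 MB requests tops out around 1,250 req/s:
print(max_requests_per_second(10, 1))  # 1250.0
```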

  • @aayyaa1188
    @aayyaa1188 3 years ago +1

    Thank you so much for the awesome tutorial! It is super clear.
    Just one small question: if we increase throughput, does that mean the processing latency could be lower?
    Thank you for any answer you provide.

    • @BeABetterDev
      @BeABetterDev 3 years ago +3

      Hi Anyi,
      Increasing throughput (concurrency) can actually INCREASE latency. The reason is that, to increase concurrency, we typically increase the number of requests to a host while keeping the number of hosts fixed. The extra CPU load can result in higher-latency API calls. If you add more hosts, this shouldn't be a problem.
      Hope this helps,
      Daniel
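
The point about concurrency inflating latency on a fixed set of hosts can be sketched with the textbook M/M/1 queueing formula, mean time in system = 1 / (service_rate - arrival_rate); the function name and rates below are illustrative, not from the video:

```python
# With a single host that can serve `service_rate` req/s, pushing the
# offered load closer to capacity inflates average latency.
# M/M/1 queue mean time in system: 1 / (service_rate - arrival_rate).
# Rates below are illustrative.

def avg_latency(service_rate: float, arrival_rate: float) -> float:
    """Mean time in system (seconds) for an M/M/1 queue."""
    assert arrival_rate < service_rate, "server is overloaded"
    return 1.0 / (service_rate - arrival_rate)

# One host serving 100 req/s:
print(avg_latency(100, 50))      # 0.02 s -> 20 ms at half load
print(avg_latency(100, 95))      # 0.2 s  -> 200 ms near saturation

# Adding a second host halves each host's arrival rate and brings
# latency back down:
print(avg_latency(100, 95 / 2))  # ~0.019 s -> back under 20 ms
```

This matches the reply above: more requests per fixed host means latency climbs steeply near saturation, while adding hosts restores it.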

  • @kamkum2k
    @kamkum2k 3 years ago

    Thanks, very useful

  • @MrHorse16
    @MrHorse16 2 years ago

    What’s the difference between Throughput and Bandwidth?

  • @manouchehrzadahmad4052
    @manouchehrzadahmad4052 3 years ago +1

    What is the difference between response time and latency? It seems your definition of latency is the same as response time.

  • @johnboy14
    @johnboy14 6 months ago +1

    It's not a hard concept to grasp; the problems arise when you try to measure them properly.

  • @0JoeTheCat0
    @0JoeTheCat0 3 years ago +1

    awesome

  • @truelifestories880
    @truelifestories880 2 years ago

    In your definition of throughput, what is the size of your packet?