python: functools.lru_cache (beginner - intermediate) anthony explains #54

  • Published 18 Jun 2020
  • today I explain functools.lru_cache as well as a few ways that you might use it in your programs!
    - decorators video: • python @decorators - (...
    playlist: • anthony explains
    ==========
    twitch: / anthonywritescode
    discord: / discord
    twitter: / codewithanthony
    github: github.com/asottile
    stream github: github.com/anthonywritescode
    I won't ask for subscriptions / likes / comments in videos but it really helps the channel. If you have any suggestions or things you'd like to see please comment below!
  • Science & Technology
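As a minimal sketch of the feature the video covers (the `square(x: float) -> float` signature is borrowed from a comment below; the caching behaviour is standard `functools.lru_cache`):

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # unbounded cache; pass a number to cap entries
def square(x: float) -> float:
    print(f"computing {x}")  # only runs on a cache miss
    return x * x

square(3.0)  # miss: prints "computing 3.0" and returns 9.0
square(3.0)  # hit: returns 9.0 straight from the cache, no print
```

Decorated functions also expose `cache_info()` and `cache_clear()` for inspecting and resetting the cache.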

COMMENTS • 38

  • @stephan24297
    @stephan24297 2 years ago +6

    very clear explanation. I've been seeing this used in a lot of leetcode solutions and never understood what it meant. thanks!

    • @anthonywritescode
      @anthonywritescode  2 years ago +1

      easy caching! there's a followup re: leetcode here: ua-cam.com/video/4tc5MUBjw-g/v-deo.html

  • @python360
    @python360 1 year ago +4

    Excellent. I first came across lru_cache in FastAPI, but this video helps me understand its wider usage

  • @workflowinmind
    @workflowinmind 3 years ago +1

    To the point, I love it! Subscribed!

  • @dera_ng
    @dera_ng 1 year ago +1

    This makes my memoization DP code A LOT cleaner! Thanks!😅
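The classic memoized-DP illustration (my sketch, not code from the video): one decorator turns a naive recursive Fibonacci from exponential to linear time.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n: int) -> int:
    # each fib(k) is computed once, then served from the cache
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

fib(100)  # returns instantly; the uncached recursion would never finish
```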

  • @juantorres6247
    @juantorres6247 1 year ago

    Thanks man. This has been very helpful.

  • @maksymprotsak3316
    @maksymprotsak3316 7 months ago

    Thank you for video!

  • @mindcraft4043
    @mindcraft4043 1 year ago

    great video 👍

  • @VladimirTheAesthete
    @VladimirTheAesthete 2 years ago

    Wow, using lru_cache as a closure namespace is a neat little trick!
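The trick this comment refers to, as I understand it (the `get_config` name is a hypothetical example of mine): a zero-argument function with `maxsize=1` acts as a lazily-initialized singleton, replacing a module-level global.

```python
from functools import lru_cache

@lru_cache(maxsize=1)
def get_config() -> dict:
    print("loading config...")  # expensive setup runs exactly once
    return {"debug": False}

assert get_config() is get_config()  # every call returns the same object
```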

  • @a_maxed_out_handle_of_30_chars
    @a_maxed_out_handle_of_30_chars 2 years ago +1

    thanks a lot

  • @shaharrefaelshoshany9442
    @shaharrefaelshoshany9442 3 years ago +1

    best ever

  • @muralidhar40
    @muralidhar40 2 years ago

    Super!

  • @Walruz1000
    @Walruz1000 4 years ago

    I am spamming you with lots of comments, apologies for this... do you think this can be used within AWS lambda functions? I have a specific use case retrieving read-only data which is only updated nightly; given the lambda container doesn't hang around all that long, I am wondering if I could use this as a cache for JSON data instead of the additional overhead and complexity of adding ElastiCache. I guess it would be even better if the cache TTL could be controlled?

    • @anthonywritescode
      @anthonywritescode  4 years ago +1

      comments are always good! happy to answer as well :) lambda keeps your instance running for like ~4 hours so this can be used (in memory caching is one of the easiest ways to improve performance with lambda). lru_cache doesn't have a TTL but there are several libraries on pypi which add this functionality
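One common TTL workaround (a sketch under my own assumptions — this is not a built-in lru_cache feature, and names like `fetch_readonly_data` are hypothetical): pass a coarse time bucket as the cache key, so the entry effectively expires when the bucket rolls over.

```python
import time
from functools import lru_cache

@lru_cache(maxsize=1)
def _fetch_readonly_data(time_bucket: int) -> dict:
    # hypothetical stand-in for the nightly-updated read-only fetch
    return {"fetched_at": time.time()}

def fetch_readonly_data(ttl_seconds: int = 3600) -> dict:
    # same bucket -> cache hit; once ttl_seconds elapse the bucket changes,
    # the old entry is evicted (maxsize=1) and the data is refetched
    return _fetch_readonly_data(int(time.time() // ttl_seconds))
```

Libraries on PyPI wrap this kind of idea up properly if a real TTL is needed.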

  • @sparkyb6
    @sparkyb6 3 years ago +2

    You mentioned that the large cache is wasteful on that function that will only ever return one value, but is it? I assumed the cache is just like a dictionary or something so it wouldn't reserve space for up to maxsize, so if you only ever put one item in it, the size would be the same no matter what you set maxsize to.

    • @anthonywritescode
      @anthonywritescode  3 years ago +2

      yeah it looks like the cpython implementation doesn't use a sized table (I misremembered here!) -- but you could imagine an implementation which does (or even has a fastpath optimization for n=1)

  • @smjure
    @smjure 3 years ago +2

    Thanks for the video. Can you maybe make a video on the variable types? E.g. in the video you write: def square(x: float) -> float. What's the usefulness of it other than explicitly seeing the variable types (in this case, input and output are floats)?

    • @anthonywritescode
      @anthonywritescode  3 years ago +2

      yep -- check this one out: ua-cam.com/video/H5CnZQDKfhU/v-deo.html

    • @smjure
      @smjure 3 years ago

      @@anthonywritescode thank you, appreciated!! 👍

  • @zig131
    @zig131 3 years ago +2

    I'm guessing functools only works with functions :/
    If you set some default values in a module, how do you stop them being reset if that module is imported a second time?

    • @anthonywritescode
      @anthonywritescode  3 years ago +2

      module objects are global so they'll only ever be initialized once (unless someone uses `reload(...)` though at that point almost all bets are off because things like singletons are no longer singletons!)

    • @zig131
      @zig131 3 years ago

      @@anthonywritescode Well that is awfully sensible - hooray for Python!
      And thanks for clearing that up :)

  • @yangliu_6688
    @yangliu_6688 3 years ago +1

    is this kinda like DFS with memoization?

  • @wan9he
    @wan9he 3 years ago +2

    How long would the values be cached for with lru_cache?

    • @anthonywritescode
      @anthonywritescode  3 years ago +2

      until the process exits or the cache is manually cleared or they are evicted due to reaching the `maxsize`

    • @_elkd
      @_elkd 1 year ago

      I'm surprised there's no way to specify a time to live?
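Putting the eviction, statistics, and manual-clearing behaviour together (a sketch; as noted above, TTL is not built in):

```python
from functools import lru_cache

@lru_cache(maxsize=2)
def double(x: int) -> int:
    return x * 2

double(1)
double(2)
double(3)                   # maxsize=2: the least recently used entry (1) is evicted
print(double.cache_info())  # CacheInfo(hits=0, misses=3, maxsize=2, currsize=2)
double.cache_clear()        # manual clearing: cache and counters reset
```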

  • @akshaymestry971
    @akshaymestry971 3 years ago +1

    Can a logger benefit from slamming lru_cache on every single* method of the Logger class?
    I did profiling on the logger class and it turns out the number of function calls was reduced, but is it still safe to assume that slamming most of the methods with lru_cache is okay? Implementation-wise the output of the logger is correct, so I haven't had any issues with that either; just curious whether it is a good implementation or am I missing something big here?

    • @anthonywritescode
      @anthonywritescode  3 years ago +1

      I assume you're writing a log-once logger? adding lru_cache would prevent duplicate logging calls from succeeding

    • @akshaymestry971
      @akshaymestry971 3 years ago

      @@anthonywritescode sorry, but I didn't get what you mean by log-once logger. This logger is something I initialize once and then use throughout the code. And yes, lru_cache did manage to avoid duplicate function calls and my output was also what I expected, but I'm not sure why I feel I'm doing something wrong here. Also, as per the comment below, please explain `partial` as well.

    • @anthonywritescode
      @anthonywritescode  3 years ago +1

      I guess I don't understand what you're asking -- perhaps show some code?

    • @akshaymestry971
      @akshaymestry971 3 years ago

      @@anthonywritescode haha fine. here's the link to the code: github.com/kaamiki/miroslava/blob/main/src/miroslava/utils/logger.py

    • @anthonywritescode
      @anthonywritescode  3 years ago +1

      yeah I would not suggest doing that -- you're essentially writing a memory leak (keeping log entry objects alive much longer than necessary) -- formatPath is probably the only function that _could_ be lru_cache'd though it's unlikely to improve your performance

  • @patternsandconnections9211
    @patternsandconnections9211 3 years ago +2

    please explain functools.partial as well.

  • @gokhankesler7201
    @gokhankesler7201 3 years ago +1

    Respects.