The Single Most Useful Decorator in Python
- Published Dec 11, 2020
- The most useful decorator in Python is @cache.
It's from the functools library (and there is a similar variant called @lru_cache too). This decorator was introduced in Python 3.9, but lru_cache has been available since 3.2. Oftentimes, the cache decorator can be used to automatically implement dynamic programming algorithms.
― mCoding with James Murphy (mcoding.io)
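For reference, the pattern discussed in the video boils down to something like this, using the classic Fibonacci example:

```python
from functools import cache

@cache
def fib(n: int) -> int:
    """Naive recursive Fibonacci; @cache memoizes each n automatically."""
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)

# Without the decorator this would take exponential time;
# with it, even large inputs return effectively instantly.
print(fib(100))  # 354224848179261915075
```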
Source code: github.com/mCodingLLC/VideosS...
SUPPORT ME ⭐
---------------------------------------------------
Patreon: / mcoding
Paypal: www.paypal.com/donate/?hosted...
Other donations: mcoding.io/donate
BE ACTIVE IN MY COMMUNITY 😄
---------------------------------------------------
Discord: / discord
Github: github.com/mCodingLLC/
Reddit: / mcoding
Facebook: / james.mcoding
You're telling me I've been manually coding caches unnecessarily? That's what you get for relying on your decade-old knowledge of C in Python.
Sometimes I forget and write one anyway!
now i wanna know how you code cache
@@senku3288 me too
This happens to me too often for comfort.
I started coding in Python back when version 2 was current. Some years went by, and now there's Python 3; as I learn it, I keep finding so many things that are different and so much new stuff that's cool but that I'd never heard of.
It's annoying to know that things you spent a lot of time on learning are now really simple to use/implement thanks to libraries and such. It's a blessing for newcomers, a torment for old timers.
@@senku3288 They probably use dynamic programming methods: create a data structure to store solutions and check it before doing work, to see whether that work is necessary before retreading old ground. The caching and its efficiency are then determined by whether you considered how that data is accessed, so it can take advantage of cache lines.
There is one BIG WARNING: Use this only on pure functions (without side effects)!
@cache does not re-run the function but only returns the stored value, so if you, for example, write to a file in that function, the file write will not happen on calls where the cache is invoked.
If you want clean code, use two types of functions: 1) functions with side effects that return nothing, and 2) functions without side effects that return values. Then you will know that any function that returns a value can be called safely at any time.
It has to run the function at least once to know what the value is for that input, right? So a file write would happen on the first run. That also means that if a function return value is supposed to be random, cache would cause it to not be random. Right?
@@quinndirks5653 usually all "random" functions are only pseudorandom. They are seeded usually with time
@@Slada1 Irrelevant to my question. I know how they are seeded and I know they are only pseudorandom. Read my first comment again for context. I'm thinking that if you put @cache on a random number generator for instance, @cache would cause it to always return the same number, right? I'm imagining that @cache stores the return value for a given input, so if you call rand() several times in a row, it runs it once to get the return value, stores the return value for the given input, which is nothing in this case, and subsequent calls to @cache rand() will cause it to return whatever the random number was the first time it ran. Right? Or wrong?
@@Slada1 O.P. does say "if the cache is invoked", meaning O.P. knows that the function has to run at least once to get the return value, which means that the write would happen. It's only on subsequent calls that the write wouldn't happen, once the cache is being used. Unless cache works differently than I suspect, I think I'm right about a random function becoming non-random with the use of @cache.
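A quick sketch confirming what this thread describes: the body runs exactly once per distinct input, so side effects fire only on the first call, and a "random" return value gets frozen (cached_random and the calls counter here are illustrative names, not from the video):

```python
import random
from functools import cache

calls = 0

@cache
def cached_random() -> float:
    global calls
    calls += 1  # side effect: only happens on the very first call
    return random.random()

first = cached_random()
second = cached_random()
assert first == second  # the "random" value is frozen after the first call
assert calls == 1       # the function body ran exactly once
```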
I did not know python had automatic memoization. Thank you for this knowledge.
You are very welcome! There are some other related tools you may like to read up on in the functools library.
I've written my own memoization decorator so many times. It's simple, but having a built-in one is awesome.
Just a few days ago I solved a kata on Codewars that had this exact problem.
I wish I'd known this back then.
I used a txt file to store the results 🤣
@@wannabedev2452 an easier way to implement it yourself is to store a dict with the arguments as keys and if you have multiple args, put them in a tuple
If anyone was wondering; if you'd like to invalidate the cache you could do (for this example) `fib.cache_clear()`.
Thanks for sharing this tip!
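Expanding on that tip: functions wrapped with @cache also expose cache_info() for statistics alongside cache_clear(), so you can inspect and invalidate the cache at runtime:

```python
from functools import cache

@cache
def fib(n: int) -> int:
    return n if n < 2 else fib(n - 1) + fib(n - 2)

fib(10)
print(fib.cache_info())  # CacheInfo(hits=8, misses=11, maxsize=None, currsize=11)

fib.cache_clear()        # drop every cached entry
print(fib.cache_info().currsize)  # 0
```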
Another very useful related decorator is "cached_property", which lets you compute a property just once when needed, and remember it as long as the object exists.
Thanks for pointing this out! This is also in my top 10 most useful decorators!
@@mCoding Will you make a video to those top 10? This video already blew my mind, I'd love to have some more of that hidden knowledge!
Python noob here. What's the difference between cached property and what was used here?
That it's a property rather than just a function/method.
How is a cached property different from a normal variable?
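To illustrate the difference being asked about: unlike a normal attribute set in __init__, a cached_property is computed lazily on first access and then stored on the instance. The Circle class here is a made-up example, not from the video:

```python
from functools import cached_property

class Circle:
    def __init__(self, radius: float) -> None:
        self.radius = radius

    @cached_property
    def area(self) -> float:
        print("computing area...")  # runs only on the first access per instance
        return 3.141592653589793 * self.radius ** 2

c = Circle(2.0)
print(c.area)  # prints "computing area..." then the value
print(c.area)  # cached: just the value, no recomputation
```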
I'm amazed, when I saw this my first thought was THERE'S NO WAY. Such a big improvement with so little effort is rare to see! Thanks a lot for sharing!
Thank you! Cheers!
it can use much more memory than lru_cache, though, since it never evicts entries
It's rare that video whose title contains a superlative actually lives up to the expectations. Great job!
Glad you liked it!
@@mCoding :) LOL (lots of love)
Audio tip for the future: boost the high frequencies (+18 dB) with a high shelf starting around 3 kHz, or a low-Q (0.5) bell at 7.5 kHz, to make your voice a lot clearer. Big difference. You can probably do this in any video editing software.
PS: loved this vid
I'm such a noob at video editing! Thanks for the tip.
@@mCoding Don't worry, great videos. Just thought the audio was a bit unclear :) Bringing back some high frequencies should do it.
Wanted to add that if your function returns a mutable object (e.g. a list) and you then modify that value, you are modifying the return value stored in the cache, making all future calls return the modified object. You need to make a copy of the object before using it (or not modify it in the first place).
Thanks for pointing out this gotcha that we need to watch out for!
Returning immutable objects is the best option; that, or composing the cached function with copy.deepcopy, like ``copy.deepcopy(my_cached_fun(args))``. Wrapping that in another decorator would probably be the most Pythonic way.
Which is why these decorators are included in "functools": they will only reliably work from a functional programming point of view, which excludes mutable objects.
@@jorgeestebanmendozaortiz873
Thanks for the info, but could you explain what he meant by:
"You need to make a copy of the object before using it"?
How would that look in a real example?
@@ko-Daegu just make a copy of any list returned. You can do this either with "list.copy()" or with the slicing idiom "list[:]".
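A concrete version of this gotcha, with a made-up get_items function for illustration:

```python
from functools import cache

@cache
def get_items(n: int) -> list[int]:
    return list(range(n))

xs = get_items(3)
xs.append(99)        # mutates the object stored inside the cache!
print(get_items(3))  # [0, 1, 2, 99] — every future call sees the mutation

safe = get_items(5).copy()  # copy first, then mutate the copy safely
safe.append(99)
print(get_items(5))  # [0, 1, 2, 3, 4] — cache unaffected
```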
This is in fact an incredible tip!
4 minutes of video, direct to the point and a life saver =)
Glad it was helpful!
I literally jumped from my chair when I saw how fast cache made the function.
Thank you for sharing such a useful knowledge.
You're welcome!
I can't believe I manually coded a cache just a few weeks back when this was a thing in python the whole time.
It's still a good idea to know *how* a cache is designed (e.g. for whiteboard interviews).
@@MagnumCarta in case someone wants to know, basically like this:

cache = {}

def func(arg):
    if arg in cache:
        return cache[arg]
    ret = ...  # do the actual work here
    cache[arg] = ret
    return ret

And turn the keys into tuples if you have multiple args
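Wrapped up as a reusable decorator, the same dict-based idea looks roughly like this (a sketch, not the stdlib implementation, which also handles keyword arguments and thread safety):

```python
import functools

def memoize(func):
    cache = {}

    @functools.wraps(func)
    def wrapper(*args):  # hashable positional args only, as in the dict sketch
        if args not in cache:
            cache[args] = func(*args)  # compute once per distinct argument tuple
        return cache[args]

    return wrapper

@memoize
def add(a: int, b: int) -> int:
    return a + b

print(add(2, 3))  # 5
```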
Built-in memoization, that's so cool! Thanks for sharing!
You bet!
I can see someone using this cache decorator in a real product, forgetting that it does not evict cached items from memory, then they start running out of memory and wondering why.
You're better off using lru_cache with a size limit, and only doing it if your function is actually called with values that can be reused in later calls.
Good point! I should have warned about this in my video!
I can also see someone using this to cache results from a function that isn't actually side-effect free and then getting all sorts of corruption, or using it on a function returning mutable items like lists and then having all sorts of hell happen.
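A bounded-cache sketch of what this warning suggests: lru_cache with a maxsize evicts the least recently used entry instead of growing forever (square is a made-up example function):

```python
from functools import lru_cache

@lru_cache(maxsize=2)
def square(n: int) -> int:
    return n * n

square(1)
square(2)
square(3)  # evicts square(1), the least recently used entry

print(square.cache_info().currsize)  # 2 — never exceeds maxsize

square(1)  # cache miss: must be recomputed after eviction
print(square.cache_info().misses)   # 4
```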
I would just like to say that I discovered this channel a couple of days ago.
Your videos are incredibly helpful... thank you for the uploads, you have a new subscriber (and I will be binge watching your videos this weekend).
Man your channel is really helpful,
Thanks for your work!
Happy to help!
Did not know that. I really like the length too. Short, straight to the point. Very valuable, subscribed :)
Thanks!
This video is going to blow up. It is super useful, time-saving, short and "yt algo friendly".
Glad it was helpful!
@@mCoding I have to thank you. I've learned something new yesterday and today. Without people like you, YouTube would just be shitposts lol
Found this channel recently, very clear and great examples!
Glad you think so!
Short, to-the-point, extremely useful. I watched this one video from your channel and subscribed immediately afterwards.
Awesome, thank you!
I knew from the thumbnail of this video (seeing the fib function), that you were going to go over the cache decorator. It is also my favorite decorator! Although, I never really get to use it in production code, as I mostly use it with recursive functions and we try to avoid recursion in our codebase.
Glad you have the same favorite!
It can be useful in non-recursive functions too, to avoid extra latency on common requests or other expensive computations (just noticed you're a rustacean too, yayy :) )
This is a perfect little video. Short, clear, and very useful. Thank you.
Glad you enjoyed it!
This is awesome. Thank you so much!
I had no idea about this before.
Your videos are amazing by the way.
Happy to help!
You are amazing. Short and to the point!!
You earned my sub. Thank you.
Thanks for the sub!
Thanks so much for this video! I was trying to compute a highly intensive recursive function and I had never heard of this!
That's really cool, subscribed 😊
I love how you reply to all comments. Even though I've been using python for some years I'm still learning valuable material
I try but now there's way too many for me to get to!
This channel is awesome man. Thanks for this
This is very useful, I didn't know about it. Thank you for sharing it!
This is amazing :D I'd LOVE to see more videos like this :D
More to come!
Neat, at first I thought it would only be half as fast. Very efficient, thanks!
Thank you so much, this was exactly what I was looking for! This seems to help speed up stuff in pure Python without numba or cython.
You're very welcome!
I find it so interesting how we approach coding differently. Whilst I've used the @cache decorator, this isn't somewhere I've ever thought to use it. My immediate thought would have been to create a generator function. Now that I think about it, though, this is much more concise. If you wrote this as a generator then each time you computed fib(n-1) you would have to store it in some variable ( for the sake of providing a concrete example, call that variable "last_fib" and assume that it is only declared within the scope of the generator function). Then the next time the generator is called, it substitutes "last_fib" for fib(n-2) ... This is much simpler.
This got recommended to me, and wow this is actually really useful
Thanks man
Glad to hear it!
I cannot believe the amount of useful info you have on your channel. Many thanks, brother
You are most welcome!
That's a great tip, thanks for sharing!
Glad it was helpful!
Wow this is amazing, thanks man!
Wow. Love it. Great video. Thanks!
Thank you too!
Great stuff man 👍
This is very useful. Thank you!
You're welcome and thanks for watching!
alternate title: a 4 minute video that just completely makes dynamic programming useless in python
It's just a more elegant way to do dynamic programming.
MIND BLOWN. Amazing content.
Blown away. Awesome !
That is very cool, thanks!
This was very useful. You got a new subscriber.
Awesome, thank you!
Yup this is quality content. Subbed!
Much appreciated!
Okay, the cache thing was known to me and is just a time vs memory tradeoff, but the LRU cache is fucking genius
Thanks, very helpful decorator
this is precious
how didn't i learn about this sooner
You're welcome!
Amazing Video!
This is life changing omg thank you!
Glad it helped!
Thank you for helping us :) great content
Happy to help!
Just found this channel. Great topics, excellently explained.
Thanks!
Those of you on older Python without an explicit @cache can use @lru_cache(None); the explicit None as maxsize means cache everything forever.
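In code, that backwards-compatible form looks like this:

```python
from functools import lru_cache

# On Python 3.2-3.8, where functools.cache doesn't exist yet:
@lru_cache(maxsize=None)  # None means an unbounded cache, equivalent to @cache
def fib(n: int) -> int:
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(50))  # 12586269025
```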
Thank you, and thanks to youtube algorithm for bringing me here.
Thanks for coming!
Yo dawg, I heard you like algorithms so I used an algorithm to send you an algorithm....
That's pretty powerful! And it allows you to keep you code elegant and readable.
wow that's actually amazing! i oughta try it out
This is great! I think I'm going to make a version of this for c because it's so useful
Go for it!
Very nice, I never heard of this one before. Thanks!
Glad you like it!
This is awesome! I love your videos! New subscriber here! :D
Welcome to the club!
Dude, I love you for this, you just saved me lol
Thanks! You're welcome!
Fantastic tip! Thanks!
You bet!
That's beautiful, very helpful.
Thank you! Cheers!
holy shit this is so useful tysm for making these videos
Somehow, I knew about this decorator, and also about memoisation in dynamic programming, but it never clicked that they're the same thing. So I was still implementing my own caches for DP
Fantastic. Thank you so much.
Amazing, thanks for sharing
Thanks for watching!
How did I not know about this!? Thanks a lot!
Nice! Thanks
This is absolutely brilliant how did I not know this existed
Amazing content :D subbed and liked
Much appreciated!
Wow this is very useful!!
This is beautiful!
Learned something useful today, thank you. No need to create another dict for memoization
Wow. Seeing the speed difference is remarkable. Thanks for the explanation.
Glad it was helpful!
Hey this is amazing, thank you.
You're welcome!
Great. Thank you.
Just a note for those who tried cache and it didn't work for them, try lru_cache instead, eg: @lru_cache(maxsize=5)
cache seems to have been added in Python 3.9, but lru_cache has been available since 3.2.
Great video by the way, really easy to follow and start using the code myself!
this is great! Thank you
Glad you liked it!
Amaazzzing 😱..
Thanks a lot for such a useful tip.
Although I'm a beginner and don't understand this very deeply yet, it made me instantly subscribe to your channel. 😁
works like magic, thanks ;-)
I'm pretty new to Python, I think more people should know about this
Me too!
fantastic - thanks for that
Glad you enjoyed it
Mind blowing tip bro. This means we don't have to optimize our code using DP
Glad you liked it!
Your videos are gems!
This is amazing!
really interesting, subscribed for more :)
Awesome, thank you!
great tip, thanks for sharing
First time I've seen a title like this that's not clickbait. Thanks for helping out us occasional Python users
Awesome !
Great video!
Thanks!
What an awesome line of code. Life hack for dynamic programming interview questions 😂
I remembered a video about rendering the Mandelbrot set in Pygame with different methods (original code; numpy; numba, taichi...) ((it was a RUS video, but nevertheless)). It got awesome results with some Numba parallel computation, but then I tried applying lru_cache to it just today, and on identical render settings it jumped from ~30 to ~200 FPS once I added the decorator to the update function/method, which is quite interesting to observe.
Glad I found this channel.
Wow. Amazing. Python is such a well-thought-out language.
Now I'll have to go through all my projects and put cache everywhere.
Just be aware of the insane memory usage increase!
Use lru_cache() with a low "maxsize" value (start from 1 and go upward)
This is the second video of yours I've watched and I feel obligated to subscribe
nice! thanks!
Big fan after seeng this video. Keep sharing such important python hacks.
Wow, in under 4 minutes you've given me back many hours of my life... you are a wizard. :)
Thank you!
Wow I had no idea you could do this, absolutely blew my mind
Cool, right!