you should also specify that the cache works with pure functions; if a function has side effects and those effects are "needed", then the cache will break the functionality
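(For illustration, a minimal sketch of this point: once a result is cached, the function body, and any side effects in it, is skipped entirely on repeat calls. The names here are made up for the demo.)

```python
from functools import lru_cache

calls = []  # records every time the function body actually runs

@lru_cache(maxsize=None)
def greet(name: str) -> str:
    calls.append(name)  # side effect: only happens on a cache miss
    return f'Hello, {name}!'

greet('Ada')
greet('Ada')  # cache hit: the body (and the append) is skipped
greet('Bob')

print(calls)  # the side effect fired twice, not three times
```

If the caller relied on that side effect happening every call, caching silently breaks the program even though the return values look correct.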
I was just going to say that one of the best examples to show the usefulness of caching here is fibonacci numbers, but I'm glad that you went through that.
I make Python coding videos, so I feel that I am (partly) qualified to tell you that you produce excellent videos.
You type hint Python so well that you should consider moving to a statically typed language!
Thanks mate! I've jumped back and forth a lot with statically typed languages, and I still love the freedom you get with Python, even if type hints are not enforced. Maybe someday I will go back to something statically typed (I've heard Mojo is something new and hot).
@Indently you are welcome. Your expression of the language is the best I have seen (and I have watched thousands of hours from many good YouTubers)...
There is TypeScript, but I don't know if the community likes it enough.
Anyway, good work, so keep it up.
Thanks for the guides, man. They've helped me a lot in my projects. Hope you'll keep on making them!
Nice!
Hands down, the Fibonacci example literally blew my mind. Great video!
Interesting! I can think of some uses for this already, especially for some math functions.
Damn!!! That's a huge improvement.
Thank you for this very useful video!
Fibonacci example was amazing one to demonstrate this module, love it. ❤
😎👌🏻
😄🕶️👌🏻
First viewer and first like. Love the video. Well, I've been doing Python for 5 years and I know lru_cache, but good one.
Your explanation is always fantastic.
One question: where can I find a list of the libraries available for Python?
PyPI
Great, now I see how my code can run faster. I can already see a use for this. OMG, Python just got better.
Thank you! But won't this come at the expense of storage? If there is an infinity of different possibilities, won't that cause memory problems? 🤓 What are the actual limits? And when is lru_cache best used, and when should it be avoided (a SQL-based function?)? Can we save/load the cache so it's still there when the script is restarted?
It can grow without limit, which will eventually cause issues. You can clear it manually with cache_clear(). lru_cache doesn't itself support save/load. LRU is best used when you want to cache the most recently used computations, and previous computations are often needed for (near-)future computations (hence the name: least recently used cache). If your limit is very large, you can also use it for complex or long-running computations that have a high likelihood of repeating. It can also be useful if you spend a lot of time on external (e.g. API) calls that are likely to repeat. LRU is "one size fits all": it's not always the fastest, but it is simple.
The use case isn't massive, but when you can use it, it can often be an extreme performance boost.
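(A small sketch of the knobs mentioned above: maxsize bounds the cache (None means unbounded), cache_info() reports hits/misses, and cache_clear() empties it manually. The function here is made up for the demo.)

```python
from functools import lru_cache

@lru_cache(maxsize=128)  # bounded: least recently used entries get evicted
def square(n: int) -> int:
    return n * n

square(4)   # miss: computed and stored
square(4)   # hit: returned straight from the cache
info = square.cache_info()
print(info.hits, info.misses)        # 1 1

square.cache_clear()                 # drop everything, e.g. to free memory
print(square.cache_info().currsize)  # 0
```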
Super useful for video game development.
Do these imports need to be installed first?
Thank you!
The key is that the same input should produce the same output. Do not use random, time, uuid, sequences, etc. It's like deterministic functions in Oracle.
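(To illustrate the point: caching a non-deterministic function freezes its first answer, which is rarely what you want. The dice-rolling function is invented for the demo.)

```python
import random
from functools import lru_cache

@lru_cache(maxsize=None)
def roll(sides: int) -> int:
    # non-deterministic: same input, different output each call
    return random.randint(1, sides)

first = roll(6)
# every later call with the same argument returns the cached first roll
assert all(roll(6) == first for _ in range(100))
```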
How did you import measure? Can you share it?
My Python version doesn't come with a tools module containing a measure function. Is it a library or your own code?
Keep in mind that using caching might be opening Pandora's box. There are several cache-management issues that can occur...
Nice speed up 😮and nice and simple.
But something can’t ever get infinitely faster, though 😊There’s this pesky thing called physical boundaries in our universe 😅
My ignorance has no physical boundaries 😎
@@Indently 😁 Same here. I tried to patent my ignorance, but someone smarter than me beat me to it already 😂
I am completely stuck with the "from tools import measure".
Cannot import measure from tools.
I have the tools package downloaded, but there is no measure function in there?
I said I created it, no one can import my implementation 😅
@@Indently That is really terrible in my view. You should at least write this in a code comment instead of rushing through it in the voice-over, since it's an important point. I hope you can consider my viewpoint here. Yes, this may be strongly worded, but I do like what you contribute in general (hence why I am subscribed).
It's not an important point; all it does is time the code, as I mentioned.
@@Indently I guess you used the perf_counter function and (end_time - start_time) to measure the actual running time of a function.
@@akhileshchaurasia7966 There is not much to guess; he literally said that he used perf_counter... You should watch the video before asking questions...
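(The `measure` helper isn't published anywhere; this is only a hypothetical sketch of what such a timing decorator might look like, assuming it simply wraps the call with time.perf_counter() as described above. The name and behavior are guesses, not the author's actual code.)

```python
import time
from functools import wraps

def measure(func):
    """Hypothetical stand-in for the video's timing decorator."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f'{func.__name__}: {elapsed:.6f}s')
        return result
    return wrapper

@measure
def slow_add(a: int, b: int) -> int:
    time.sleep(0.1)  # simulate some work
    return a + b

slow_add(2, 3)
```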
I'm really curious to know where the cache is "physically" stored. I'm guessing it's held somewhere by the runtime?
It is just a simple Python hash table stored in RAM.
Right, makes sense
Is it necessary to clear the cache at some point? And why?
Making recursive Fibonacci viable, absolutely disgusting :D
Don't lie, I know you love it 😉
@@Indently caching like this, absolutely :D
But I go out of my way to avoid recursion. "Every recursive function can be turned into a normal function with a loop" is all I ever needed to hear on recursion. That being said, I don't know how bad recursion is in Python, but my background is in Java, and unless you want a stack overflow error because the call stack is too big, you don't do recursion :D
@@123FireSnake tail call optimization can fix that!
@@123FireSnake Python has a default limit of 1000 recursive calls. It's there to prevent runaway recursion from crashing the interpreter. You can, however, increase this limit with a single function call.
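(For reference, the limit and the function to change it live in the sys module. Note that raising it too far can still crash the process with a genuine C-stack overflow, so it's an escape hatch, not a free pass.)

```python
import sys

print(sys.getrecursionlimit())  # 1000 by default in CPython
sys.setrecursionlimit(5000)     # allow deeper recursion for this process
print(sys.getrecursionlimit())  # 5000
```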
@@tycodjFibonacci does not end with a tail call
is it dynamic programming?
Caching (memoization) is one technique commonly used in dynamic programming, but there are others... so it isn't equivalent to dynamic programming, but it's a very useful tool for implementing dynamic programming when you can do so.
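(To make the distinction concrete, here are the two classic DP styles side by side: memoized recursion, which is top-down DP and essentially what @lru_cache gives you for free, and tabulation, which is bottom-up DP with a loop and no recursion at all.)

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_memo(n: int) -> int:
    # top-down DP: plain recursion plus a cache of subproblem results
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

def fib_tab(n: int) -> int:
    # bottom-up DP: build subproblem answers iteratively
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fib_memo(30), fib_tab(30))  # both print 832040
```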
The highest Fibonacci number that has ever been calculated seems to be the 150,000th number in the sequence, if ChatGPT isn't lying.
Yay
Could you please share the code with us? It would definitely help! Thank you!
will help me make money
The worst thing of all time is recursion, because while it makes the code shorter, the problem is that at the same time it makes the program a lot slower.
"Worst thing of all time is recursion..."
It's actually incredibly handy, and if you're doing functional programming, which is increasing in popularity for many reasons, there are no loops: recursion IS your looping technique.
Making your function tail-recursive (which you can almost always do via techniques like accumulators, continuation-passing, or state monads) mitigates most of this by reusing the stack frame.
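(A sketch of the accumulator technique mentioned above, using Fibonacci. Note that CPython does not perform tail-call optimization, so in Python this only restructures the recursion; in a language with TCO the stack frame would actually be reused.)

```python
def fib_acc(n: int, a: int = 0, b: int = 1) -> int:
    # tail-recursive form: the recursive call is the very last operation,
    # with the running state carried along in the accumulators a and b
    if n == 0:
        return a
    return fib_acc(n - 1, b, a + b)

print(fib_acc(10))  # 55
```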
@@vorpal22 It doesn't matter what you say; the truth is the code becomes a lot slower.
@@mr.technoidIn a compiled language, tail-call optimization often translates recursive code to the same or a very similar instruction set in assembly code that using a loop does.
If what you're working on needs to be speedy, you probably won't be using Python anyway, unless you're just piecing together libraries that are implemented in C, like numpy and many of the ML libraries, and then you're not going to be particularly concerned with functional programming. (Python support for FP isn't great.)
That being said, functional programming is gaining popularity quite quickly. Even non-FP languages are integrating a lot of FP constructs (e.g. Java and C++). If you're working with pure FP, you would never use a loop. In languages like Haskell, there isn't even support for loops: you'd be using functions like map, filter, foldLeft, foldRight, etc., or if you needed to write an ADT, you'd implement it and its typeclass instances using recursion.
If you're not familiar with FP, I'd highly recommend you familiarize yourself with it at least a bit: it's a really interesting and different way of doing things that deviates substantially from OOP and imperative paradigms, and has a lot of advantages. The concepts are harder to understand, but there's a lot of really cool elegance in building programs with referential transparency and pure functions, and capturing side effects in monads. Research shows that code written in FP is less error-prone, and in a language like Haskell, if your code compiles, it's highly likely that it isn't going to contain runtime errors.
On top of that, it's overdramatic to say "worst thing of all time is recursion..." because there are plenty of things "worse" than recursion.
Can we use two decorators on the same function, like @lru_cache and @staticmethod?
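(Yes, decorators stack, and they apply bottom-up. A sketch of the usual ordering for this combination, with @lru_cache closest to the function so the raw function gets cached and @staticmethod wraps the cached result; the class and method here are invented for the demo. As a related caveat, putting @lru_cache on an instance method caches on self too, which can keep instances alive longer than expected.)

```python
from functools import lru_cache

class MathUtils:
    @staticmethod             # applied second (outermost)
    @lru_cache(maxsize=None)  # applied first, closest to the function
    def square(n: int) -> int:
        return n * n

print(MathUtils.square(12))                   # 144
print(MathUtils.square.cache_info().misses)   # 1
```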
2025 -> Python "programmers" discovered caching XDDD
My man commenting from the future in 2025