Me: "How do I do caching in Python... I bet there's a module for this." NeuralNine: "yeah, `from functools import cache`." Perfect video, it's rare that you get the exact answer you're looking for, with a walkthrough, right away. I love the future!
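For anyone skimming the comments, here's a minimal sketch of the one-liner mentioned above, applied to the recursive Fibonacci the video uses (requires Python 3.9+):

```python
from functools import cache

@cache  # memoizes results keyed by the function's arguments
def fib(n: int) -> int:
    """Naive recursive Fibonacci, made fast by caching."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(100))  # returns instantly thanks to the cache
```

Without the decorator, `fib(100)` would effectively never finish; with it, each value is computed once and reused.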
I had already tried to import cache from functools in Google Colab, but I got an error. Do you know why this happens? Instead I imported Cache from cache-decorator.
Yeah, all PyInstaller does is bundle the whole Python runtime with the code, so no, it won't be any faster. One note: if you are using Linux or Cygwin you can install Cython to compile Python to C, which can result in much faster execution times.
Caching data basically means you're decreasing the number of calculations you need to do by reusing results you've already computed. But here's what I'm confused about: why does caching make it THAT much faster? I thought you were only cutting the number of calculations roughly in half, so why is it so significantly faster?

Found my answer: the decrease is far more than half. The naive recursive Fibonacci recomputes the same subproblems over and over, so the number of calls grows exponentially with n, while the cached version computes each value exactly once.
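To put a number on the claim above, here is a sketch that counts how many times each function body actually executes for n = 20 (the variable names are just for illustration):

```python
from functools import cache

naive_calls = 0
def fib_naive(n):
    """Uncached recursive Fibonacci; recomputes subproblems."""
    global naive_calls
    naive_calls += 1
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

cached_calls = 0
@cache
def fib_cached(n):
    """Same function, but the body only runs on a cache miss."""
    global cached_calls
    cached_calls += 1
    if n < 2:
        return n
    return fib_cached(n - 1) + fib_cached(n - 2)

fib_naive(20)
fib_cached(20)
print(naive_calls)   # 21891 calls: exponential blow-up
print(cached_calls)  # 21 calls: each value computed once
```

So for n = 20 it's already a ~1000x reduction in work, and the gap widens exponentially as n grows.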
When caching is used in calculations like this the technique is often called 'memoization'. Also quite handy when writing a prime factor sieve. Love the channel. Long live human 2.0. Go SpaceX.
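Building on the memoization idea in the comment above, here is one hedged sketch of how caching can help with prime factorization (the function names are my own, not from the video; this uses trial division rather than a full sieve, but the memoization point is the same):

```python
from functools import cache

@cache
def smallest_prime_factor(n: int) -> int:
    """Smallest prime dividing n (assumes n >= 2)."""
    i = 2
    while i * i <= n:
        if n % i == 0:
            return i
        i += 1
    return n  # n itself is prime

@cache
def factorize(n: int) -> tuple:
    """Prime factorization as a tuple; memoized so shared
    subresults (e.g. factorize(180) inside factorize(360))
    are computed only once across repeated calls."""
    if n < 2:
        return ()
    p = smallest_prime_factor(n)
    return (p,) + factorize(n // p)

print(factorize(360))  # (2, 2, 2, 3, 3, 5)
```

Note that `@cache` requires hashable arguments and return values, which is why the factorization is returned as a tuple rather than a list.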
I think you would only need to do it on recursive functions. If you want to learn more about this sort of "caching" data to speed up programs, freeCodeCamp has a dynamic programming tutorial which basically explains how to do the caching and speed up this function with your own code, not just a decorator. It's nice to see how things work :)
Hi bro, thanks for the tutorial, but I have a problem with this: `from functools import cache`. When I run my code it gives me the error: ImportError: cannot import name 'cache' from 'functools' (/usr/lib/python3.8/functools.py)
`cache` was added in Python 3.9. On older versions you can use `@functools.lru_cache(maxsize=None)` for the same effect as `@functools.cache`, so just import `lru_cache` from functools instead.
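Concretely, the workaround described above looks like this on Python 3.8 and earlier:

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # unbounded cache, equivalent to 3.9's @cache
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040
```

In fact, in 3.9+ `functools.cache` is just a thin wrapper around `lru_cache(maxsize=None)`, so the two are interchangeable here.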