Django local memory caching not working on heroku
About the local memory cache on Heroku
In general the local memory cache works fine on Heroku. There is one limitation though: both your local_cache dict and Django's locmem cache backend are local to the process and to the dyno.
So, for example, if you are using gunicorn, it will by default fork several worker processes to handle requests, and each worker process gets its own locmem cache. On Heroku the default WEB_CONCURRENCY is at least 2, depending on the dyno size, so the same key can hold different values (or be missing) depending on which worker serves the request.
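To make that concrete, here is roughly what a locmem setup looks like and why two workers never see each other's entries (just a sketch; the key and LOCATION values are made up):

    # settings.py -- Django's per-process local memory cache
    CACHES = {
        "default": {
            "BACKEND": "django.core.cache.backends.locmem.LocMemCache",
            "LOCATION": "unique-snowflake",  # only distinguishes caches inside ONE process
        }
    }

    # views.py -- each gunicorn worker holds its own copy of this cache,
    # so a value set in worker A is invisible to worker B (and to other dynos)
    from django.core.cache import cache

    cache.set("greeting", "hello", timeout=300)
    cache.get("greeting")  # may return None if this request hit a different worker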
Now about the general setup question:
In general, local caching can be totally fine (and extremely fast) for production use, as long as you can live with the per-process/per-dyno limitation and with the fact that you cannot clear the whole cache without restarting all your dynos.
File system caching also works, but it is again local to your dyno, which means clearing it is only possible by restarting all your dynos (a deploy or restart wipes the dyno's filesystem anyway).
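A minimal sketch of that option (the directory is just an example; any path the dyno can write to works):

    # settings.py -- file-based cache; the files live on the dyno's ephemeral
    # filesystem, so the cache is still per-dyno and is wiped on every restart/deploy
    CACHES = {
        "default": {
            "BACKEND": "django.core.cache.backends.filebased.FileBasedCache",
            "LOCATION": "/tmp/django_cache",
        }
    }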
In an environment with multiple servers/containers for one application it is usually better to have a separate cache, for example Redis or memcached, which all the dynos can access.
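For example, something along these lines (a sketch only; it assumes the Heroku Redis add-on, which sets a REDIS_URL config var, and Django 4.0+ with its built-in Redis backend -- django-redis would look very similar):

    # settings.py -- one shared cache that every worker and every dyno talks to
    import os

    CACHES = {
        "default": {
            "BACKEND": "django.core.cache.backends.redis.RedisCache",
            "LOCATION": os.environ["REDIS_URL"],  # provided by the Heroku Redis add-on
            # depending on the plan, Heroku Redis uses rediss:// and may need
            # extra SSL options here
        }
    }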
cache and OTP
In general you can use the cache here to store the token. The risk, depending on the actual cache backend, is that a full cache will evict data. That is normally fine for a real cache, but it would be a problem for OTP tokens.
In your case you could configure a separate cache just for the OTP tokens, one that is big enough to hold all currently valid tokens.
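Roughly like this (the "otp" alias, the OTP_REDIS_URL variable and the numbers are only illustrative; point the LOCATION at whatever store you dedicate to the tokens, ideally one where ordinary cache traffic can never evict a still-valid token):

    # settings.py -- a second cache alias reserved for OTP tokens
    import os

    CACHES = {
        "default": {
            "BACKEND": "django.core.cache.backends.redis.RedisCache",
            "LOCATION": os.environ["REDIS_URL"],
        },
        "otp": {
            "BACKEND": "django.core.cache.backends.redis.RedisCache",
            "LOCATION": os.environ.get("OTP_REDIS_URL", os.environ["REDIS_URL"]),
            "KEY_PREFIX": "otp",
            "TIMEOUT": 300,  # tokens expire on their own after 5 minutes
        },
    }

    # usage -- talk to the named alias instead of the default cache
    from django.core.cache import caches

    otp_cache = caches["otp"]
    user_id, token = 42, "831904"  # example values
    otp_cache.set(f"otp:{user_id}", token, timeout=300)
    stored = otp_cache.get(f"otp:{user_id}")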
