Typical use case: one wants to memoize a function f : 'a -> 'b. Code sample:
let f x =
  print_endline "call f";
  x + 1;;

let f' = with_cache (lru 256) f;;

f' 0;;  (* prints *)
f' 1;;  (* prints *)
f' 0;;  (* doesn't print, returns cached value *)
val with_cache : ('a, 'b) t -> ('a -> 'b) -> 'a -> 'b

with_cache c f behaves like f, but caches calls to f in the cache c. It always returns the same value as f x, if f x returns, or raises the same exception. However, f may not be called at all if x is already in the cache.
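To make the caching behaviour concrete, here is a minimal illustrative sketch of a with_cache-style wrapper. It uses the standard library's Hashtbl as a stand-in for the library's abstract cache type, so unlike the real caches below it never evicts anything; the names with_cache_sketch and f'' are hypothetical:

let with_cache_sketch tbl f x =
  (* look the argument up first; on a hit, [f] is not called at all *)
  match Hashtbl.find_opt tbl x with
  | Some y -> y
  | None ->
    (* on a miss, call [f] once and remember its result *)
    let y = f x in
    Hashtbl.add tbl x y;
    y;;

let f'' = with_cache_sketch (Hashtbl.create 16) (fun x -> print_endline "call"; x + 1);;
f'' 0;;  (* prints *)
f'' 0;;  (* cached, doesn't print *)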
val with_cache_rec : ('a, 'b) t -> (('a -> 'b) -> 'a -> 'b) -> 'a -> 'b
with_cache_rec c f builds a fixpoint f' = fix f of f, such that recursive calls made through f' are cached in c. It is similar to with_cache, except that f takes its own memoized version as its first argument. Example (memoized Fibonacci function):
let fib = with_cache_rec (lru 256)
  (fun fib' n -> match n with
    | 1 | 2 -> 1
    | _ -> fib' (n-1) + fib' (n-2)
  );;

fib 70;;
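The recursive variant can be pictured as tying the recursive knot through the cache. The sketch below is illustrative only: it uses a plain Hashtbl in place of the library's cache type, but it shows how a fixpoint whose recursive calls are memoized can be built:

let with_cache_rec_sketch tbl f =
  let rec fix x =
    match Hashtbl.find_opt tbl x with
    | Some y -> y
    | None ->
      (* recursive calls made by [f] go through [fix], hence through the cache *)
      let y = f fix x in
      Hashtbl.add tbl x y;
      y
  in
  fix;;

let fib_sketch =
  with_cache_rec_sketch (Hashtbl.create 64)
    (fun fib' n -> match n with
      | 1 | 2 -> 1
      | _ -> fib' (n-1) + fib' (n-2));;
fib_sketch 70;;  (* each Fibonacci value is computed only once *)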
val linear : ?eq:'a equal -> int -> ('a, 'b) t

Linear cache with the given size. It stores key/value pairs in an array and does a linear search at every call, so it should only be used with a small size.

parameter eq: optional equality predicate for keys
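Assuming the linear constructor above and the convention that 'a equal is 'a -> 'a -> bool (an assumption here), an illustrative use pairs it with with_cache for a function whose set of arguments stays small, supplying an explicit equality predicate for string keys:

let len = with_cache (linear ~eq:String.equal 16) String.length;;
len "hello";;  (* computed *)
len "hello";;  (* served from the cache *)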
val replacing : ?eq:'a equal -> ?hash:'a hash -> int -> ('a, 'b) t
Replacing cache of the given size. Equality and hash functions can be parameterized. It's a hash table that handles collisions by replacing the old value with the new one, so a cache entry is evicted whenever another entry with the same hash (modulo size) is added. Never grows beyond the given size.
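As an illustrative sketch, assuming 'a equal is 'a -> 'a -> bool and 'a hash is 'a -> int (assumptions here), a replacing cache can be built with the standard String.equal and Hashtbl.hash:

let normalize =
  with_cache (replacing ~eq:String.equal ~hash:Hashtbl.hash 512)
    String.lowercase_ascii;;
normalize "Hello";;  (* computed, then kept until another key lands in the same slot *)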
val lru : ?eq:'a equal -> ?hash:'a hash -> int -> ('a, 'b) t
LRU cache of the given size ("Least Recently Used": keys that have not been used recently are deleted first). Never grows beyond the given size.
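The tiny illustrative session below shows what the eviction policy means in practice; the expected outputs assume the LRU semantics described above and a deliberately small cache:

let g = with_cache (lru 2) (fun x -> print_endline "compute"; x * x);;
g 1;;  (* prints "compute" *)
g 2;;  (* prints "compute" *)
g 1;;  (* cached: no output *)
g 3;;  (* prints "compute"; 2 is now the least recently used key, so it is evicted *)
g 2;;  (* prints "compute" again: 2 had been evicted *)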
val unbounded : ?eq:'a equal -> ?hash:'a hash -> int -> ('a, 'b) t
Unbounded cache, backed by a Hash table. Will grow forever unless clear is called manually.
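A brief usage sketch, assuming the integer argument is only an initial size hint and that clear (mentioned above) empties a cache in place:

let c = unbounded 64;;
let double = with_cache c (fun x -> print_endline "compute"; x * 2);;
double 21;;  (* prints "compute" *)
double 21;;  (* cached *)
clear c;;    (* drop every entry *)
double 21;;  (* prints "compute" again *)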