Optional allowStale
Allow LRUCache#get and LRUCache#fetch calls to return stale data, if available.

By default, if you set ttl, stale items will only be deleted from the cache when you get(key). That is, it's not preemptively pruning items, unless OptionsBase.ttlAutopurge is set.

If you set allowStale:true, it'll return the stale value as well as deleting it. If you don't set this, then it'll return undefined when you try to get a stale entry.

Note that when a stale entry is fetched, even if it is returned due to allowStale being set, it is removed from the cache immediately. You can suppress this behavior by setting OptionsBase.noDeleteOnStaleGet, either in the constructor, or in the options provided to LRUCache#get.

This may be overridden by passing an options object to cache.get().

The cache.has() method will always return false for stale items.

Only relevant if a ttl is set.

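A minimal sketch of this behavior, assuming an arbitrary 100ms ttl and a small max for illustration:

    import { LRUCache } from 'lru-cache'

    const cache = new LRUCache<string, string>({
      max: 100,   // arbitrary bound for this sketch
      ttl: 100,   // 100ms, so entries go stale quickly
      allowStale: true,
    })

    cache.set('key', 'value')
    // ...once the ttl has elapsed:
    cache.has('key')  // false: has() never reports stale items
    cache.get('key')  // 'value' (stale), and the entry is deleted
    cache.get('key')  // undefined: the stale entry is already gone

The same flag can also be passed per call, e.g. cache.get(key, { allowStale: false }).
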
Optional noDeleteOnStaleGet
Do not delete stale items when they are retrieved with LRUCache#get.

Note that the get return value will still be undefined unless OptionsBase.allowStale is true.

When using time-expiring entries with ttl, by default stale items will be removed from the cache when the key is accessed with cache.get().

Setting this option will cause stale items to remain in the cache, until they are explicitly deleted with cache.delete(key), or retrieved with noDeleteOnStaleGet set to false.

This may be overridden by passing an options object to cache.get().

Only relevant if a ttl is used.

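A short sketch of the retention behavior, combined with allowStale so the stale value is actually returned (the ttl value is an assumption):

    import { LRUCache } from 'lru-cache'

    const cache = new LRUCache<string, number>({
      max: 100,
      ttl: 1000,
      allowStale: true,
      noDeleteOnStaleGet: true,
    })

    cache.set('n', 1)
    // ...once the entry is stale:
    cache.get('n')                                 // 1 (stale), entry stays in the cache
    cache.get('n', { noDeleteOnStaleGet: false })  // 1 (stale), removed on this call
    // cache.delete('n') would also remove it explicitly
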
Optional noDisposeOnSet
Set to true to suppress calling the OptionsBase.dispose function if the entry key is still accessible within the cache.

This may be overridden by passing an options object to LRUCache#set.

Only relevant if dispose or disposeAfter are set.

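A sketch of how this interacts with dispose when overwriting an existing key (the dispose body is just a placeholder):

    import { LRUCache } from 'lru-cache'

    const cache = new LRUCache<string, string>({
      max: 10,
      dispose: (value, key, reason) => {
        // placeholder cleanup, e.g. releasing a resource tied to `value`
        console.log('disposed', key, reason)
      },
      noDisposeOnSet: true,
    })

    cache.set('k', 'first')
    cache.set('k', 'second')                            // overwrite: dispose is NOT called
    cache.set('k', 'third', { noDisposeOnSet: false })  // per-call override: dispose runs for 'second'
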
Optional noUpdateTTL
Boolean flag to tell the cache to not update the TTL when setting a new value for an existing key (ie, when updating a value rather than inserting a new value). Note that the TTL value is always set (if provided) when adding a new entry into the cache.

Has no effect if OptionsBase.ttl is not set.

May be passed as an option to LRUCache#set.

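For example, a sketch of refreshing a value without extending its lifetime (the 60s ttl is an assumption):

    import { LRUCache } from 'lru-cache'

    const cache = new LRUCache<string, number>({ max: 100, ttl: 60_000 })

    cache.set('counter', 1)                         // new entry: the TTL clock starts now
    cache.set('counter', 2, { noUpdateTTL: true })  // value updated, original expiration kept
    cache.set('counter', 3)                         // default behavior: the TTL resets on this write
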
Optional size

Optional sizeCalculation
A function that returns a number indicating the item's size.

Requires OptionsBase.maxSize to be set.

If not provided, and OptionsBase.maxSize or OptionsBase.maxEntrySize are set, then all LRUCache#set calls must provide an explicit SetOptions.size or sizeCalculation param.

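A minimal sketch of size-bounded caching; the 1 MiB budget and the length-based size function are arbitrary choices for illustration:

    import { LRUCache } from 'lru-cache'

    const cache = new LRUCache<string, string>({
      maxSize: 1024 * 1024,  // total size budget; required for size tracking
      sizeCalculation: (value, key) => value.length + key.length,
    })

    cache.set('greeting', 'hello world')               // size computed by sizeCalculation
    cache.set('blob', 'x'.repeat(100), { size: 100 })  // explicit per-call size is used instead
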
Optional start

Optional status

Optional ttl
Max time in milliseconds for items to live in cache before they are considered stale. Note that stale items are NOT preemptively removed by default, and MAY live in the cache, contributing to its LRU max, long after they have expired, unless OptionsBase.ttlAutopurge is set.

If set to 0 (the default value), then that means "do not track TTL", not "expire immediately".

Also, as this cache is optimized for LRU/MRU operations, some of the staleness/TTL checks will reduce performance, as they will incur overhead by deleting items.

This is not primarily a TTL cache, and does not make strong TTL guarantees. There is no pre-emptive pruning of expired items, but you may set a TTL on the cache, and it will treat expired items as missing when they are fetched, and delete them.

Optional, but must be a non-negative integer in ms if specified.

This may be overridden by passing an options object to cache.set().

At least one of max, maxSize, or ttl is required. This must be a positive integer if set.

Even if ttl tracking is enabled, it is strongly recommended to set a max to prevent unbounded growth of the cache.

If ttl tracking is enabled, and max and maxSize are not set, and ttlAutopurge is not set, then a warning will be emitted cautioning about the potential for unbounded memory consumption. (The TypeScript definitions will also discourage this.)

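A sketch of a ttl-bounded cache following the recommendation above (the specific numbers are assumptions):

    import { LRUCache } from 'lru-cache'

    const cache = new LRUCache<string, string>({
      max: 500,         // strongly recommended alongside ttl
      ttl: 5 * 60_000,  // 5 minutes
      // ttlAutopurge: true,  // opt in to preemptive pruning of expired items
    })

    cache.set('session', 'abc123')
    cache.set('short-lived', 'xyz', { ttl: 1_000 })  // per-entry ttl via the set() options
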
Optional updateAgeOnGet
When using time-expiring entries with ttl, setting this to true will make each item's age reset to 0 whenever it is retrieved from cache with LRUCache#get, causing it to not expire. (It can still fall out of cache based on recency of use, of course.)

Has no effect if OptionsBase.ttl is not set.

This may be overridden by passing an options object to cache.get().

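A sketch of the sliding-expiration effect this produces (the 30s ttl is an assumption):

    import { LRUCache } from 'lru-cache'

    const cache = new LRUCache<string, string>({
      max: 100,
      ttl: 30_000,
      updateAgeOnGet: true,  // every get() restarts the entry's TTL clock
    })

    cache.set('token', 'abc')
    cache.get('token')                             // age resets to 0, expiration pushed back
    cache.get('token', { updateAgeOnGet: false })  // per-call override: age left unchanged
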
Options which override the options set in the LRUCache constructor when calling LRUCache#memo.

This is the union of GetOptions and SetOptions, plus MemoOptions.forceRefresh and MemoOptions.context.

Any of these may be modified in the OptionsBase.memoMethod function, but the GetOptions fields will of course have no effect, as the LRUCache#get call already happened by the time the memoMethod is called.

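A rough sketch of how these pieces fit together. The exact memoMethod signature shown here, taking the key, any stale value, and an options object carrying the context, is an assumption based on the description above:

    import { LRUCache } from 'lru-cache'

    const cache = new LRUCache<string, number, string>({
      max: 100,
      // Assumed signature: (key, staleValue, { options, context })
      memoMethod: (key, _staleValue, { context }) => key.length + context.length,
    })

    cache.memo('some-key', { context: 'extra input' })                      // computes and caches
    cache.memo('some-key', { context: 'extra input' })                      // served from the cache
    cache.memo('some-key', { context: 'extra input', forceRefresh: true })  // recomputes via memoMethod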