A String value that is used in the creation of the default string description of an object. Called by the built-in method Object.prototype.toString.
calculatedSize: The total computed size of items in the cache (read-only)
max: LRUCache.OptionsBase.max (read-only)
maxSize: LRUCache.OptionsBase.maxSize (read-only)
size: The number of items stored in the cache (read-only)
Iterating over the cache itself yields the same results as LRUCache.entries
Deletes a key out of the cache.
Returns true if the key was deleted, false otherwise.
Return an array of [key, LRUCache.Entry] tuples which can be passed to LRUCache#load.
The start fields are calculated relative to a portable Date.now() timestamp, even if performance.now() is available.
Stale entries are always included in the dump, even if LRUCache.OptionsBase.allowStale is false.
Note: this returns an actual array, not a generator, so it can be more easily passed around.
Make an asynchronous cached fetch using the LRUCache.OptionsBase.fetchMethod function.
If the value is in the cache and not stale, then the returned Promise resolves to the value.
If not in the cache, or beyond its TTL staleness, then fetchMethod(key, staleValue, { options, signal, context }) is called, and the value returned will be added to the cache once resolved.
If called with allowStale, and an asynchronous fetch is currently in progress to reload a stale value, then the former stale value will be returned.
If called with forceRefresh, then the cached item will be re-fetched, even if it is not stale. However, if allowStale is also set, then the old value will still be returned. This is useful in cases where you want to force a reload of a cached value. If a background fetch is already in progress, then forceRefresh has no effect.
If multiple fetches for the same key are issued, then they will all be coalesced into a single call to fetchMethod.
Note that this means that handling options such as LRUCache.OptionsBase.allowStaleOnFetchAbort, LRUCache.FetchOptions.signal, and LRUCache.OptionsBase.allowStaleOnFetchRejection will be determined by the FIRST fetch() call for a given key.
This is a known (fixable) shortcoming which will be addressed when someone complains about it, as the fix would involve added complexity and may not be worth the cost for this edge case.
If LRUCache.OptionsBase.fetchMethod is not specified, then this is effectively an alias for Promise.resolve(cache.get(key)).
When the fetch method resolves to a value, if the fetch has not been aborted due to deletion, eviction, or being overwritten, then it is added to the cache using the options provided.
If the key is evicted or deleted before the fetchMethod resolves, then the AbortSignal passed to the fetchMethod will receive an abort event, and the promise returned by fetch() will reject with the reason for the abort.
If a signal is passed to the fetch() call, then aborting the signal will abort the fetch and cause the fetch() promise to reject with the reason provided.
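The abort behavior can be illustrated with a minimal self-contained sketch (not the library's implementation): a fetch that listens on an AbortSignal and rejects with the abort reason when the caller aborts.

```typescript
// Sketch of signal-driven abort, assuming only the standard
// AbortController/AbortSignal API (no lru-cache dependency).
const fetchWithSignal = (signal: AbortSignal): Promise<string> =>
  new Promise((resolve, reject) => {
    // Reject with the reason provided to abort(), as described above.
    signal.addEventListener('abort', () => reject(signal.reason), { once: true });
    // Stand-in for slow work that the abort will preempt.
    setTimeout(() => resolve('value'), 50);
  });

const ac = new AbortController();
const pending = fetchWithSignal(ac.signal);
ac.abort(new Error('no longer needed'));

let reason = '';
try {
  await pending;
} catch (err) {
  reason = (err as Error).message;
}
// reason is now 'no longer needed'
```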
Setting context
If an FC type is set to a type other than unknown, void, or undefined in the LRUCache constructor, then all calls to cache.fetch() must provide a context option. If set to undefined or void, then calls to fetch must not provide a context option.
The context param allows you to provide arbitrary data that might be relevant in the course of fetching the data. It is only relevant for the course of a single fetch() operation, and is discarded afterwards.
Note: fetch() calls are inflight-unique
If you call fetch() multiple times with the same key value, then every call after the first will resolve on the same promise¹, even if they have different settings that would otherwise change the behavior of the fetch, such as noDeleteOnFetchRejection or ignoreFetchAbort.
In most cases, this is not a problem (in fact, only fetching something once is what you probably want, if you're caching in the first place). If you are changing the fetch() options dramatically between runs, there's a good chance that you might be trying to fit divergent semantics into a single object, and would be better off with multiple cache instances.
1: That is, they're not the "same Promise", but they resolve at the same time, because they're both waiting on the same underlying fetchMethod response.
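The inflight-unique behavior can be sketched in a few lines (a simplification, not lru-cache's implementation): a map of pending promises means every caller after the first for a given key waits on the same underlying fetchMethod call.

```typescript
// Minimal sketch of inflight-coalesced fetches: the names here
// (CoalescingFetcher, Fetcher) are illustrative, not library API.
type Fetcher<K, V> = (key: K) => Promise<V>;

class CoalescingFetcher<K, V> {
  #inflight = new Map<K, Promise<V>>();
  constructor(private fetchMethod: Fetcher<K, V>) {}

  fetch(key: K): Promise<V> {
    // Reuse the pending promise if one exists, so fetchMethod runs
    // at most once per key at a time.
    let p = this.#inflight.get(key);
    if (!p) {
      p = this.fetchMethod(key).finally(() => this.#inflight.delete(key));
      this.#inflight.set(key, p);
    }
    return p;
  }
}

// Demo: two concurrent fetches for the same key hit fetchMethod once.
let calls = 0;
const f = new CoalescingFetcher<string, string>(async (key) => {
  calls++;
  return `value-for-${key}`;
});
const [a, b] = await Promise.all([f.fetch('k'), f.fetch('k')]);
```

This is also why the options of the FIRST call win: by the time the second call arrives, the work is already in flight with the first call's settings.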
Call the supplied function on each item in the cache, in order from most recently used to least recently used.
fn is called as fn(value, key, cache).
If thisp is provided, the function will be called in the this-context of the provided object, or the cache if no thisp object is provided.
Does not update age or recency of use, or iterate over stale values.
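The most-recent-first iteration order can be demonstrated with a small self-contained sketch (an array kept in recency order, not the library's internal data structure):

```typescript
// Illustrative sketch only: RecencyList is a hypothetical name, not
// part of lru-cache. It models "get promotes, forEach walks MRU->LRU".
class RecencyList<K, V> {
  #order: K[] = [];           // front = most recently used
  #values = new Map<K, V>();

  set(key: K, value: V): void {
    this.#touch(key);
    this.#values.set(key, value);
  }
  get(key: K): V | undefined {
    if (this.#values.has(key)) this.#touch(key);
    return this.#values.get(key);
  }
  #touch(key: K): void {
    const i = this.#order.indexOf(key);
    if (i !== -1) this.#order.splice(i, 1);
    this.#order.unshift(key);
  }
  // Like LRUCache#forEach: fn(value, key, cache), most recently used
  // first, without updating recency.
  forEach(fn: (value: V, key: K, cache: this) => void): void {
    for (const key of this.#order) fn(this.#values.get(key)!, key, this);
  }
}

const c = new RecencyList<string, number>();
c.set('a', 1);
c.set('b', 2);
c.get('a'); // 'a' becomes most recently used
const seen: string[] = [];
c.forEach((_v, k) => seen.push(k));
// seen is now ['a', 'b']
```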
In some cases, cache.fetch() may resolve to undefined, either because a LRUCache.OptionsBase#fetchMethod was not provided (turning cache.fetch(k) into just an async wrapper around cache.get(k)) or because ignoreFetchAbort was specified (either to the constructor or in the LRUCache.FetchOptions). Also, the LRUCache.OptionsBase.fetchMethod may return undefined or void, making the test even more complicated.
Because inferring the cases where undefined might be returned is so cumbersome, but testing for undefined can also be annoying, this method can be used, which will reject if this.fetch() resolves to undefined.
Return the number of ms left in the item's TTL. If the item is not in the cache, returns 0. Returns Infinity if the item is in the cache without a defined TTL.
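The calculation can be sketched as follows (a simplified model, not the library's code, assuming entries record a start timestamp and an optional ttl in milliseconds):

```typescript
// Hypothetical Entry shape for illustration only.
interface Entry {
  start: number; // Date.now() timestamp when the entry was set
  ttl?: number;  // time to live in ms; absent means no TTL
}

const getRemainingTTL = (entries: Map<string, Entry>, key: string): number => {
  const entry = entries.get(key);
  if (!entry) return 0;                         // not in cache -> 0
  if (entry.ttl === undefined) return Infinity; // no TTL -> Infinity
  return entry.start + entry.ttl - Date.now();  // ms left (may be negative)
};

const entries = new Map<string, Entry>([
  ['no-ttl', { start: Date.now() }],
  ['fresh', { start: Date.now(), ttl: 60_000 }],
]);
const noTtl = getRemainingTTL(entries, 'no-ttl');    // Infinity
const missing = getRemainingTTL(entries, 'missing'); // 0
const fresh = getRemainingTTL(entries, 'fresh');     // close to 60000
```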
Check if a key is in the cache, without updating the recency of use. Age is updated if LRUCache.OptionsBase.updateAgeOnHas is set to true in either the options or the constructor.
Will return false if the item is stale, even though it is technically in the cache. The difference can be determined (if it matters) by using a status argument and inspecting the has field.
Get the extended info about a given entry, to get its value, size, and TTL info simultaneously. Returns undefined if the key is not present.
Unlike LRUCache#dump, which is designed to be portable and survive serialization, the start value is always the current timestamp, and the ttl is a calculated remaining time to live (negative if expired).
Always returns stale values, if their info is found in the cache, so be sure to check for expirations (i.e., a negative LRUCache.Entry#ttl) if relevant.
Reset the cache and load in the items in entries in the order listed.
The shape of the resulting cache may be different if the same options are not used in both caches.
The start fields are assumed to be calculated relative to a portable Date.now() timestamp, even if performance.now() is available.
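A dump/load round trip can be sketched with a self-contained miniature cache (MiniCache is a hypothetical name for illustration, not library API): because dump() returns a plain array of tuples with Date.now()-relative start fields, the entries survive JSON transport.

```typescript
// Illustrative sketch of portable dump/load semantics.
interface Entry<V> {
  value: V;
  start: number; // portable Date.now() timestamp
  ttl?: number;
}

class MiniCache<K, V> {
  #data = new Map<K, Entry<V>>();

  set(key: K, value: V, ttl?: number): void {
    this.#data.set(key, { value, start: Date.now(), ttl });
  }
  get(key: K): V | undefined {
    return this.#data.get(key)?.value;
  }
  // Returns a real array of [key, entry] tuples, like LRUCache#dump,
  // so it can be serialized and passed around.
  dump(): [K, Entry<V>][] {
    return [...this.#data];
  }
  // Resets the cache and loads the entries in the order listed.
  load(entries: [K, Entry<V>][]): void {
    this.#data = new Map(entries);
  }
}

const src = new MiniCache<string, number>();
src.set('x', 1, 60_000);
src.set('y', 2);

// JSON round trip, as you might do to persist a cache to disk.
const wire = JSON.stringify(src.dump());
const dst = new MiniCache<string, number>();
dst.load(JSON.parse(wire));
// dst.get('x') === 1, dst.get('y') === 2
```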
If the key is found in the cache, then this is equivalent to LRUCache#get. If not in the cache, then calculate the value using the LRUCache.OptionsBase.memoMethod, and add it to the cache.
If an FC type is set to a type other than unknown, void, or undefined in the LRUCache constructor, then all calls to cache.memo() must provide a context option. If set to undefined or void, then calls to memo must not provide a context option.
The context param allows you to provide arbitrary data that might be relevant in the course of fetching the data. It is only relevant for the course of a single memo() operation, and is discarded afterwards.
Like LRUCache#get but doesn't update recency or delete stale items.
Returns undefined if the item is stale, unless LRUCache.OptionsBase.allowStale is set.
Inverse order version of LRUCache.entries
Return a generator yielding [key, value] pairs, in order from least recently used to most recently used.
Inverse order version of LRUCache.keys
Return a generator yielding the keys in the cache, in order from least recently used to most recently used.
Inverse order version of LRUCache.values
Return a generator yielding the values in the cache, in order from least recently used to most recently used.
Add a value to the cache.
Note: if undefined is specified as a value, this is an alias for LRUCache#delete.
Fields on the LRUCache.SetOptions options param will override their corresponding values in the constructor options for the scope of this single set() operation.
If start is provided, then that will set the effective start time for the TTL calculation. Note that this must be a previous value of performance.now() if supported, or a previous value of Date.now() if not.
The options object may also include size, which will prevent calling the sizeCalculation function and just use the specified number if it is a positive integer, and noDisposeOnSet, which will prevent calling a dispose function in the case of overwrites.
If the size (or return value of sizeCalculation) for a given entry is greater than maxEntrySize, then the item will not be added to the cache.
Will update the recency of the entry.
If the value is undefined, then this is an alias for cache.delete(key). undefined is never stored in the cache.
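The "undefined is never stored" rule can be sketched in a few lines (a simplified model, not the library's set() implementation):

```typescript
// Sketch: setting undefined acts as a delete, so a later has() check
// cannot be fooled by a stored undefined value.
const store = new Map<string, unknown>();

const set = (key: string, value: unknown): void => {
  if (value === undefined) {
    store.delete(key); // alias for cache.delete(key)
  } else {
    store.set(key, value);
  }
};

set('a', 1);
const hadA = store.has('a');  // true
set('a', undefined);          // removes 'a' instead of storing undefined
const hasA = store.has('a');  // false
```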
Static
unsafeInternal
Do not call this method unless you need to inspect the inner workings of the cache. If anything returned by this object is modified in any way, strange breakage may occur.
These fields are private for a reason!
Default export, the thing you're using this module to get.
The K and V types define the key and value types, respectively. The optional FC type defines the type of the context object passed to cache.fetch() and cache.memo(). Keys and values must not be null or undefined.
All properties from the options object (with the exception of max, maxSize, fetchMethod, memoMethod, dispose, and disposeAfter) are added as normal public members. (The listed options are read-only getters.) Changing any of these will alter the defaults for subsequent method calls.