Class LRUCache<K, V, FC>

Default export, the thing you're using this module to get.

The K and V types define the key and value types, respectively. The optional FC type defines the type of the context object passed to cache.fetch() and cache.memo().

Keys and values must not be null or undefined.

All properties from the options object (with the exception of max, maxSize, fetchMethod, memoMethod, dispose and disposeAfter) are added as normal public members. (The listed options are read-only getters.)

Changing any of these will alter the defaults for subsequent method calls.

Type Parameters

  • K extends {}
  • V extends {}
  • FC = unknown

Constructors

Properties

[toStringTag]: string = 'LRUCache'

A String value that is used in the creation of the default string description of an object. Called by the built-in method Object.prototype.toString.

allowStale: boolean
allowStaleOnFetchAbort: boolean
allowStaleOnFetchRejection: boolean
ignoreFetchAbort: boolean
maxEntrySize: number
noDeleteOnFetchRejection: boolean
noDeleteOnStaleGet: boolean
noDisposeOnSet: boolean
noUpdateTTL: boolean
sizeCalculation?: SizeCalculator<K, V>
ttl: number
ttlAutopurge: boolean
ttlResolution: number
updateAgeOnGet: boolean
updateAgeOnHas: boolean

Accessors

  • get calculatedSize(): number
  • The total computed size of items in the cache (read-only)

    Returns number

  • get size(): number
  • The number of items stored in the cache (read-only)

    Returns number

Methods

  • Iterating over the cache itself yields the same results as LRUCache.entries

    Returns Generator<[K, V], void, unknown>

  • Clear the cache entirely, throwing away all values.

    Returns void

  • Deletes a key out of the cache.

    Returns true if the key was deleted, false otherwise.

    Parameters

    Returns boolean

  • Return an array of [key, LRUCache.Entry] tuples which can be passed to LRUCache#load.

    The start fields are calculated relative to a portable Date.now() timestamp, even if performance.now() is available.

    Stale entries are always included in the dump, even if LRUCache.OptionsBase.allowStale is false.

    Note: this returns an actual array, not a generator, so it can be more easily passed around.

    Returns [K, Entry<V>][]

  • Return a generator yielding [key, value] pairs, in order from most recently used to least recently used.

    Returns Generator<[K, V], void, unknown>

  • Make an asynchronous cached fetch using the LRUCache.OptionsBase.fetchMethod function.

    If the value is in the cache and not stale, then the returned Promise resolves to the value.

    If not in the cache, or beyond its TTL staleness, then fetchMethod(key, staleValue, { options, signal, context }) is called, and the value returned will be added to the cache once resolved.

    If called with allowStale, and an asynchronous fetch is currently in progress to reload a stale value, then the former stale value will be returned.

    If called with forceRefresh, then the cached item will be re-fetched, even if it is not stale. However, if allowStale is also set, then the old value will still be returned. This is useful in cases where you want to force a reload of a cached value. If a background fetch is already in progress, then forceRefresh has no effect.

    If multiple fetches for the same key are issued, then they will all be coalesced into a single call to fetchMethod.

    Note that this means that handling options such as LRUCache.OptionsBase.allowStaleOnFetchAbort, LRUCache.FetchOptions.signal, and LRUCache.OptionsBase.allowStaleOnFetchRejection will be determined by the FIRST fetch() call for a given key.

    This is a known (fixable) shortcoming which will be addressed when someone complains about it, as the fix would involve added complexity and may not be worth the cost for this edge case.

    If LRUCache.OptionsBase.fetchMethod is not specified, then this is effectively an alias for Promise.resolve(cache.get(key)).

    When the fetch method resolves to a value, if the fetch has not been aborted due to deletion, eviction, or being overwritten, then it is added to the cache using the options provided.

    If the key is evicted or deleted before the fetchMethod resolves, then the AbortSignal passed to the fetchMethod will receive an abort event, and the promise returned by fetch() will reject with the reason for the abort.

    If a signal is passed to the fetch() call, then aborting the signal will abort the fetch and cause the fetch() promise to reject with the reason provided.

    Setting context

    If an FC type is set to a type other than unknown, void, or undefined in the LRUCache constructor, then all calls to cache.fetch() must provide a context option. If set to undefined or void, then calls to fetch must not provide a context option.

    The context param allows you to provide arbitrary data that might be relevant in the course of fetching the data. It is only relevant for the course of a single fetch() operation, and discarded afterwards.

    Note: fetch() calls are inflight-unique

    If you call fetch() multiple times with the same key value, then every call after the first will resolve on the same promise¹, even if they have different settings that would otherwise change the behavior of the fetch, such as noDeleteOnFetchRejection or ignoreFetchAbort.

    In most cases, this is not a problem (in fact, only fetching something once is what you probably want, if you're caching in the first place). If you are changing the fetch() options dramatically between runs, there's a good chance that you might be trying to fit divergent semantics into a single object, and would be better off with multiple cache instances.

    ¹ That is, they're not the "same Promise", but they resolve at the same time, because they're both waiting on the same underlying fetchMethod response.

    Parameters

    Returns Promise<undefined | V>

  • Parameters

    • k: unknown extends FC
          ? K
          : FC extends undefined | void
              ? K
              : never
    • Optional fetchOptions: unknown extends FC
          ? FetchOptions<K, V, FC>
          : FC extends undefined | void
              ? FetchOptionsNoContext<K, V>
              : never

    Returns Promise<undefined | V>

  • Find a value for which the supplied fn method returns a truthy value, similar to Array.find(). fn is called as fn(value, key, cache).

    Parameters

    Returns undefined | V

  • Call the supplied function on each item in the cache, in order from most recently used to least recently used.

    fn is called as fn(value, key, cache).

    If thisp is provided, function will be called in the this-context of the provided object, or the cache if no thisp object is provided.

    Does not update age or recency of use, and does not iterate over stale values.

    Parameters

    Returns void

  • In some cases, cache.fetch() may resolve to undefined, either because a LRUCache.OptionsBase#fetchMethod was not provided (turning cache.fetch(k) into just an async wrapper around cache.get(k)) or because ignoreFetchAbort was specified (either to the constructor or in the LRUCache.FetchOptions). Also, the LRUCache.OptionsBase.fetchMethod may return undefined or void, making the test even more complicated.

    Because inferring the cases where undefined might be returned is so cumbersome, and testing for undefined can be annoying, this method can be used instead: it will reject if this.fetch() resolves to undefined.

    Parameters

    Returns Promise<V>

  • Parameters

    • k: unknown extends FC
          ? K
          : FC extends undefined | void
              ? K
              : never
    • Optional fetchOptions: unknown extends FC
          ? FetchOptions<K, V, FC>
          : FC extends undefined | void
              ? FetchOptionsNoContext<K, V>
              : never

    Returns Promise<V>

  • Return a value from the cache. Will update the recency of the cache entry found.

    If the key is not found, get() will return undefined.

    Parameters

    Returns undefined | V

  • Return the number of ms left in the item's TTL. If item is not in cache, returns 0. Returns Infinity if item is in cache without a defined TTL.

    Parameters

    • key: K

    Returns number

  • Check if a key is in the cache, without updating the recency of use. Age is updated only if LRUCache.OptionsBase.updateAgeOnHas is set to true in either the options or the constructor.

    Will return false if the item is stale, even though it is technically in the cache. The difference can be determined (if it matters) by using a status argument and inspecting the has field.

    Parameters

    Returns boolean

  • Get the extended info about a given entry, to get its value, size, and TTL info simultaneously. Returns undefined if the key is not present.

    Unlike LRUCache#dump, which is designed to be portable and survive serialization, the start value is always the current timestamp, and the ttl is a calculated remaining time to live (negative if expired).

    Always returns stale values, if their info is found in the cache, so be sure to check for expirations (ie, a negative LRUCache.Entry#ttl) if relevant.

    Parameters

    • key: K

    Returns undefined | Entry<V>

  • Return a generator yielding the keys in the cache, in order from most recently used to least recently used.

    Returns Generator<K, void, unknown>

  • Reset the cache and load in the items in entries in the order listed.

    The shape of the resulting cache may be different if the same options are not used in both caches.

    The start fields are assumed to be calculated relative to a portable Date.now() timestamp, even if performance.now() is available.

    Parameters

    Returns void

  • If the key is found in the cache, then this is equivalent to LRUCache#get. If not in the cache, the value is calculated using the LRUCache.OptionsBase.memoMethod and added to the cache.

    If an FC type is set to a type other than unknown, void, or undefined in the LRUCache constructor, then all calls to cache.memo() must provide a context option. If set to undefined or void, then calls to memo must not provide a context option.

    The context param allows you to provide arbitrary data that might be relevant in the course of fetching the data. It is only relevant for the course of a single memo() operation, and discarded afterwards.

    Parameters

    Returns V

  • Parameters

    • k: unknown extends FC
          ? K
          : FC extends undefined | void
              ? K
              : never
    • Optional memoOptions: unknown extends FC
          ? MemoOptions<K, V, FC>
          : FC extends undefined | void
              ? MemoOptionsNoContext<K, V>
              : never

    Returns V

  • Evict the least recently used item, returning its value or undefined if cache is empty.

    Returns undefined | V

  • Delete any stale entries. Returns true if anything was removed, false otherwise.

    Returns boolean

  • Inverse order version of LRUCache.entries

    Return a generator yielding [key, value] pairs, in order from least recently used to most recently used.

    Returns Generator<(K | V)[], void, unknown>

  • The same as LRUCache.forEach but items are iterated over in reverse order (i.e., least recently used items are iterated over first).

    Parameters

    Returns void

  • Inverse order version of LRUCache.keys

    Return a generator yielding the keys in the cache, in order from least recently used to most recently used.

    Returns Generator<K, void, unknown>

  • Inverse order version of LRUCache.values

    Return a generator yielding the values in the cache, in order from least recently used to most recently used.

    Returns Generator<undefined | V, void, unknown>

  • Add a value to the cache.

    Fields on the LRUCache.SetOptions options param will override their corresponding values in the constructor options for the scope of this single set() operation.

    If start is provided, then that will set the effective start time for the TTL calculation. Note that this must be a previous value of performance.now() if supported, or a previous value of Date.now() if not.

    Options object may also include size, which will prevent calling the sizeCalculation function and just use the specified number if it is a positive integer, and noDisposeOnSet which will prevent calling a dispose function in the case of overwrites.

    If the size (or return value of sizeCalculation) for a given entry is greater than maxEntrySize, then the item will not be added to the cache.

    Will update the recency of the entry.

    If the value is undefined, then this is an alias for cache.delete(key). undefined is never stored in the cache.

    Parameters

    Returns LRUCache<K, V, FC>

  • Return a generator yielding the values in the cache, in order from most recently used to least recently used.

    Returns Generator<V, void, unknown>

  • Internal

    Do not call this method unless you need to inspect the inner workings of the cache. If anything returned by this object is modified in any way, strange breakage may occur.

    These fields are private for a reason!

    Type Parameters

    • K extends {}
    • V extends {}
    • FC extends unknown = unknown

    Parameters

    Returns {
        backgroundFetch: ((k: K, index: undefined | number, options: FetchOptions<K, V, FC>, context: any) => BackgroundFetch<V>);
        free: StackLike;
        indexes: ((options?: {
            allowStale: boolean;
        }) => Generator<Index, void, unknown>);
        isBackgroundFetch: ((p: any) => p is BackgroundFetch<V>);
        isStale: ((index: undefined | number) => boolean);
        keyList: (undefined | K)[];
        keyMap: Map<K, number>;
        moveToTail: ((index: number) => void);
        next: NumberArray;
        prev: NumberArray;
        rindexes: ((options?: {
            allowStale: boolean;
        }) => Generator<Index, void, unknown>);
        sizes: undefined | ZeroArray;
        starts: undefined | ZeroArray;
        ttls: undefined | ZeroArray;
        valList: (undefined | V | BackgroundFetch<V>)[];
        get head(): Index;
        get tail(): Index;
    }

    • backgroundFetch: ((k: K, index: undefined | number, options: FetchOptions<K, V, FC>, context: any) => BackgroundFetch<V>)
    • free: StackLike
    • indexes: ((options?: {
          allowStale: boolean;
      }) => Generator<Index, void, unknown>)
        • (options?): Generator<Index, void, unknown>
        • Parameters

          • Optional options: {
                allowStale: boolean;
            }
            • allowStale: boolean

          Returns Generator<Index, void, unknown>

    • isBackgroundFetch: ((p: any) => p is BackgroundFetch<V>)
    • isStale: ((index: undefined | number) => boolean)
        • (index): boolean
        • Parameters

          • index: undefined | number

          Returns boolean

    • keyList: (undefined | K)[]
    • keyMap: Map<K, number>
    • moveToTail: ((index: number) => void)
        • (index): void
        • Parameters

          • index: number

          Returns void

    • next: NumberArray
    • prev: NumberArray
    • rindexes: ((options?: {
          allowStale: boolean;
      }) => Generator<Index, void, unknown>)
        • (options?): Generator<Index, void, unknown>
        • Parameters

          • Optional options: {
                allowStale: boolean;
            }
            • allowStale: boolean

          Returns Generator<Index, void, unknown>

    • sizes: undefined | ZeroArray
    • starts: undefined | ZeroArray
    • ttls: undefined | ZeroArray
    • valList: (undefined | V | BackgroundFetch<V>)[]
    • get head(): Index
    • get tail(): Index