Interface OptionsTTLLimit<K, V, FC>

Options which may be passed to the LRUCache constructor.

Most of these may be overridden by options passed to the various methods that use them.

Despite all being technically optional, the constructor requires that a cache is at minimum limited by one or more of OptionsBase.max, OptionsBase.ttl, or OptionsBase.maxSize.

If OptionsBase.ttl is used alone, then it is strongly advised (and in fact required by the type definitions here) that the cache also set OptionsBase.ttlAutopurge, to prevent potentially unbounded storage.

All options are also available on the LRUCache instance, making it safe to pass an LRUCache instance as the options argument to make another empty cache of the same type.

Some options are marked as read-only, because changing them after instantiation is not safe. Changing any of the other options will of course only have an effect on subsequent method calls.

interface OptionsTTLLimit<K, V, FC> {
    allowStale?: boolean;
    allowStaleOnFetchAbort?: boolean;
    allowStaleOnFetchRejection?: boolean;
    dispose?: Disposer<K, V>;
    disposeAfter?: Disposer<K, V>;
    fetchMethod?: Fetcher<K, V, FC>;
    ignoreFetchAbort?: boolean;
    max?: number;
    maxEntrySize?: number;
    maxSize?: number;
    memoMethod?: Memoizer<K, V, FC>;
    noDeleteOnFetchRejection?: boolean;
    noDeleteOnStaleGet?: boolean;
    noDisposeOnSet?: boolean;
    noUpdateTTL?: boolean;
    sizeCalculation?: SizeCalculator<K, V>;
    ttl: number;
    ttlAutopurge: boolean;
    ttlResolution?: number;
    updateAgeOnGet?: boolean;
    updateAgeOnHas?: boolean;
}

Type Parameters

  • K
  • V
  • FC

Properties

allowStale?: boolean

Allow LRUCache#get and LRUCache#fetch calls to return stale data, if available.

By default, if you set a ttl, stale items will only be deleted from the cache when you get(key). That is, the cache does not preemptively prune expired items unless OptionsBase.ttlAutopurge is set.

If you set allowStale:true, get() will return the stale value and then delete the entry. If you don't set it, get() will return undefined for a stale entry.

Note that when a stale entry is fetched, even if it is returned due to allowStale being set, it is removed from the cache immediately. You can suppress this behavior by setting OptionsBase.noDeleteOnStaleGet, either in the constructor, or in the options provided to LRUCache#get.

This may be overridden by passing an options object to cache.get(). The cache.has() method will always return false for stale items.

Only relevant if a ttl is set.

allowStaleOnFetchAbort?: boolean

Set to true to return a stale value from the cache when the AbortSignal passed to the OptionsBase.fetchMethod dispatches an 'abort' event, whether user-triggered, or due to internal cache behavior.

Unless OptionsBase.ignoreFetchAbort is also set, the underlying OptionsBase.fetchMethod will still be considered canceled, and any value it returns will be ignored and not cached.

Caveat: since fetches are aborted when a new value is explicitly set in the cache, this can lead to fetch returning a stale value, since that was the fallback value at the moment the fetch() was initiated, even though the new updated value is now present in the cache.

For example:

```typescript
const cache = new LRUCache<string, any>({
  ttl: 100,
  fetchMethod: async (url, oldValue, { signal }) => {
    const res = await fetch(url, { signal })
    return await res.json()
  },
})
cache.set('https://example.com/', { some: 'data' })
// 100ms go by...
const result = cache.fetch('https://example.com/')
cache.set('https://example.com/', { other: 'thing' })
console.log(await result) // { some: 'data' }
console.log(cache.get('https://example.com/')) // { other: 'thing' }
```

allowStaleOnFetchRejection?: boolean

Set to true to allow returning stale data when a OptionsBase.fetchMethod throws an error or returns a rejected promise.

This differs from using OptionsBase.allowStale in that stale data will ONLY be returned in the case that the LRUCache#fetch fails, not any other times.

If a fetchMethod fails and there is no stale value available, the fetch() will resolve to undefined. That is, all fetchMethod errors are suppressed.

Implies noDeleteOnFetchRejection.

This may be set in calls to fetch(), or defaulted on the constructor, or overridden by modifying the options object in the fetchMethod.

dispose?: Disposer<K, V>

Function that is called on items when they are dropped from the cache, as dispose(value, key, reason).

This can be handy if you want to close file descriptors or do other cleanup tasks when items are no longer stored in the cache.

NOTE: It is called before the item has been fully removed from the cache, so if you want to put it right back in, you need to wait until the next tick. If you try to add it back in during the dispose() function call, it will break things in subtle and weird ways.

Unlike several other options, this may not be overridden by passing an option to set(), for performance reasons.

The reason will be one of the following strings, corresponding to the reason for the item's deletion:

  • evict Item was evicted to make space for a new addition
  • set Item was overwritten by a new value
  • expire Item expired its TTL
  • fetch Item was deleted due to a failed or aborted fetch, or a fetchMethod returning undefined.
  • delete Item was removed by explicit cache.delete(key), cache.clear(), or cache.set(key, undefined).

disposeAfter?: Disposer<K, V>

The same as OptionsBase.dispose, but called after the entry is completely removed and the cache is once again in a clean state.

It is safe to add an item right back into the cache at this point. However, note that it is very easy to inadvertently create infinite recursion this way.

fetchMethod?: Fetcher<K, V, FC>

Method that provides the implementation for LRUCache#fetch

fetchMethod(key, staleValue, { signal, options, context })

If fetchMethod is not provided, then cache.fetch(key) is equivalent to Promise.resolve(cache.get(key)).

If at any time, signal.aborted is set to true, or if the signal.onabort method is called, or if it emits an 'abort' event which you can listen to with addEventListener, then that means that the fetch should be abandoned. This may be passed along to async functions aware of AbortController/AbortSignal behavior.

The fetchMethod should only return undefined or a Promise resolving to undefined if the AbortController signaled an abort event. In all other cases, it should return or resolve to a value suitable for adding to the cache.

The options object is a union of the options that may be provided to set() and get(). If they are modified, then that will result in modifying the settings to cache.set() when the value is resolved, and in the case of OptionsBase.noDeleteOnFetchRejection and OptionsBase.allowStaleOnFetchRejection, the handling of fetchMethod failures.

For example, a DNS cache may update the TTL based on the value returned from a remote DNS server by changing options.ttl in the fetchMethod.

ignoreFetchAbort?: boolean

Set to true to ignore the abort event emitted by the AbortSignal object passed to OptionsBase.fetchMethod, and still cache the resulting resolution value, as long as it is not undefined.

When used on its own, this means aborted LRUCache#fetch calls are not immediately resolved or rejected when they are aborted, and instead take the full time to await.

When used with OptionsBase.allowStaleOnFetchAbort, aborted LRUCache#fetch calls will resolve immediately to their stale cached value or undefined, and will continue to process and eventually update the cache when they resolve, as long as the resulting value is not undefined, thus supporting a "return stale on timeout while refreshing" mechanism by passing AbortSignal.timeout(n) as the signal.

For example:

```typescript
const c = new LRUCache({
  ttl: 100,
  ignoreFetchAbort: true,
  allowStaleOnFetchAbort: true,
  fetchMethod: async (key, oldValue, { signal }) => {
    // note: do NOT pass the signal to fetch()!
    // let's say this fetch can take a long time.
    const res = await fetch(`https://slow-backend-server/${key}`)
    return await res.json()
  },
})

// this will return the stale value after 100ms, while still
// updating in the background for next time.
const val = await c.fetch('key', { signal: AbortSignal.timeout(100) })
```

Note: regardless of this setting, an abort event is still emitted on the AbortSignal object, which may produce invalid results in other underlying APIs that consume the same signal.

This may be overridden in the OptionsBase.fetchMethod or the call to LRUCache#fetch.

max?: number

The maximum number of items to store in the cache before evicting old entries. This is read-only on the LRUCache instance, and may not be overridden.

If set, then storage space will be pre-allocated at construction time, and the cache will perform significantly faster.

Note that significantly fewer items may be stored, if OptionsBase.maxSize and/or OptionsBase.ttl are also set.

It is strongly recommended to set a max to prevent unbounded growth of the cache.

maxEntrySize?: number

The maximum allowed size for any single item in the cache.

If a larger item is passed to LRUCache#set or returned by a OptionsBase.fetchMethod or OptionsBase.memoMethod, then it will not be stored in the cache.

Attempting to add an item whose calculated size is greater than this amount will not cache the item or evict any old items, but WILL delete an existing value if one is already present.

Optional, must be a positive integer if provided. Defaults to the value of maxSize if provided.

maxSize?: number

Set to a positive integer to track the sizes of items added to the cache, and automatically evict items in order to stay below this size. Note that this may result in fewer than max items being stored.

Attempting to add an item to the cache whose calculated size is greater than this amount will be a no-op. The item will not be cached, and no other items will be evicted.

Optional, must be a positive integer if provided.

Sets maxEntrySize to the same value, unless a different value is provided for maxEntrySize.

At least one of max, maxSize, or TTL is required. This must be a positive integer if set.

Even if size tracking is enabled, it is strongly recommended to set a max to prevent unbounded growth of the cache.

Note also that size tracking can negatively impact performance, though for most cases, only minimally.

memoMethod?: Memoizer<K, V, FC>

Method that provides the implementation for LRUCache#memo

noDeleteOnFetchRejection?: boolean

Set to true to suppress the deletion of stale data when a OptionsBase.fetchMethod returns a rejected promise.

noDeleteOnStaleGet?: boolean

Do not delete stale items when they are retrieved with LRUCache#get.

Note that the get return value will still be undefined unless OptionsBase.allowStale is true.

When using time-expiring entries with ttl, by default stale items will be removed from the cache when the key is accessed with cache.get().

Setting this option will cause stale items to remain in the cache, until they are explicitly deleted with cache.delete(key), or retrieved with noDeleteOnStaleGet set to false.

This may be overridden by passing an options object to cache.get().

Only relevant if a ttl is used.

noDisposeOnSet?: boolean

Set to true to suppress calling the OptionsBase.dispose function if the entry key is still accessible within the cache.

This may be overridden by passing an options object to LRUCache#set.

Only relevant if dispose or disposeAfter are set.

noUpdateTTL?: boolean

Boolean flag to tell the cache to not update the TTL when setting a new value for an existing key (i.e., when updating a value rather than inserting a new one). Note that the TTL value is always set (if provided) when adding a new entry into the cache.

Has no effect if a OptionsBase.ttl is not set.

May be passed as an option to LRUCache#set.

sizeCalculation?: SizeCalculator<K, V>

A function that returns a number indicating the item's size.

Requires OptionsBase.maxSize to be set.

If not provided, and OptionsBase.maxSize or OptionsBase.maxEntrySize are set, then all LRUCache#set calls must provide an explicit SetOptions.size or sizeCalculation param.

ttl: number

Max time in milliseconds for items to live in cache before they are considered stale. Note that stale items are NOT preemptively removed by default, and MAY live in the cache, contributing to its LRU max, long after they have expired, unless OptionsBase.ttlAutopurge is set.

If set to 0 (the default value), then that means "do not track TTL", not "expire immediately".

Also, as this cache is optimized for LRU/MRU operations, some of the staleness/TTL checks will reduce performance, as they will incur overhead by deleting items.

This is not primarily a TTL cache, and does not make strong TTL guarantees. There is no pre-emptive pruning of expired items, but you may set a TTL on the cache, and it will treat expired items as missing when they are fetched, and delete them.

Optional, but must be a non-negative integer in ms if specified.

This may be overridden by passing an options object to cache.set().

At least one of max, maxSize, or TTL is required. This must be a positive integer if set.

Even if ttl tracking is enabled, it is strongly recommended to set a max to prevent unbounded growth of the cache.

If ttl tracking is enabled, and max and maxSize are not set, and ttlAutopurge is not set, then a warning will be emitted cautioning about the potential for unbounded memory consumption. (The TypeScript definitions will also discourage this.)

ttlAutopurge: boolean

Preemptively remove stale items from the cache.

Note that this may significantly degrade performance, especially if the cache is storing a large number of items. It is almost always best to just leave the stale items in the cache, and let them fall out as new items are added.

Note that this means that OptionsBase.allowStale is a bit pointless, as stale items will be deleted almost as soon as they expire.

Use with caution!

ttlResolution?: number

Minimum amount of time in ms between checks for staleness. Defaults to 1, which means that the current time is checked at most once per millisecond.

Set to 0 to check the current time every time staleness is tested. (This reduces performance, and is theoretically unnecessary.)

Setting this to a higher value will improve performance somewhat while using ttl tracking, albeit at the expense of keeping stale items around a bit longer than their TTLs would indicate.

updateAgeOnGet?: boolean

When using time-expiring entries with ttl, setting this to true will make each item's age reset to 0 whenever it is retrieved from cache with LRUCache#get, causing it to not expire. (It can still fall out of cache based on recency of use, of course.)

Has no effect if OptionsBase.ttl is not set.

This may be overridden by passing an options object to cache.get().

updateAgeOnHas?: boolean

When using time-expiring entries with ttl, setting this to true will make each item's age reset to 0 whenever its presence in the cache is checked with LRUCache#has, causing it to not expire. (It can still fall out of cache based on recency of use, of course.)

Has no effect if OptionsBase.ttl is not set.