Interface FetcherFetchOptions<K, V, FC>

Options which override those set in the LRUCache constructor when calling LRUCache#fetch.

This is the union of GetOptions and SetOptions, plus OptionsBase.noDeleteOnFetchRejection, OptionsBase.allowStaleOnFetchRejection, FetchOptions.forceRefresh, and FetcherOptions.context.

Any of these may be modified in the OptionsBase.fetchMethod function, but the GetOptions fields will of course have no effect, as the LRUCache#get call already happened by the time the fetchMethod is called.

interface FetcherFetchOptions<K, V, FC> {
    allowStale?: boolean;
    allowStaleOnFetchAbort?: boolean;
    allowStaleOnFetchRejection?: boolean;
    ignoreFetchAbort?: boolean;
    noDeleteOnFetchRejection?: boolean;
    noDeleteOnStaleGet?: boolean;
    noDisposeOnSet?: boolean;
    noUpdateTTL?: boolean;
    size?: number;
    sizeCalculation?: SizeCalculator<K, V>;
    status?: Status<V>;
    ttl?: number;
    updateAgeOnGet?: boolean;
}
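
For illustration, here is a minimal sketch (not from the library docs; the key name and values are placeholders, and an ES module context is assumed for top-level await) of passing these options to LRUCache#fetch and adjusting one of them inside the fetchMethod:

import { LRUCache } from 'lru-cache'

const cache = new LRUCache<string, string>({
  max: 100,
  ttl: 1000,
  fetchMethod: async (key, staleValue, { options }) => {
    // options is this FetcherFetchOptions object; SetOptions fields such as
    // ttl may be adjusted here and will apply when the result is cached.
    options.ttl = 60_000
    return `value for ${key}` // placeholder value
  },
})

// GetOptions fields like allowStale are resolved before fetchMethod runs;
// SetOptions fields apply when the resolved value is stored.
const value = await cache.fetch('some-key', { allowStale: true })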

Type Parameters

  • K
  • V
  • FC = unknown

Hierarchy

  • Pick<OptionsBase<K, V, FC>,
        | "allowStale"
        | "updateAgeOnGet"
        | "noDeleteOnStaleGet"
        | "sizeCalculation"
        | "ttl"
        | "noDisposeOnSet"
        | "noUpdateTTL"
        | "noDeleteOnFetchRejection"
        | "allowStaleOnFetchRejection"
        | "ignoreFetchAbort"
        | "allowStaleOnFetchAbort">

Properties

allowStale?: boolean

Allow LRUCache#get and LRUCache#fetch calls to return stale data, if available.

By default, if you set ttl, stale items are only deleted from the cache when you get(key). That is, the cache does not preemptively prune items unless OptionsBase.ttlAutopurge is set.

If you set allowStale:true, get() will return the stale value and then delete it. If you don't set this, it will return undefined when you try to get a stale entry.

Note that when a stale entry is fetched, even if it is returned due to allowStale being set, it is removed from the cache immediately. You can suppress this behavior by setting OptionsBase.noDeleteOnStaleGet, either in the constructor, or in the options provided to LRUCache#get.

This may be overridden by passing an options object to cache.get(). The cache.has() method will always return false for stale items.

Only relevant if a ttl is set.
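
A rough sketch (the key and values are placeholders, and node:timers/promises is used only to let the entry go stale):

import { LRUCache } from 'lru-cache'
import { setTimeout as sleep } from 'node:timers/promises'

const cache = new LRUCache<string, number>({ max: 10, ttl: 100 })

cache.set('n', 1)
await sleep(150)                      // entry is now stale
cache.get('n')                        // undefined, and the stale entry is deleted

cache.set('n', 2)
await sleep(150)
cache.get('n', { allowStale: true })  // 2: stale value returned, then deleted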

allowStaleOnFetchAbort?: boolean

Set to true to return a stale value from the cache when the AbortSignal passed to the OptionsBase.fetchMethod dispatches an 'abort' event, whether user-triggered, or due to internal cache behavior.

Unless OptionsBase.ignoreFetchAbort is also set, the underlying OptionsBase.fetchMethod will still be considered canceled, and any value it returns will be ignored and not cached.

Caveat: since fetches are aborted when a new value is explicitly set in the cache, this can lead to fetch returning a stale value, since that was the fallback value at the moment the fetch() was initiated, even though the new updated value is now present in the cache.

For example:

const cache = new LRUCache<string, any>({
  ttl: 100,
  fetchMethod: async (url, oldValue, { signal }) => {
    const res = await fetch(url, { signal })
    return await res.json()
  },
})
cache.set('https://example.com/', { some: 'data' })
// 100ms go by...
const result = cache.fetch('https://example.com/')
cache.set('https://example.com/', { other: 'thing' })
console.log(await result) // { some: 'data' }
console.log(cache.get('https://example.com/')) // { other: 'thing' }

allowStaleOnFetchRejection?: boolean

Set to true to allow returning stale data when an OptionsBase.fetchMethod throws an error or returns a rejected promise.

This differs from using OptionsBase.allowStale in that stale data will ONLY be returned when the LRUCache#fetch fails, not at any other time.

If a fetchMethod fails and there is no stale value available, the fetch() will resolve to undefined. That is, all fetchMethod errors are suppressed.

Implies noDeleteOnFetchRejection.

This may be set in calls to fetch(), or defaulted on the constructor, or overridden by modifying the options object in the fetchMethod.
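
A sketch of the failure path (the flaky fetchMethod, key, and values are invented for illustration, and node:timers/promises just simulates time passing):

import { LRUCache } from 'lru-cache'
import { setTimeout as sleep } from 'node:timers/promises'

let failNext = false
const cache = new LRUCache<string, string>({
  max: 10,
  ttl: 100,
  allowStaleOnFetchRejection: true,
  fetchMethod: async key => {
    if (failNext) throw new Error('backend unavailable')
    return `fresh:${key}`
  },
})

await cache.fetch('a')  // 'fresh:a', fetched and cached
await sleep(150)        // cached entry goes stale
failNext = true
await cache.fetch('a')  // 'fresh:a' again: stale value returned, rejection suppressed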

ignoreFetchAbort?: boolean

Set to true to ignore the abort event emitted by the AbortSignal object passed to OptionsBase.fetchMethod, and still cache the resulting resolution value, as long as it is not undefined.

When used on its own, this means aborted LRUCache#fetch calls are not immediately resolved or rejected when they are aborted, and instead take the full time to await.

When used with OptionsBase.allowStaleOnFetchAbort, aborted LRUCache#fetch calls will resolve immediately to their stale cached value or undefined, and will continue to process and eventually update the cache when they resolve, as long as the resulting value is not undefined. This supports a "return stale on timeout while refreshing" mechanism by passing AbortSignal.timeout(n) as the signal.

For example:

const c = new LRUCache({
  ttl: 100,
  ignoreFetchAbort: true,
  allowStaleOnFetchAbort: true,
  fetchMethod: async (key, oldValue, { signal }) => {
    // note: do NOT pass the signal to fetch()!
    // let's say this fetch can take a long time.
    const res = await fetch(`https://slow-backend-server/${key}`)
    return await res.json()
  },
})

// this will return the stale value after 100ms, while still
// updating in the background for next time.
const val = await c.fetch('key', { signal: AbortSignal.timeout(100) })

Note: regardless of this setting, an abort event is still emitted on the AbortSignal object, so it may produce invalid results when passed to other underlying APIs that use AbortSignals.

This may be overridden in the OptionsBase.fetchMethod or the call to LRUCache#fetch.

noDeleteOnFetchRejection?: boolean

Set to true to suppress the deletion of stale data when the OptionsBase.fetchMethod returns a rejected promise.
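
A minimal sketch, assuming a fetchMethod that always fails (the key, value, and error are placeholders):

import { LRUCache } from 'lru-cache'
import { setTimeout as sleep } from 'node:timers/promises'

const cache = new LRUCache<string, string>({
  max: 10,
  ttl: 100,
  noDeleteOnFetchRejection: true,
  fetchMethod: async () => {
    throw new Error('backend down')
  },
})

cache.set('a', 'old')
await sleep(150)                        // entry is now stale
await cache.fetch('a').catch(() => {})  // fetchMethod rejects; fetch() still rejects here
cache.get('a', { allowStale: true })    // 'old': the stale entry was not deleted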

noDeleteOnStaleGet?: boolean

Do not delete stale items when they are retrieved with LRUCache#get.

Note that the get return value will still be undefined unless OptionsBase.allowStale is true.

When using time-expiring entries with ttl, by default stale items will be removed from the cache when the key is accessed with cache.get().

Setting this option will cause stale items to remain in the cache, until they are explicitly deleted with cache.delete(key), or retrieved with noDeleteOnStaleGet set to false.

This may be overridden by passing an options object to cache.get().

Only relevant if a ttl is used.
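
For example, a sketch with placeholder key/value (node:timers/promises only simulates the passage of time):

import { LRUCache } from 'lru-cache'
import { setTimeout as sleep } from 'node:timers/promises'

const cache = new LRUCache<string, string>({
  max: 10,
  ttl: 100,
  noDeleteOnStaleGet: true,
})

cache.set('k', 'v')
await sleep(150)                      // entry is now stale

cache.get('k')                        // undefined (allowStale not set)...
cache.get('k', { allowStale: true })  // ...but 'v' is still there, not deleted
cache.delete('k')                     // remove it explicitly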

noDisposeOnSet?: boolean

Set to true to suppress calling the OptionsBase.dispose function if the entry key is still accessible within the cache.

This may be overridden by passing an options object to LRUCache#set.

Only relevant if dispose or disposeAfter are set.
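
A brief sketch (the dispose bookkeeping and key/values are invented for illustration):

import { LRUCache } from 'lru-cache'

const disposed: string[] = []
const cache = new LRUCache<string, string>({
  max: 10,
  dispose: (value, key) => {
    disposed.push(`${key}=${value}`)
  },
})

cache.set('k', 'one')
cache.set('k', 'two')                             // dispose('one', 'k', 'set') is called
cache.set('k', 'three', { noDisposeOnSet: true }) // replaced without calling dispose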

noUpdateTTL?: boolean

Boolean flag to tell the cache to not update the TTL when setting a new value for an existing key (i.e., when updating a value rather than inserting a new value). Note that the TTL value is always set (if provided) when adding a new entry into the cache.

Has no effect if OptionsBase.ttl is not set.

May be passed as an option to LRUCache#set.
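
For example (key and values are placeholders):

import { LRUCache } from 'lru-cache'

const cache = new LRUCache<string, number>({ max: 10, ttl: 1000 })

cache.set('k', 1)                        // TTL clock starts now
cache.set('k', 2, { noUpdateTTL: true }) // value replaced, original TTL left running
cache.set('k', 3)                        // value replaced, TTL reset to a fresh 1000ms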

size?: number
sizeCalculation?: SizeCalculator<K, V>

A function that returns a number indicating the item's size.

Requires OptionsBase.maxSize to be set.

If not provided, and OptionsBase.maxSize or OptionsBase.maxEntrySize are set, then all LRUCache#set calls must provide an explicit SetOptions.size or sizeCalculation param.
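
A minimal sketch (string-length sizing and the keys are arbitrary choices for illustration):

import { LRUCache } from 'lru-cache'

const cache = new LRUCache<string, string>({
  maxSize: 1000,
  // used whenever a set() or fetch() does not provide an explicit size
  sizeCalculation: value => value.length,
})

cache.set('a', 'x'.repeat(100))                // size 100, via sizeCalculation
cache.set('b', 'y'.repeat(100), { size: 250 }) // explicit size overrides the calculation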

status?: Status<V>
ttl?: number

Max time in milliseconds for items to live in cache before they are considered stale. Note that stale items are NOT preemptively removed by default, and MAY live in the cache, contributing to its LRU max, long after they have expired, unless OptionsBase.ttlAutopurge is set.

If set to 0 (the default value), then that means "do not track TTL", not "expire immediately".

Also, as this cache is optimized for LRU/MRU operations, some of the staleness/TTL checks will reduce performance, as they will incur overhead by deleting items.

This is not primarily a TTL cache, and does not make strong TTL guarantees. There is no preemptive pruning of expired items, but you may set a TTL on the cache, and it will treat expired items as missing when they are fetched, and delete them.

Optional, but must be a non-negative integer in ms if specified.

This may be overridden by passing an options object to cache.set().

At least one of max, maxSize, or ttl is required.

Even if ttl tracking is enabled, it is strongly recommended to set a max to prevent unbounded growth of the cache.

If ttl tracking is enabled, and max and maxSize are not set, and ttlAutopurge is not set, then a warning will be emitted cautioning about the potential for unbounded memory consumption. (The TypeScript definitions will also discourage this.)
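
A sketch of the default TTL and a per-entry override (key names and values are placeholders):

import { LRUCache } from 'lru-cache'

const cache = new LRUCache<string, string>({
  max: 100,
  ttl: 60_000,  // default: entries become stale after one minute
})

cache.set('session', 'abc123')                       // uses the 60s default
cache.set('flash-message', 'saved!', { ttl: 5_000 }) // per-entry override: 5 seconds
cache.getRemainingTTL('flash-message')               // roughly 5000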

updateAgeOnGet?: boolean

When using time-expiring entries with ttl, setting this to true will make each item's age reset to 0 whenever it is retrieved from cache with LRUCache#get, causing it to not expire. (It can still fall out of cache based on recency of use, of course.)

Has no effect if OptionsBase.ttl is not set.

This may be overridden by passing an options object to cache.get().
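
A rough sketch (placeholder key/value; node:timers/promises is used only to simulate elapsed time):

import { LRUCache } from 'lru-cache'
import { setTimeout as sleep } from 'node:timers/promises'

const cache = new LRUCache<string, string>({
  max: 10,
  ttl: 100,
  updateAgeOnGet: true,
})

cache.set('k', 'v')
await sleep(60)
cache.get('k')   // hit; the entry's age resets to 0
await sleep(60)
cache.get('k')   // still a hit: 120ms of wall time, but never more than ~60ms since last get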