path-scurry

    Class ChildrenCache (Internal)

    An LRUCache for storing child entries.

    Hierarchy

        LRUCache<PathBase, Children>
            ↳ ChildrenCache


    Constructors

    Properties

    "[toStringTag]": string

    A String value that is used in the creation of the default string description of an object. Called by the built-in method Object.prototype.toString.

    allowStale: boolean

    LRUCache.OptionsBase.allowStale

    allowStaleOnFetchAbort: boolean

    LRUCache.OptionsBase.allowStaleOnFetchAbort

    allowStaleOnFetchRejection: boolean

    LRUCache.OptionsBase.allowStaleOnFetchRejection

    ignoreFetchAbort: boolean

    LRUCache.OptionsBase.ignoreFetchAbort

    maxEntrySize: number

    LRUCache.OptionsBase.maxEntrySize

    noDeleteOnFetchRejection: boolean

    LRUCache.OptionsBase.noDeleteOnFetchRejection

    noDeleteOnStaleGet: boolean

    LRUCache.OptionsBase.noDeleteOnStaleGet

    noDisposeOnSet: boolean

    LRUCache.OptionsBase.noDisposeOnSet

    noUpdateTTL: boolean

    LRUCache.OptionsBase.noUpdateTTL

    sizeCalculation?: SizeCalculator<PathBase, Children>

    LRUCache.OptionsBase.sizeCalculation

    ttl: number

    LRUCache.OptionsBase.ttl

    ttlAutopurge: boolean

    LRUCache.OptionsBase.ttlAutopurge

    ttlResolution: number

    LRUCache.OptionsBase.ttlResolution

    updateAgeOnGet: boolean

    LRUCache.OptionsBase.updateAgeOnGet

    updateAgeOnHas: boolean

    LRUCache.OptionsBase.updateAgeOnHas

    Accessors

    • get calculatedSize(): number

      The total computed size of items in the cache (read-only)

      Returns number

    • get dispose(): Disposer<K, V> | undefined

      LRUCache.OptionsBase.dispose (read-only)

      Returns Disposer<K, V> | undefined

    • get disposeAfter(): Disposer<K, V> | undefined

      LRUCache.OptionsBase.disposeAfter (read-only)

      Returns Disposer<K, V> | undefined

    • get fetchMethod(): Fetcher<K, V, FC> | undefined

      LRUCache.OptionsBase.fetchMethod (read-only)

      Returns Fetcher<K, V, FC> | undefined

    • get max(): number

      LRUCache.OptionsBase.max (read-only)

      Returns number

    • get maxSize(): number

      LRUCache.OptionsBase.maxSize (read-only)

      Returns number

    • get memoMethod(): Memoizer<K, V, FC> | undefined

      Returns Memoizer<K, V, FC> | undefined

    • get onInsert(): Inserter<K, V> | undefined

      LRUCache.OptionsBase.onInsert (read-only)

      Returns Inserter<K, V> | undefined

    • get perf(): Perf

      LRUCache.OptionsBase.perf

      Returns Perf

    • get size(): number

      The number of items stored in the cache (read-only)

      Returns number

    Methods

    • [Symbol.iterator] — Iterating over the cache itself yields the same results as LRUCache.entries.

      Returns Generator<[PathBase, Children], void, unknown>

    • clear — Clear the cache entirely, throwing away all values.

      Returns void

    • delete — Deletes a key out of the cache.

      Returns true if the key was deleted, false otherwise.

      Parameters

      Returns boolean

    • dump — Return an array of [key, LRUCache.Entry] tuples which can be passed to LRUCache#load.

      The start fields are calculated relative to a portable Date.now() timestamp, even if performance.now() is available.

      Stale entries are always included in the dump, even if LRUCache.OptionsBase.allowStale is false.

      Note: this returns an actual array, not a generator, so it can be more easily passed around.

      Returns [PathBase, Entry<Children>][]

    • entries — Return a generator yielding [key, value] pairs, in order from most recently used to least recently used.

      Returns Generator<[PathBase, Children], void, unknown>
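      The recency ordering of entries() can be seen in a short sketch. This is a hedged example against the lru-cache package API (from which these methods are inherited), not path-scurry itself; the keys 'a', 'b', 'c' are illustrative:

      ```typescript
      import { LRUCache } from 'lru-cache'

      const cache = new LRUCache<string, number>({ max: 10 })
      cache.set('a', 1)
      cache.set('b', 2)
      cache.set('c', 3)
      cache.get('a') // touching 'a' makes it the most recently used

      // entries() yields most recently used first
      const order = [...cache.entries()].map(([k]) => k)
      // order is ['a', 'c', 'b']
      ```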

    • fetch — Make an asynchronous cached fetch using the LRUCache.OptionsBase.fetchMethod function.

      If the value is in the cache and not stale, then the returned Promise resolves to the value.

      If not in the cache, or beyond its TTL staleness, then fetchMethod(key, staleValue, { options, signal, context }) is called, and the value returned will be added to the cache once resolved.

      If called with allowStale, and an asynchronous fetch is currently in progress to reload a stale value, then the former stale value will be returned.

      If called with forceRefresh, then the cached item will be re-fetched, even if it is not stale. However, if allowStale is also set, then the old value will still be returned. This is useful in cases where you want to force a reload of a cached value. If a background fetch is already in progress, then forceRefresh has no effect.

      If multiple fetches for the same key are issued, then they will all be coalesced into a single call to fetchMethod.

      Note that this means that handling options such as LRUCache.OptionsBase.allowStaleOnFetchAbort, LRUCache.FetchOptions.signal, and LRUCache.OptionsBase.allowStaleOnFetchRejection will be determined by the FIRST fetch() call for a given key.

      This is a known (fixable) shortcoming which will be addressed when someone complains about it, as the fix would involve added complexity and may not be worth the cost for this edge case.

      If LRUCache.OptionsBase.fetchMethod is not specified, then this is effectively an alias for Promise.resolve(cache.get(key)).

      When the fetch method resolves to a value, if the fetch has not been aborted due to deletion, eviction, or being overwritten, then it is added to the cache using the options provided.

      If the key is evicted or deleted before the fetchMethod resolves, then the AbortSignal passed to the fetchMethod will receive an abort event, and the promise returned by fetch() will reject with the reason for the abort.

      If a signal is passed to the fetch() call, then aborting the signal will abort the fetch and cause the fetch() promise to reject with the reason provided.

      Setting context

      If an FC type is set to a type other than unknown, void, or undefined in the LRUCache constructor, then all calls to cache.fetch() must provide a context option. If set to undefined or void, then calls to fetch must not provide a context option.

      The context param allows you to provide arbitrary data that might be relevant in the course of fetching the data. It is only relevant for the course of a single fetch() operation, and discarded afterwards.

      Note: fetch() calls are inflight-unique

      If you call fetch() multiple times with the same key value, then every call after the first will resolve on the same promise¹, even if they have different settings that would otherwise change the behavior of the fetch, such as noDeleteOnFetchRejection or ignoreFetchAbort.

      In most cases, this is not a problem (in fact, only fetching something once is what you probably want, if you're caching in the first place). If you are changing the fetch() options dramatically between runs, there's a good chance that you might be trying to fit divergent semantics into a single object, and would be better off with multiple cache instances.

      ¹ That is, they're not the "same Promise", but they resolve at the same time, because they're both waiting on the same underlying fetchMethod response.

      Parameters

      Returns Promise<Children | undefined>
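      The coalescing behavior described above can be sketched as follows. This assumes the lru-cache package API; the key 'x' and the fetcher body are illustrative:

      ```typescript
      import { LRUCache } from 'lru-cache'

      let calls = 0
      const cache = new LRUCache<string, string>({
        max: 100,
        // fetchMethod receives (key, staleValue, { options, signal, context });
        // a real fetcher should stop work if signal.aborted becomes true.
        fetchMethod: async (key, _stale, { signal }) => {
          calls++
          return `value-for-${key}`
        },
      })

      // Concurrent fetch() calls for the same key coalesce into a single
      // fetchMethod invocation; both promises resolve to the same value.
      const [a, b] = await Promise.all([cache.fetch('x'), cache.fetch('x')])
      // a === 'value-for-x', b === 'value-for-x', calls === 1
      ```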


    • find — Find a value for which the supplied fn method returns a truthy value, similar to Array.find(). fn is called as fn(value, key, cache).

      Parameters

      Returns Children | undefined

    • forceFetch — In some cases, cache.fetch() may resolve to undefined, either because a LRUCache.OptionsBase#fetchMethod was not provided (turning cache.fetch(k) into just an async wrapper around cache.get(k)) or because ignoreFetchAbort was specified (either to the constructor or in the LRUCache.FetchOptions). Also, the LRUCache.OptionsBase.fetchMethod may return undefined or void, making the test even more complicated.

      Because inferring the cases where undefined might be returned is so cumbersome, but testing for undefined can also be annoying, this method can be used instead; it will reject if this.fetch() resolves to undefined.

      Parameters

      Returns Promise<Children>


    • forEach — Call the supplied function on each item in the cache, in order from most recently used to least recently used.

      fn is called as fn(value, key, cache).

      If thisp is provided, the function will be called in the this-context of the provided object, or the cache if no thisp object is provided.

      Does not update age or recency of use, and does not iterate over stale values.

      Parameters

      Returns void

    • get — Return a value from the cache. Will update the recency of the cache entry found.

      If the key is not found, get() will return undefined.

      Parameters

      Returns Children | undefined

    • getRemainingTTL — Return the number of ms left in the item's TTL. If the item is not in the cache, returns 0. Returns Infinity if the item is in the cache without a defined TTL.

      Parameters

      Returns number

    • has — Check if a key is in the cache, without updating the recency of use. Item age is updated only if LRUCache.OptionsBase.updateAgeOnHas is set to true, in either the options or the constructor.

      Will return false if the item is stale, even though it is technically in the cache. The difference can be determined (if it matters) by using a status argument and inspecting the has field.

      Parameters

      Returns boolean

    • info — Get the extended info about a given entry, to get its value, size, and TTL info simultaneously. Returns undefined if the key is not present.

      Unlike LRUCache#dump, which is designed to be portable and survive serialization, the start value is always the current timestamp, and the ttl is a calculated remaining time to live (negative if expired).

      Always returns stale values, if their info is found in the cache, so be sure to check for expirations (ie, a negative LRUCache.Entry#ttl) if relevant.

      Parameters

      Returns Entry<Children> | undefined

    • keys — Return a generator yielding the keys in the cache, in order from most recently used to least recently used.

      Returns Generator<PathBase, void, unknown>

    • load — Reset the cache and load in the items in entries in the order listed.

      The shape of the resulting cache may be different if the same options are not used in both caches.

      The start fields are assumed to be calculated relative to a portable Date.now() timestamp, even if performance.now() is available.

      Parameters

      Returns void
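      A dump/load round trip can be sketched as follows. This assumes the lru-cache package API; the keys and values are illustrative:

      ```typescript
      import { LRUCache } from 'lru-cache'

      const src = new LRUCache<string, number>({ max: 5 })
      src.set('one', 1)
      src.set('two', 2)

      // dump() returns a plain array of [key, Entry] tuples with portable
      // Date.now()-relative timestamps, which load() replays into a fresh cache.
      const snapshot = src.dump()
      const copy = new LRUCache<string, number>({ max: 5 })
      copy.load(snapshot)
      // copy.get('one') === 1, copy.get('two') === 2
      ```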

    • memo — If the key is found in the cache, then this is equivalent to LRUCache#get. If not in the cache, then calculate the value using the LRUCache.OptionsBase.memoMethod, and add it to the cache.

      If an FC type is set to a type other than unknown, void, or undefined in the LRUCache constructor, then all calls to cache.memo() must provide a context option. If set to undefined or void, then calls to memo must not provide a context option.

      The context param allows you to provide arbitrary data that might be relevant in the course of fetching the data. It is only relevant for the course of a single memo() operation, and discarded afterwards.

      Parameters

      Returns Children
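      A minimal memo() sketch, assuming the lru-cache package API; the memoMethod body (key length) is purely illustrative:

      ```typescript
      import { LRUCache } from 'lru-cache'

      let computed = 0
      const cache = new LRUCache<string, number>({
        max: 10,
        // memoMethod is only invoked on a cache miss
        memoMethod: (key) => {
          computed++
          return key.length
        },
      })

      const first = cache.memo('hello')  // miss: computes and caches 5
      const second = cache.memo('hello') // hit: memoMethod not called again
      // first === 5, second === 5, computed === 1
      ```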


    • peek — Like LRUCache#get but doesn't update recency or delete stale items.

      Returns undefined if the item is stale, unless LRUCache.OptionsBase.allowStale is set.

      Parameters

      Returns Children | undefined
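      The difference between peek() and get() shows up in eviction order. A sketch, assuming the lru-cache package API with illustrative keys:

      ```typescript
      import { LRUCache } from 'lru-cache'

      const cache = new LRUCache<string, number>({ max: 2 })
      cache.set('a', 1)
      cache.set('b', 2)

      // peek() reads the value without refreshing recency, so 'a' remains the
      // least recently used entry and is evicted when 'c' arrives. A get('a')
      // here would have promoted 'a' and evicted 'b' instead.
      cache.peek('a')
      cache.set('c', 3)
      // cache.has('a') === false, cache.has('b') === true
      ```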

    • pop — Evict the least recently used item, returning its value, or undefined if the cache is empty.

      Returns Children | undefined

    • purgeStale — Delete any stale entries. Returns true if anything was removed, false otherwise.

      Returns boolean

    • rentries — Inverse order version of LRUCache.entries.

      Return a generator yielding [key, value] pairs, in order from least recently used to most recently used.

      Returns Generator<(PathBase | Children)[], void, unknown>

    • rforEach — The same as LRUCache.forEach, but items are iterated over in reverse order (i.e., least recently used items are iterated over first).

      Parameters

      Returns void

    • rkeys — Inverse order version of LRUCache.keys.

      Return a generator yielding the keys in the cache, in order from least recently used to most recently used.

      Returns Generator<PathBase, void, unknown>

    • rvalues — Inverse order version of LRUCache.values.

      Return a generator yielding the values in the cache, in order from least recently used to most recently used.

      Returns Generator<Children | undefined, void, unknown>

    • set — Add a value to the cache.

      Note: if undefined is specified as a value, this is an alias for cache.delete(key); undefined is never stored in the cache.

      Fields on the LRUCache.SetOptions options param will override their corresponding values in the constructor options for the scope of this single set() operation.

      If start is provided, that sets the effective start time for the TTL calculation. Note that this must be a previous value of performance.now() if supported, or a previous value of Date.now() if not.

      The options object may also include size, which will prevent calling the sizeCalculation function and just use the specified number if it is a positive integer, and noDisposeOnSet, which will prevent calling a dispose function in the case of overwrites.

      If the size (or return value of sizeCalculation) for a given entry is greater than maxEntrySize, then the item will not be added to the cache.

      Will update the recency of the entry.

      Parameters

      Returns this
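      Per-call SetOptions overriding the constructor defaults can be sketched as follows, assuming the lru-cache package API; the keys and ttl values are illustrative:

      ```typescript
      import { LRUCache } from 'lru-cache'

      const cache = new LRUCache<string, string>({ max: 10, ttl: 60_000 })

      // The per-call ttl overrides the 60s default for this one entry.
      cache.set('short', 'lived', { ttl: 1_000 })
      cache.set('long', 'lived') // inherits the 60s default ttl

      // getRemainingTTL() reports milliseconds left (0 if the key is absent).
      const left = cache.getRemainingTTL('short')
      // 0 < left <= 1000
      ```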

    • values — Return a generator yielding the values in the cache, in order from most recently used to least recently used.

      Returns Generator<Children, void, unknown>

    • unsafeExposeInternals (Internal)

      Do not call this method unless you need to inspect the inner workings of the cache. If anything returned by this object is modified in any way, strange breakage may occur.

      These fields are private for a reason!

      Type Parameters

      • K extends {}
      • V extends {}
      • FC extends unknown = unknown

      Parameters

      • c: LRUCache<K, V, FC>

      Returns {
          autopurgeTimers: (Timeout | undefined)[] | undefined;
          backgroundFetch: (
              k: K,
              index: number | undefined,
              options: FetchOptions<K, V, FC>,
              context: any,
          ) => BackgroundFetch<V>;
          free: StackLike;
          head: Index;
          indexes: (
              options?: { allowStale: boolean },
          ) => Generator<Index, void, unknown>;
          isBackgroundFetch: (p: any) => p is BackgroundFetch<V>;
          isStale: (index: number | undefined) => boolean;
          keyList: (K | undefined)[];
          keyMap: Map<K, number>;
          moveToTail: (index: number) => void;
          next: NumberArray;
          prev: NumberArray;
          rindexes: (
              options?: { allowStale: boolean },
          ) => Generator<Index, void, unknown>;
          sizes: ZeroArray | undefined;
          starts: ZeroArray | undefined;
          tail: Index;
          ttls: ZeroArray | undefined;
          valList: (V | BackgroundFetch<V> | undefined)[];
      }