Interface ValueCache<K,V extends @Nullable Object>

Type Parameters:
K - the type of cache keys
V - the type of cache values
All Known Implementing Classes:
NoOpValueCache

@PublicSpi @NullMarked public interface ValueCache<K,V extends @Nullable Object>
The ValueCache is used by data loaders that use caching and want a long-lived or external cache of values. It is the place where values are cached when they come back from an async cache store.

It differs from CacheMap, which is in fact a cache of promised values, aka CompletableFuture<V>s.

ValueCache is better suited to wrapping long-lived or externally cached values. CompletableFutures can't easily be placed in an external cache outside the JVM, say, hence the need for the ValueCache.

DataLoaders use a two-stage cache strategy if caching is enabled. If the CacheMap already has the promise to a value, that is used. If not, the ValueCache is asked for a value; if it has one, that is returned (and cached as a promise in the CacheMap).

If there is no value, the key is queued and loaded via the BatchLoader calls. The returned values are then stored in the ValueCache, and the promises to those values are also stored in the CacheMap.
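
The flow can be pictured with a small sketch. All of the names below (promiseCache, valueCache, batchLoad) are illustrative stand-ins rather than the library's actual internals; the sketch only mirrors the sequence just described, and it assumes (per get(Object) below) that a ValueCache miss is signalled by an exceptionally completed future.

    import java.util.Map;
    import java.util.concurrent.CompletableFuture;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.function.Function;

    import org.dataloader.ValueCache;

    // Illustrative only - mirrors the two-stage sequence described above,
    // not the library's actual implementation.
    class TwoStageLookupSketch<K, V> {

        private final Map<K, CompletableFuture<V>> promiseCache = new ConcurrentHashMap<>(); // stands in for CacheMap
        private final ValueCache<K, V> valueCache;
        private final Function<K, CompletableFuture<V>> batchLoad; // stands in for a BatchLoader call

        TwoStageLookupSketch(ValueCache<K, V> valueCache, Function<K, CompletableFuture<V>> batchLoad) {
            this.valueCache = valueCache;
            this.batchLoad = batchLoad;
        }

        CompletableFuture<V> load(K key) {
            // Stage 1: is there already a cached promise for this key?
            CompletableFuture<V> existing = promiseCache.get(key);
            if (existing != null) {
                return existing;
            }
            // Stage 2: ask the ValueCache; a miss completes exceptionally, which falls back
            // to batch loading, and the loaded value is then written back into the ValueCache.
            CompletableFuture<V> promise = valueCache.get(key)
                    .exceptionallyCompose(miss -> batchLoad.apply(key)
                            .thenCompose(value -> valueCache.set(key, value)));
            // The promise is cached here; the raw value lives in the ValueCache.
            promiseCache.put(key, promise);
            return promise;
        }
    }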

The default implementation is a no-op store which always replies that the key is missing and doesn't store any actual results. This is to avoid duplicating the stored data between the CacheMap and the ValueCache out of the box.

The API signature uses CompletableFutures because the backing implementation MAY be a remote external cache, and hence exceptions may occur while retrieving values, and the calls may take time to complete.
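
As a concrete (if naive) example, a minimal in-memory implementation might look like the sketch below. It assumes the org.dataloader package layout; values are wrapped in Optional so that null remains a cacheable value, and a missing key is reported as an exceptionally completed future as get(Object) requires. A real implementation would more typically delegate to an external store such as Redis or Memcached.

    import java.util.Map;
    import java.util.Optional;
    import java.util.concurrent.CompletableFuture;
    import java.util.concurrent.ConcurrentHashMap;

    import org.dataloader.ValueCache;

    public class InMemoryValueCache<K, V> implements ValueCache<K, V> {

        private final Map<K, Optional<V>> store = new ConcurrentHashMap<>();

        @Override
        public CompletableFuture<V> get(K key) {
            Optional<V> cached = store.get(key);
            if (cached == null) {
                // a missing key MUST complete exceptionally - null is a legitimate cached value
                return CompletableFuture.failedFuture(new RuntimeException("No cached value for key " + key));
            }
            return CompletableFuture.completedFuture(cached.orElse(null));
        }

        @Override
        public CompletableFuture<V> set(K key, V value) {
            store.put(key, Optional.ofNullable(value));
            return CompletableFuture.completedFuture(value);
        }

        @Override
        public CompletableFuture<Void> delete(K key) {
            store.remove(key);
            return CompletableFuture.completedFuture(null);
        }

        @Override
        public CompletableFuture<Void> clear() {
            store.clear();
            return CompletableFuture.completedFuture(null);
        }
    }

Only get, set, delete and clear need to be implemented; the batch variants getValues and setValues are default methods that delegate back to them unless overridden.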

  • Method Details

    • defaultValueCache

      static <K, V> ValueCache<K,V> defaultValueCache()
      Creates a new value cache, using the default no-op implementation (a wiring sketch for supplying your own implementation follows this method's details).
      Type Parameters:
      K - the type of cache keys
      V - the type of cache values
      Returns:
      the cache store
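
      Wiring a custom value cache into a DataLoader is typically done through the data loader options. The sketch below is assumption-laden: it presumes the DataLoaderOptions.setValueCache(...) setter and the DataLoaderFactory.newDataLoader(...) factory from java-dataloader (check your version for the exact wiring), and it reuses the hypothetical InMemoryValueCache sketched in the interface description above.

      import java.util.concurrent.CompletableFuture;

      import org.dataloader.BatchLoader;
      import org.dataloader.DataLoader;
      import org.dataloader.DataLoaderFactory;
      import org.dataloader.DataLoaderOptions;

      class ValueCacheWiringExample {

          // hypothetical batch loader - stands in for a real database call
          static final BatchLoader<Long, String> userBatchLoader =
                  keys -> CompletableFuture.completedFuture(
                          keys.stream().map(id -> "user-" + id).toList());

          static DataLoader<Long, String> buildLoader() {
              DataLoaderOptions options = DataLoaderOptions.newOptions()
                      .setValueCache(new InMemoryValueCache<Long, String>()); // an external, e.g. Redis-backed, cache in practice
              return DataLoaderFactory.newDataLoader(userBatchLoader, options);
          }
      }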
    • get

      CompletableFuture<V> get(K key)
      Gets the specified key from the value cache. If the key is not present, then the implementation MUST return an exceptionally completed future and not null because null is a valid cacheable value. An exceptionally completed future will cause DataLoader to load the key via batch loading instead.

      Parameters:
      key - the key to retrieve
      Returns:
      a future containing the cached value (which may be null) or an exceptionally completed future if the key does not exist in the cache.
    • getValues

      default CompletableFuture<List<Try<V>>> getValues(List<K> keys) throws ValueCache.ValueCachingNotSupported
      Gets the specified keys from the value cache, in a batch call. If your underlying cache cannot do batch retrieval then do not implement this method, and it will delegate back to get(Object) for you. (A batch-capable override is sketched after this method's details.)

      Each item in the returned list of values is a Try. If the key could not be found then a failed Try will be returned; otherwise a successful Try containing the cached value is returned.

      You MUST return a List that is the same size as the keys passed in. The code will assert if you do not.

      If your cache does not have anything in it at all, and you want to quickly short-circuit this method and avoid any object allocation then throw ValueCache.ValueCachingNotSupported and the code will know there is nothing in cache at this time.

      Parameters:
      keys - the list of keys to get cached values for.
      Returns:
      a future containing a list of Try cached values for each key passed in.
      Throws:
      ValueCache.ValueCachingNotSupported - if this cache wants to short-circuit this method completely
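
      For a cache whose backing store supports a real batch read (for example a Redis MGET), the default could be overridden along the lines of the sketch below. It is an assumption-laden sketch: batchFetch(...) is a hypothetical helper, not part of the ValueCache API, and it assumes org.dataloader.Try exposes succeeded(...) and failed(...) factories. Every key gets exactly one Try in the returned list, in order.

      import java.util.ArrayList;
      import java.util.List;
      import java.util.Map;
      import java.util.concurrent.CompletableFuture;

      import org.dataloader.Try;
      import org.dataloader.ValueCache;

      // sketch only - a ValueCache whose backing store supports a bulk read
      public abstract class BatchReadValueCache<K, V> implements ValueCache<K, V> {

          // hypothetical bulk read; the returned map contains only the keys that were found
          protected abstract CompletableFuture<Map<K, V>> batchFetch(List<K> keys);

          @Override
          public CompletableFuture<List<Try<V>>> getValues(List<K> keys) {
              return batchFetch(keys).thenApply(found -> {
                  List<Try<V>> results = new ArrayList<>(keys.size());
                  for (K key : keys) {
                      if (found.containsKey(key)) {
                          results.add(Try.succeeded(found.get(key)));   // cache hit
                      } else {
                          results.add(Try.failed(                       // cache miss for this key
                                  new RuntimeException("No cached value for key " + key)));
                      }
                  }
                  return results; // must be exactly the same size as the keys list
              });
          }
      }

      A cache that never holds anything could instead throw ValueCache.ValueCachingNotSupported, as described above, and skip this work entirely.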
    • set

      CompletableFuture<V> set(K key, V value)
      Stores the value with the specified key, or updates it if the key already exists.
      Parameters:
      key - the key to store
      value - the value to store
      Returns:
      a future containing the stored value for fluent composition
    • setValues

      default CompletableFuture<List<V>> setValues(List<K> keys, List<V> values) throws ValueCache.ValueCachingNotSupported
      Stores the values with the specified keys, or updates them if the keys already exist. If your underlying cache can't do batch cache setting then do not implement this method, and it will delegate back to set(Object, Object) for you. (A batch-capable override is sketched after this method's details.)
      Parameters:
      keys - the keys to store
      values - the values to store
      Returns:
      a future containing the stored values for fluent composition
      Throws:
      ValueCache.ValueCachingNotSupported - if this cache wants to short-circuit this method completely
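
      Similarly, a backing store with a bulk write (for example a Redis MSET or a pipelined write) could override this default. As with the batch read sketch, batchStore(...) below is a hypothetical helper rather than part of the ValueCache API.

      import java.util.LinkedHashMap;
      import java.util.List;
      import java.util.Map;
      import java.util.concurrent.CompletableFuture;

      import org.dataloader.ValueCache;

      // sketch only - a ValueCache whose backing store supports a bulk write
      public abstract class BatchWriteValueCache<K, V> implements ValueCache<K, V> {

          // hypothetical bulk write to the backing store
          protected abstract CompletableFuture<Void> batchStore(Map<K, V> entries);

          @Override
          public CompletableFuture<List<V>> setValues(List<K> keys, List<V> values) {
              Map<K, V> entries = new LinkedHashMap<>();
              for (int i = 0; i < keys.size(); i++) {
                  entries.put(keys.get(i), values.get(i)); // keys and values are positionally paired
              }
              return batchStore(entries).thenApply(ignored -> values);
          }
      }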
    • delete

      CompletableFuture<Void> delete(K key)
      Deletes the entry with the specified key from the value cache, if it exists.

      NOTE: Your implementation MUST not throw exceptions; rather, it should return a CompletableFuture that has completed exceptionally. Failure to do this may cause the DataLoader code to not run properly.

      Parameters:
      key - the key to delete
      Returns:
      a void future for error handling and fluent composition
    • clear

      CompletableFuture<Void> clear()
      Clears all entries from the value cache.

      NOTE: Your implementation MUST not throw exceptions; rather, it should return a CompletableFuture that has completed exceptionally. Failure to do this may cause the DataLoader code to not run properly.

      Returns:
      a void future for error handling and fluent composition