
Single-element cache #872

Open
gissuebot opened this issue Oct 31, 2014 · 38 comments


Original issue created by raymond.rishty on 2012-01-18 at 02:55 AM


First off, Guava's caching API is great stuff. It's made my code cleaner, more efficient, and more robust.

I do, however, have a confession... I sometimes abuse it.

Clearly, the cache is intended for key-based lookup, but occasionally, I have a single item I need to cache. It's something that is too expensive to fetch all the time, and the "memoizeWithExpiration" Supplier is a little thin (for example, I can't force eviction/invalidation).

So, what I've done is have some dummy key that I use to look up this single item. I'm sure you're cringing as you read this.

It would be nice to have a way of supporting this functionality that doesn't feel quite so dirty, whether that's some extension to the Cache API (I don't know how...) or perhaps a beefed-up Supplier that pulls in some functionality from Cache.
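
For illustration, a minimal sketch of the dummy-key workaround being described, built on Guava's LoadingCache; the value type and fetchExpensiveValue() are placeholders, not anything from the original report:

import java.util.concurrent.TimeUnit;
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;

final class SingleValueCache {
    private static final String KEY = "the-only-key"; // dummy key, never varies

    private final LoadingCache<String, String> cache = CacheBuilder.newBuilder()
        .expireAfterWrite(10, TimeUnit.MINUTES)
        .build(new CacheLoader<String, String>() {
            @Override public String load(String ignoredKey) {
                return fetchExpensiveValue();
            }
        });

    String get() {
        return cache.getUnchecked(KEY); // always the same key
    }

    void invalidate() {
        cache.invalidateAll(); // the forced eviction that memoizeWithExpiration lacks
    }

    // Stand-in for the expensive computation being cached.
    private String fetchExpensiveValue() {
        return "expensive result";
    }
}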

Original comment posted by wasserman.louis on 2012-01-18 at 06:43 PM


I'm...in favor of this, and tentatively endorse a beefed-up Supplier.


Labels: Package-Cache, Type-Enhancement

Original comment posted by raymond.rishty on 2012-01-18 at 08:42 PM


A proposed interface:

interface BeefySupplier<T> extends Supplier<T> {
    void invalidate(); // or, perhaps, clear()
    void refresh();
    void put(T object);
}

It would also be nice if the "memoizeWithExpiration" static factory could become a builder, so I could do something like:

BeefySupplier<Steer> steerSupplier = Suppliers.builder()
    .memoize()
    .expireAfterAccess(100)
    .refreshAfterWrite(200)
    .build(
        new Supplier<Steer>() {
            public Steer get() {
                return raiseASteer(); // it takes a long time to raise one, hence why I want to cache it
            }
        }
    );
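
No such builder exists in Guava; as a rough sketch of how the proposed interface could be backed by today's machinery, the dummy key can at least be hidden inside an adapter (BeefySuppliers and beefUp are hypothetical names, not Guava API):

import java.util.concurrent.TimeUnit;
import com.google.common.base.Supplier;
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;

final class BeefySuppliers {
    // Backs the proposed BeefySupplier with a one-entry LoadingCache,
    // so the dummy key never leaks to callers.
    static <T> BeefySupplier<T> beefUp(final Supplier<T> delegate,
            final long expireAfterAccess, final long refreshAfterWrite, final TimeUnit unit) {
        final Object key = new Object(); // the single, hidden key
        final LoadingCache<Object, T> cache = CacheBuilder.newBuilder()
            .expireAfterAccess(expireAfterAccess, unit)
            .refreshAfterWrite(refreshAfterWrite, unit)
            .build(CacheLoader.from(delegate));
        return new BeefySupplier<T>() {
            @Override public T get() { return cache.getUnchecked(key); }
            @Override public void invalidate() { cache.invalidateAll(); }
            @Override public void refresh() { cache.refresh(key); }
            @Override public void put(T object) { cache.put(key, object); }
        };
    }

    private BeefySuppliers() {}
}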

Original comment posted by raymond.rishty on 2012-01-19 at 01:27 AM


Hmm, so I just noticed that there is a CacheLoader.from(Supplier<V> supplier). Is this intended to be the way to go about making a single-element cache? If so, this issue can be closed.
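
For reference, that approach still goes through a key; a minimal sketch, where slowSupplier and Config are placeholders:

LoadingCache<Object, Config> cache = CacheBuilder.newBuilder()
    .expireAfterWrite(5, TimeUnit.MINUTES)
    .build(CacheLoader.from(slowSupplier)); // the loader ignores the key entirely

Config value = cache.getUnchecked("singleton"); // still a key-based lookup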

Original comment posted by wasserman.louis on 2012-01-19 at 01:30 AM


I'm not satisfied with that, exactly? I'd like to eliminate the key from the picture, for the single-element cache.

Original comment posted by kevinb@google.com on 2012-01-19 at 03:30 AM


Raymond, that's actually completely unrelated. It's just like any other cache, just that the CacheLoader doesn't actually depend on the key.

Original comment posted by kevinb@google.com on 2012-01-19 at 03:32 AM


In general, I think it's probably a good idea to explore the idea of a singleton-cache type that's the no-key, one-value analogue of CacheBuilder/LoadingCache, then whittle the set of methods suggested by that analogy down to the ones we know we have use cases for Out There, and see what we arrive at.

To me, the fact it'll implement Supplier is just incidental, as with the case of LoadingCache implementing Function, and so I don't think Supplier would feature in the name.


Status: Triaged

Original comment posted by joe.j.kearney on 2012-01-19 at 09:54 AM


I have an implementation of this that I called ReferenceMaker, with options similar to MapMaker/CacheLoader for expiry etc, but only storing the single reference. There are plenty of things this could support that aren't applicable to or done in CacheLoader, in particular timed async refresh as opposed to expiry with on-demand reinitialisation.

I considered this to be more like a Reference with some extra options (e.g. self-refreshable) than like a cache, though maybe these are close to the same thing.

Original comment posted by fry@google.com on 2012-02-16 at 07:18 PM


(No comment entered for this change.)


Status: Acknowledged

Original comment posted by mindas on 2012-04-26 at 03:21 PM


Not sure if OP had a similar scenario in mind, but I needed a memoizing supplier which is calculated from non-static context (pseudo code):

if (t == not_calculated_yet) {
  block_all_other_threads();
  t = calculate(f);
  memoize(t);
}
return t;

My initial response to this was to use Guava's cache (like the OP did), but I think an abstract class using the double-checked locking idiom (to hide the inglorious bits) is a better solution.

There are numerous implementations on the web, and this is one of them: http://weblogs.java.net/blog/mason/archive/2006/09/rechecking_doub.html
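
A minimal sketch of that double-checked locking idiom, as an abstract class that hides the inglorious bits (the names are illustrative):

// Illustrative only: lazily computes a value once, with double-checked locking.
abstract class LazyValue<T> {
    private volatile T value; // volatile is essential for safe publication

    protected abstract T calculate(); // "calculate(f)" from the pseudocode above

    T get() {
        T result = value;
        if (result == null) {            // first check, no locking
            synchronized (this) {
                result = value;
                if (result == null) {    // second check, under the lock
                    result = calculate();
                    value = result;      // memoize
                }
            }
        }
        return result;
    }
}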

Original comment posted by cpovirk@google.com on 2012-04-26 at 03:34 PM


mindas, have you seen Suppliers.memoize?

https://google.github.io/guava/apidocs/com/google/common/base/Suppliers.html#memoize%28com.google.common.base.Supplier%29

Original comment posted by mindas on 2012-04-26 at 03:50 PM


I don't think Suppliers.memoize would be applicable for cases where the supplier is static and the data to build the value from ("f" in the pseudocode) comes, say, from a method parameter. Consider something like

class Foo {
  private static final Supplier<Bar> BAR_SUPPLIER = Suppliers.memoize(...);

  private Foo() {}

  public static Bar get(Baz baz) {
     return BAR_SUPPLIER.get(baz); // doesn't compile: there's no .get(key) or .get(key, Callable) like in Cache
  }
}
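
For comparison, the keyed Cache does offer get(key, Callable), so one workaround today is again a single dummy key, with the parameter captured by the Callable at the first call (a sketch; Bar, Baz and buildBar are placeholders):

import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

final class Foo {
    static final class Bar {}
    static final class Baz {}

    private static final Cache<String, Bar> CACHE = CacheBuilder.newBuilder().build();

    private Foo() {}

    public static Bar get(final Baz baz) throws ExecutionException {
        // The Callable captures the method parameter, which a plain
        // Suppliers.memoize cannot do; only the first caller's baz is used.
        return CACHE.get("singleton", new Callable<Bar>() {
            @Override public Bar call() {
                return buildBar(baz);
            }
        });
    }

    private static Bar buildBar(Baz baz) {
        return new Bar(); // stand-in for the expensive construction
    }
}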

Original comment posted by cpovirk@google.com on 2012-04-26 at 09:01 PM


Ah, so you know that get(Baz) will be called with the same Baz every time, but you don't know the value ahead of time?

Original comment posted by mindas on 2012-04-27 at 08:29 AM


Yes, exactly.

Original comment posted by shiber on 2012-05-25 at 05:08 PM


+1
I had the exact same experience as the OP.
I like the pattern that Raymond proposed (the Supplier builder). However, we could start smaller by offering a supplier that behaves exactly like memoizeWithExpiration, except that it loads the value asynchronously (using an Executor passed at construction, for instance), automatically upon expiration or upon the first request that follows expiration, and continues to supply the old value until the new value is returned.
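
No such supplier exists in Guava, but behind a dummy key this behaviour can be approximated today with refreshAfterWrite plus an asynchronous reload, which keeps serving the old value while the new one loads (a sketch; the executor and fetchConfig() are placeholders):

import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import com.google.common.util.concurrent.ListenableFuture;
import com.google.common.util.concurrent.ListenableFutureTask;

final class RefreshingValue {
    private static final String KEY = "singleton";

    private final ExecutorService executor = Executors.newSingleThreadExecutor();

    private final LoadingCache<String, String> cache = CacheBuilder.newBuilder()
        .refreshAfterWrite(5, TimeUnit.MINUTES)
        .build(new CacheLoader<String, String>() {
            @Override public String load(String key) {
                return fetchConfig(); // synchronous load on the very first call
            }

            @Override public ListenableFuture<String> reload(String key, String oldValue) {
                // Refresh asynchronously; readers keep getting oldValue until this completes.
                ListenableFutureTask<String> task = ListenableFutureTask.create(new Callable<String>() {
                    @Override public String call() {
                        return fetchConfig();
                    }
                });
                executor.execute(task);
                return task;
            }
        });

    String get() {
        return cache.getUnchecked(KEY);
    }

    // Stand-in for the expensive fetch.
    private String fetchConfig() {
        return "config";
    }
}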

Original comment posted by yoavtz@google.com on 2012-05-28 at 05:49 PM


I must admit that my recent abuse of the glorious CacheLoader is even dirtier:
not only do I need just one value, so I use a dummy key,
but I also want my data to be set under a lock, so I also use a dummy value, and just implement the get() method as:

public Object get(Object ignoredKey) {
  mutex.lock();
  try {
    outerClass.this.value = calculateValue();
  } finally {
    mutex.unlock();
  }
  return DUMMY; // Cannot return null
}

So Ugly, I know. I cynically use its "should get() be run now?" logic without anything else that has to do with a cache.
If memoizeWithExpiration is adapted to allow all the above, can it also please accept an optional lock to make writes under?

Original comment posted by yoavtz@google.com on 2012-05-28 at 05:51 PM


(sorry, OuterClass should be capitalized... and this is the load() method...)

Original comment posted by kevinb@google.com on 2012-05-30 at 07:43 PM


(No comment entered for this change.)


Labels: -Type-Enhancement, Type-Addition

Original comment posted by em...@soldal.org on 2012-06-01 at 02:16 PM


I feel like this would be solved by just adding a reset method to the memoizing suppliers available today. At least that's what I've done in my code.

I found a need to memoize values for a long period of time, but every once in a while, usually triggered by users, I had to flush the value stored in the supplier.

Original comment posted by kevinb@google.com on 2012-06-22 at 06:16 PM


(No comment entered for this change.)


Status: Research

Original comment posted by cky944 on 2012-07-01 at 06:13 PM


Another useful feature (that I have an immediate use for) that this singleton cache could provide is memoise-with-soft-reference. Just to add to kevinb's list for Comment 6.
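
For the record, a memoise-with-soft-reference supplier can be hand-rolled along these lines (a sketch only; the value is recomputed whenever the GC has cleared the reference):

import java.lang.ref.SoftReference;
import com.google.common.base.Supplier;

final class SoftMemoizingSupplier<T> implements Supplier<T> {
    private final Supplier<T> delegate;
    private SoftReference<T> ref = new SoftReference<T>(null);

    SoftMemoizingSupplier(Supplier<T> delegate) {
        this.delegate = delegate;
    }

    @Override public synchronized T get() {
        T value = ref.get();
        if (value == null) {                   // never computed, or cleared by the GC
            value = delegate.get();
            ref = new SoftReference<T>(value); // re-memoize behind a soft reference
        }
        return value;
    }
}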

Original comment posted by em...@soldal.org on 2012-07-03 at 01:22 PM


Not sure if this should be its own issue... but I just recalled that among my earliest "guava-like" classes were resettable variants of MemoizingSupplier and ExpiringMemoizingSupplier.

The reason was simply that we had a config object which rarely changed, so holding it in a memoizing supplier once it had been loaded was a good idea... until it changed... then you had to reboot the server to get the changes to propagate.

Perhaps cache is the wrong idea here; maybe just add an interface like Resettable or Clearable, akin to (Auto)Closeable, plus a third interface that joins Supplier and Resettable, and return that, so that we can expose the reset and supplier methods but not the implementations.

Arguably these should be added to the current memoizing suppliers.

just my 2 cents

Original comment posted by christoph.hoesler on 2012-07-30 at 02:57 PM


I had a similar need. My solution was to ask an extended Predicate whether the memoized value should get updated. The memoizing Supplier calls a done() method on the 'UpdateRequest' after the value is updated, so that it can change its state.

public interface UpdateRequest<T> extends Predicate<T> {
    void done();
}
public static <T> Supplier<T> memoize(Supplier<T> delegate, UpdateRequest<? super T> updateRequest) {...}
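
A rough sketch of what that overload might look like, assuming the contract described above (apply() answers whether an update is wanted, done() lets the request reset itself); this is illustrative, not the commenter's actual code:

public static <T> Supplier<T> memoize(final Supplier<T> delegate,
        final UpdateRequest<? super T> updateRequest) {
    return new Supplier<T>() {
        private T value;
        private boolean initialized;

        @Override public synchronized T get() {
            if (!initialized) {
                value = delegate.get();           // initial computation
                initialized = true;
            } else if (updateRequest.apply(value)) {
                value = delegate.get();           // recompute on request
                updateRequest.done();             // let the request reset its state
            }
            return value;
        }
    };
}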

Original comment posted by kevinb@google.com on 2012-08-24 at 03:57 AM


Louis, if you're looking for work, this would be good to look into, at least as far as converging on an API/feature set we can all feel good about.


Owner: wasserman.louis

Original comment posted by wasserman.louis on 2012-08-24 at 07:21 PM


I can do that; I'll do some experimentation over the next week or two.

Original comment posted by mindas on 2012-10-31 at 10:56 AM


There's another functionality aspect which can possibly be considered for this feature.

Let's say there's some expensiveFunction() which takes a long time to calculate. The data that forms the input of this function may change, so there might be a need for an idiom that invalidates the calculation currently in progress and starts it anew, making all waiting threads wait longer, until the calculation completes and no further invalidations arrive in the meantime.

The current LoadingCache does not support this (and this is not obvious from the javadoc). In other words, a call to cache.refresh(key), cache.invalidate(key), or cache.invalidateAll() is ignored if the calculation is in progress, and only makes a difference if the invalidation happens after the calculation.

I needed this feature and wrote a quick hack myself. The source code is available at http://codereview.stackexchange.com/questions/18056/ability-to-forget-the-memoized-supplier-value -- I'd be more than happy to hear any suggestions/improvements.

Original comment posted by kevinb@google.com on 2012-11-09 at 11:07 PM


Mindas, what you describe seems like something that should be filed separately.

Original comment posted by kevinb@google.com on 2013-03-12 at 06:43 PM


(No comment entered for this change.)


CC: fry@google.com

Original comment posted by perneto@google.com on 2013-06-18 at 12:55 PM


I'm also using a LoadingCache the same way as the OP; my own reasons are:

  • expireAfterWrite
  • thread-safe loading.

In general I can see most of the other features of CacheBuilder being useful for a singleton holder (various expiration policies, etc).

Original comment posted by cgdecker@google.com on 2013-07-03 at 07:24 PM


Issue #1466 has been merged into this issue.

Original comment posted by cgdecker@google.com on 2013-07-03 at 07:30 PM


Issue #1466 has been merged into this issue.

Original comment posted by lowasser@google.com on 2014-06-02 at 07:24 PM


Issue #1773 has been merged into this issue.

Original comment posted by cpovirk@google.com on 2014-08-19 at 01:41 PM


Issue #1834 has been merged into this issue.


kevinb9n commented Oct 6, 2016

Another reason to do this, now, is so we have a version of this functionality that's not tied to our Supplier interface (we should use only java.util.function.Supplier for it).


noahlz commented Oct 17, 2016

I would also like to see a "refreshing" memoizeWithExpiration Supplier, one that simply keeps the existing value if an exception occurs during refresh.

I just coded a single-element refreshOnWrite cache that is wrapped by an anonymous Supplier instance. I think I wrote just as many comments explaining what I was doing and why, as I did lines of code.

@harshssd

I have a set of config data keyed by locale, and I want to cache all of that data in memory and keep refreshing it periodically without restarting the server.

Is there a way I can achieve this using this library?

I am currently doing it by adding a reload method and calling updateAll for one particular locale inside it, but that doesn't feel clean in terms of the library's API. How can I get this functionality? Do you suggest any other library for such cases?


jvdneste commented Apr 10, 2017

I kept it simple, I guess (perhaps naive, but we'll see)

public interface ResettableSupplier<T> extends Supplier<T> {
  void reset();
}

public static <T> ResettableSupplier<T> resettableMemoize(final Supplier<T> source) {
  class ResettableMemoizingSupplier implements ResettableSupplier<T> {
    private volatile Supplier<T> memoized = Suppliers.memoize(source);

    @Override
    public T get() {
      return memoized.get();
    }

    @Override
    public void reset() {
      memoized = Suppliers.memoize(source);
    }
  }
  return new ResettableMemoizingSupplier();
}

I try to avoid this anyhow in favor of a better 'bigger picture' solution, but that's not always possible in a legacy context.


kuzma725 commented Feb 1, 2018

Just curious if anything was ever done to implement raymond.rishty's original request from 6 years ago :) Seems like a very common use case, based on comments here and elsewhere. In any case, really appreciate the work you guys have done with Guava.

This is probably OT, but could someone possibly suggest an alternative thread-safe cache implementation for Java 7 that supports both time-based expiry and on-demand eviction for storing a single object/value?

@knadjafloo

We need a way to invalidate the underlying cache, and there's no way to do this today; I'll have to create a LoadingCache with a single dummy key. It would still be nice to address this in 2019.

netdpb added the P3 label Aug 2, 2019
petergeneric added a commit to petergeneric/stdlib that referenced this issue Nov 19, 2019
…. Ideally would be replaced with google/guava#872 but it's been around for more than 7 years so unlikely