# Thursday, 04 December 2008

Although in our recent projects we use more Java and XSLT, we still compare Java and .NET features. It's no secret that most applications rely on some form of caching to improve performance. Unlike .NET, which provides a robust caching solution, Java doesn't provide anything standard. Of course, a Java adept may point to a lot of caching frameworks, or simply say "use a HashMap (ArrayList, etc.) instead", but this is not the same.

Think about options for Java:
1. Caching frameworks (caching systems). Yes, they do their work, and do it well. Some of them are brought to the state of the art, but there are drawbacks. The crucial one is that for simple data caching one has to pull in a whole framework. This option requires too much effort to solve a simple problem.

2. Collection classes (HashMap, ArrayList, etc.) for caching data. This is a very straightforward and very productive solution: everyone knows these classes, and there is nothing to configure. One declares an instance of such a class, takes care of synchronizing data access, and everything starts working immediately. It's an admirable caching solution, but only for "toy applications", since it solves one problem and introduces another: if an application runs for hours and there is a lot of data to cache, the amount of cached data only grows and never shrinks. That is why such a cache quickly gets surrounded by all sorts of rules that somehow reduce its size at run time. The solution quickly loses its shine and becomes non-portable, though it's still applicable for some applications.

3. Using Java reference objects for caching data. The most appropriate class for a cache is java.util.WeakHashMap. WeakHashMap works exactly like a hash table but uses weak references internally: an entry in a WeakHashMap may be reclaimed at any time once its key is no longer referenced outside the map (see the sketch below). This caching strategy depends on the GC's whims, is not entirely reliable, and may increase the number of cache misses.
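A minimal sketch of this third option (the class and method names here are purely illustrative):

```java
import java.util.Collections;
import java.util.Map;
import java.util.WeakHashMap;

// Option 3 in a nutshell: a WeakHashMap-backed cache. An entry may be
// reclaimed by the GC as soon as its key is no longer referenced outside
// the map, so get() may return null even for data that was just put().
public class WeakCache<K, V>
{
  private final Map<K, V> map =
    Collections.synchronizedMap(new WeakHashMap<K, V>());

  public V get(K key)
  {
    return map.get(key);
  }

  public void put(K key, V value)
  {
    map.put(key, value);
  }
}
```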

We've decided to create our simple cache with sliding expiration of data.

One may create many cache instances, but there is only one global service that tracks expired objects across all of them:

private Cache<String, Object> cache = new Cache<String, Object>();

There is a constructor that specifies an expiration interval in milliseconds for all cached objects:

private Cache<String, Object> cache = new Cache<String, Object>(15 * 60 * 1000);

Access is similar to HashMap:

instance = cache.get("key");
cache.put("key", instance);

That's all one should know to start using it. Click here to download the Java source of this class. Feel free to use it in your applications.
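For those who just want to see the idea without downloading, here is only a rough outline of a sliding-expiration cache (the actual class differs in details, e.g. it registers every instance with the single global purge service mentioned above):

```java
import java.util.Iterator;
import java.util.concurrent.ConcurrentHashMap;

// A rough outline of a cache with sliding expiration. The real class also
// registers itself with a global purge service that periodically invokes
// something like purge() on every live cache instance.
public class SlidingCache<K, V>
{
  private static class Entry<T>
  {
    final T value;
    volatile long lastAccess;

    Entry(T value)
    {
      this.value = value;
      this.lastAccess = System.currentTimeMillis();
    }
  }

  private final ConcurrentHashMap<K, Entry<V>> map =
    new ConcurrentHashMap<K, Entry<V>>();
  private final long expirationMillis;

  public SlidingCache()
  {
    this(15 * 60 * 1000); // default: 15 minutes
  }

  public SlidingCache(long expirationMillis)
  {
    this.expirationMillis = expirationMillis;
  }

  public V get(K key)
  {
    Entry<V> entry = map.get(key);

    if (entry == null)
    {
      return null;
    }

    // Sliding expiration: every access pushes the deadline forward.
    entry.lastAccess = System.currentTimeMillis();

    return entry.value;
  }

  public void put(K key, V value)
  {
    map.put(key, new Entry<V>(value));
  }

  // Drops entries that have not been touched within the expiration interval.
  public void purge()
  {
    long deadline = System.currentTimeMillis() - expirationMillis;

    for(Iterator<Entry<V>> i = map.values().iterator(); i.hasNext();)
    {
      if (i.next().lastAccess < deadline)
      {
        i.remove();
      }
    }
  }
}
```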

Thursday, 04 December 2008 12:12:38 UTC | Comments [2]
Announce | Tips and tricks
Friday, 05 December 2008 16:25:53 UTC
For some caching applications, SoftReference is a more appropriate choice than WeakReference. As you say, a WeakReference is essentially cleared when there are no hard references to the object (or soon afterwards). This makes WeakReference good for canonicalisation (where you want to ensure that when there are two instances of your object that equals() each other, they are actually the same object), but it doesn't make it a good choice for "find the object that I used recently but then discarded".

With SoftReference, the VM has a policy of letting the reference stay valid even when there are no hard references, subject to constraints such as available heap space and the time the object has had no hard references. This achieves some of what you're trying to achieve with your cache class, although of course with SoftReference you can't determine the time to live. That said, I can see why you might want the finer-grained control. (Another option, though maybe a bit trickier to get right, could be to use WeakReference/SoftReference alongside a ReferenceQueue to implement some control over the 'keeping alive' stage.)
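A minimal sketch of the SoftReference flavour (just to illustrate the idea; the names are made up):

```java
import java.lang.ref.SoftReference;
import java.util.concurrent.ConcurrentHashMap;

// Values are held via SoftReference: they stay cached until the VM comes
// under memory pressure, which suits "recently used" caching better than
// WeakReference.
public class SoftCache<K, V>
{
  private final ConcurrentHashMap<K, SoftReference<V>> map =
    new ConcurrentHashMap<K, SoftReference<V>>();

  public void put(K key, V value)
  {
    map.put(key, new SoftReference<V>(value));
  }

  public V get(K key)
  {
    SoftReference<V> ref = map.get(key);
    V value = (ref == null) ? null : ref.get();

    if (ref != null && value == null)
    {
      // The referent has been collected; drop the stale reference.
      map.remove(key, ref);
    }

    return value;
  }
}
```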

By the way, your code as it stands has a synchronization bug: inside schedulePurge(), you need to check for executor being null INSIDE the synchronized block, or at the very least declare 'executor' volatile (see the literature on double-checked locking for why). If you want to avoid contention on the global lock, I would really just create the executor in the initial static declaration. This way, the JVM guarantees the executor is created only once, the first time it's needed (when the cache class is first referenced), and the JVM handles the synchronization during class loading. You can pass a maximum thread time to live to ThreadPoolExecutor; when no executor threads are active, the overhead of having a single unused ThreadPoolExecutor object knocking around really is negligible.
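For instance, something along these lines (the class name and pool settings are only illustrative):

```java
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

// The "static declaration" approach: the JVM initialises this field once,
// when the class is first used, and handles the synchronization itself,
// so no double-checked locking is needed.
public class PurgeService
{
  private static final ThreadPoolExecutor executor;

  static
  {
    executor = new ThreadPoolExecutor(
      1, 1,                             // a single purge thread
      60, TimeUnit.SECONDS,             // thread time to live when idle
      new LinkedBlockingQueue<Runnable>());
    // Java 6+: let the core thread also time out, so an idle service
    // holds no live threads at all.
    executor.allowCoreThreadTimeOut(true);
  }

  public static void submit(Runnable purgeTask)
  {
    executor.execute(purgeTask);
  }
}
```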

I would also consider using a ConcurrentHashMap for the internal cache structure. Again, if you'll forgive the blatant plug, I put together a bit of info on ConcurrentHashMap performance that people might be interested in.

Good luck with your caching!
Neil
Friday, 05 December 2008 19:10:35 UTC
Neil, thank you very much!

I've followed your advice and fixed the code.
Vladimir Nesterovsky