Version 24

    NOTE: This page and this project are deprecated.  The project has been moved to a standalone project on SourceForge - http://cachebenchfwk.sourceforge.net - and has been heavily modernized since moving.  Please refer to the project on SourceForge for best results.

     

    This page is still retained for historic purposes.

    What is the Cache Benchmark Framework?

     

    It is a framework that allows running multiple performance tests against various cache products. It was primarily conceived as a tool to benchmark different releases of JBoss Cache.

     

    Here are some useful features offered by the framework:

     

    • Works with any caching product - simply write a CacheWrapper implementation that delegates calls such as put(), get() and remove() to the product's API (see the sketch after this list).

    • Pluggable tests - JUnit-like tests can be written to mimic your data access patterns and cache usage scenarios, making benchmarks relevant and useful.  We ship with a few typical tests, such as SessionSimulatorTest, which mimics the storage and replication of HTTP sessions.

    • Pluggable reports - prefer Excel spreadsheets to CSV?  Or SVG images to PNGs?  Write your own report generators.  The framework ships with a basic CSV generator.

    • Cluster barriers - ensure that all running nodes synchronize at each stage of the test (warm-up, test and report generation), so that each of these phases happens at the same time on every cluster node.

    • Centralized reports - each node of the cluster produces its own performance report.  The primary node (the first node in the cluster) then collates these reports into a single report file.

    • Cache warmup - triggers a sequence of operations on the cache before running any tests, to allow the HotSpot compiler to optimize the code paths being measured.

    • Replication assertion - asserts that any values intended to be replicated have in fact been replicated.
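
    Below is a minimal, illustrative sketch of the CacheWrapper idea described above: a thin adapter that delegates put(), get() and remove() calls to the cache being tested.  The interface shown here is an assumption made for illustration - the actual CacheWrapper interface shipped with the framework may use different method names and signatures - and a ConcurrentHashMap stands in for a real cache product's client API.

        import java.util.Map;
        import java.util.concurrent.ConcurrentHashMap;

        // Hypothetical interface approximating the framework's CacheWrapper contract.
        interface CacheWrapper {
            void init(Map<String, String> parameters) throws Exception;  // product-specific setup
            void put(String bucket, Object key, Object value) throws Exception;
            Object get(String bucket, Object key) throws Exception;
            void remove(String bucket, Object key) throws Exception;
            void tearDown() throws Exception;                             // release resources
        }

        // Example wrapper delegating to a plain ConcurrentHashMap instead of a real cache product.
        public class MapCacheWrapper implements CacheWrapper {
            private ConcurrentHashMap<String, Object> cache;

            public void init(Map<String, String> parameters) {
                cache = new ConcurrentHashMap<String, Object>();
            }

            public void put(String bucket, Object key, Object value) {
                cache.put(bucket + "#" + key, value);
            }

            public Object get(String bucket, Object key) {
                return cache.get(bucket + "#" + key);
            }

            public void remove(String bucket, Object key) {
                cache.remove(bucket + "#" + key);
            }

            public void tearDown() {
                cache = null;
            }
        }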

     

    Usage

    Download and install

     

    Prerequisites:

    • Apache Ant 1.7.x or higher is needed to build the binaries.

    • A Java 5 JDK is needed to build and run the binaries.

    • The framework ships with a number of shell scripts to enable proper classpath setting and command execution.

      • Cygwin can be used for running the tests on Windows; if so, add the scripts from  to .  Those scripts automatically convert the Unix-style CLASSPATH to a Windows-style CLASSPATH, which is needed when the scripts are executed.

    • Running cluster-wide tests requires SSH with passphrase-less public keys set up on all servers, so that commands can be executed across the cluster.

     

    Download

     

    There is no public release of the framework at the moment; however, it can be built from source.  The Subversion repository is .

     

    After getting the sources from Subversion,

    1. Download the JARs for the specific cache products you wish to benchmark; see  for product-specific details.

    2. Build the sources using  in the source root directory.

     

    Configure and run a test

     

    Configuring

     

    1. Before running a test, configure . The configuration settings are documented within the file. IMPORTANT: please read this file carefully, as it contains important information such as where reports will be generated.

    2. Configure logging by editing . Recommended verbosity levels are WARN or ERROR, so that benchmark measurements are not affected by unnecessary logging. In order to have an individual log file for each node in the cluster, use  rather than log4j's . This extends  (i.e. supports its configuration) and prepends the node index to the node's log file name (see the sketch after this list).

    3. Edit  and ensure that the ,  and  variables accurately describe the test configurations you intend to run. The framework will execute each configuration against each configured product, on all specified cluster sizes.

    4. Edit  so that it accurately points to the network interface to which clustered caches should bind.  It is a good idea to point this at an environment variable that is set individually on each host.
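
    The per-node log file behaviour described in step 2 could be implemented along the lines of the sketch below.  This is an illustration only, not the framework's actual class: it assumes log4j 1.x and that the node index is passed in via a hypothetical nodeIndex system property, and it prepends that index to the configured log file name while leaving any directory portion of the path intact.

        import org.apache.log4j.FileAppender;

        // Illustrative sketch: supports the standard FileAppender configuration and
        // prepends the node's index to the configured log file name.
        public class NodeAwareFileAppender extends FileAppender {

            public void setFile(String file) {
                String nodeIndex = System.getProperty("nodeIndex", "0"); // hypothetical property name
                int slash = Math.max(file.lastIndexOf('/'), file.lastIndexOf('\\'));
                String dir = slash >= 0 ? file.substring(0, slash + 1) : "";
                String name = slash >= 0 ? file.substring(slash + 1) : file;
                super.setFile(dir + "node" + nodeIndex + "-" + name);
            }
        }

    Such an appender would then be referenced from the log4j configuration in place of org.apache.log4j.FileAppender.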

     

    Running tests

    • Some cache products, such as Terracotta, might need additional setup, such as starting a master server. This is described in a Readme.txt file in the directory that corresponds to the benchmarked product, e.g. cache-products/terracotta-2.5.0/Readme.txt. Make sure you follow any steps described in that file before running the tests.
    •  will kick off a test run, according to the configuration you set when editing it.

    •  will kill all instances running on all nodes. If you are not sure whether the benchmark exited gracefully on the previous run, run this first to make sure that no leftover daemon nodes interfere with the new ones.

     

     

    Results

     

    Results will appear in the format and location specified in .  By default this is a CSV file in the project root directory.
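
    As noted in the feature list, report generation is pluggable and the default output is CSV.  Purely as an illustration of the kind of generator one might plug in, the sketch below writes a throughput report as CSV; the framework's actual report-generator interface is not shown here, and the column layout is invented for the example.

        import java.io.FileWriter;
        import java.io.IOException;
        import java.io.PrintWriter;
        import java.util.List;

        // Illustrative CSV emitter; not the framework's actual report-generator API.
        public class CsvReportSketch {

            // Each row is assumed to hold: test name, number of requests, duration in ms.
            public static void write(String fileName, List<String[]> rows) throws IOException {
                PrintWriter out = new PrintWriter(new FileWriter(fileName));
                try {
                    out.println("TEST,REQUESTS,DURATION_MS,REQUESTS_PER_SEC");
                    for (String[] row : rows) {
                        long requests = Long.parseLong(row[1]);
                        long durationMs = Long.parseLong(row[2]);
                        double throughput = durationMs == 0 ? 0 : (requests * 1000.0) / durationMs;
                        out.println(row[0] + "," + requests + "," + durationMs + "," + throughput);
                    }
                } finally {
                    out.close();
                }
            }
        }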

     

    Local mode

     

    The framework can now deal with non-clustered tests as well.  Use the  script instead of the  script, which remotely runs .  Note that you should use an appropriately tuned configuration for your cache, so that it does not create unnecessary network resources.  Also, you should NOT use  when running tests in local mode - you won't get any results!

     

    This is very useful for benchmarking non-clustered performance.

     

    Write your own tests

     

    • Every newly created test should implement .  For more details please refer to 's javadoc.  The framework already ships with some tests; see the  package for details and examples, and the sketch after this list for the general shape of a test.

    • Archive your new test as a JAR file and drop it in the  directory so that it is picked up by the framework.

    • Specify your new test to be run in , within a  element.  Refer to how existing tests are specified for examples.
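
    The following sketch illustrates the general shape of a custom test: a timed loop of put() and get() calls driven through a cache wrapper.  It reuses the hypothetical CacheWrapper interface and MapCacheWrapper class sketched earlier on this page; the class and method names below are illustrative assumptions rather than the framework's actual test API, and a real test would implement the framework's test interface and report its results through the framework's reporting mechanism.

        // Illustrative only: assumes the hypothetical CacheWrapper interface and
        // MapCacheWrapper class sketched earlier are available in the same package.
        public class PutGetTestSketch {

            // Performs numOperations puts followed by gets and prints a simple timing figure.
            public static void run(CacheWrapper cache, int numOperations) throws Exception {
                long start = System.currentTimeMillis();
                for (int i = 0; i < numOperations; i++) {
                    cache.put("session", "key" + i, "value" + i);
                }
                for (int i = 0; i < numOperations; i++) {
                    cache.get("session", "key" + i);
                }
                long durationMs = System.currentTimeMillis() - start;
                System.out.println((2L * numOperations) + " operations in " + durationMs + " ms");
            }

            public static void main(String[] args) throws Exception {
                CacheWrapper wrapper = new MapCacheWrapper();  // stand-in wrapper from the earlier sketch
                wrapper.init(null);
                run(wrapper, 10000);
                wrapper.tearDown();
            }
        }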

     

    Benchmark a new cache product

     

    To add a new cache product to the cache benchmark framework, follow these steps:

     

    1. Create a directory for the product under cache-products, e.g. cache-products/terracotta-2.5.0 as mentioned above.

    2. Create  for the cache product's distribution JARs.

    3. Create  and write a cache wrapper for the product.  See  for an example.

    4. Copy  to , and modify the file as per the comments to suit your product.  This script builds necessary classpaths.

    5. Your cache product is now ready to be benchmarked, as per the instructions above.

     

    TODOs

     

    See here

     

    Feedback

     

    Please provide any feedback on the JBoss Cache User Forums for now, until a separate mailing list can be set up.