JRunr  
Updated Nov 11, 2008 by kd.vi...@gmail.com

Jrunr - Java Record Unit test data N Replay/Reuse


Concepts

What is jrunr?

jrunr is a tool that non-intrusively records and stores unit test data from a running instance of an existing Java application, so that the data can be reused later in unit tests. It is specifically intended for maintenance of Java applications that were not written with TDD in mind and, as a result, cannot be easily unit tested.

Why

Unit tests play a role not just in from-scratch development projects, but also in maintaining existing applications, where they provide a test base for validating new builds, i.e., enhancements. However, when existing code is not written in a test-aware fashion, it is difficult to narrow any new test cases down to the specific problem being fixed. Some such scenarios are:

  1. The code to be tested is a small part of a large method; i.e., only one of the flows has been modified and needs to be tested. The rest of the code is known to be correct, to the extent verified by non-JUnit methods such as human testing, and given that it is already in production.
  2. The code to be tested depends on a rather costly setup of other objects.
  3. The code to be tested depends on calls to other layers/applications, which are not (or should not be) available in the unit test scenario.
The traditional method of attacking these problems is to mock out everything except the "System Under Test" (todo: link to xUnit here). However, in scenarios 2 and 3 above, the mocking approach might lead to an overly complex mock layer or a mock-calling-mock situation.

Jrunr aims to avoid this and to provide an easier (if hackish) way of increasing legacy-app test creation efficiency. It is not intended to replace test-driven development or mocking; rather, it complements them in situations where the cost of elegance is too high :)

Jrunr itself, however, aims to be an elegant tool, and therefore does all its work without any change to application code.

How it works

Jrunr is implemented as two components:

  • Recording is implemented as an aspect (todo: link here) that intercepts all configured method calls in the SUT. The configuration specifies whether to store the arguments passed to the method, its return value, or both. These values are then stored in a configured location as JSON files keyed on the method's fully qualified name (see the illustrative sketch after this list).
  • Replay/Reuse is implemented as a class that reads the JSON files and returns any stored value, which can be used directly in a test case.
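
The exact on-disk shape depends on the JSON serializer (the design below uses XStream with Jettison), so the following is only an illustrative sketch of what one recorded call might look like:

	{"methodExecutionRecord": {
		"methodName": "com.acompany.sut.CUT.methodToTest",
		"args": ["a recorded input value"],
		"retVal": "the recorded return value"
	}}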

User Guide

Requirements

Installation

Installation in a Web Container

  • Add jrunr.jar to the classpath
  • Add dependent jars (if required)
  • Add a reference to the required Spring config
    • if the app doesn't use Spring, how:
    • if the app does use Spring, how:
  • Modify the config to set up the methods to be recorded

Installation in a standalone app

Installation in an EJB Container - TBD

Using jrunr

Deciding what to record and how

<Usage scenarios go here>

Configuring the methods to record data from

Recording data

Using the data in your unit tests

Dev guide/notes

Requirements/Feature List

  1. Record data for use in unit tests
    1. Provide a means to record data for a method under test; i.e., the input and output parameters should be recorded.
    2. Provide a means to store the recorded data, identified by the method under test.
    3. Allow the user to choose, via declarative configuration, whether the input, the output, or both should be recorded.
    4. Record timestamp information for reference and/or comparison of data.
  2. Use recorded data in unit tests
    1. Provide an easy-to-use interface/class to access the stored data.
    2. The interface should allow use of any/all stored data.
    3. The interface should have minimal impact on the setup of the test case/suite.
  3. Manage recorded data
    1. Provide the ability to edit/modify data to suit a particular test case.
    2. Provide the ability to aggregate data from multiple recordings related to a single method.
    3. Provide the ability to consolidate data from multiple recording sessions and methods so as to create a database of such test data.
    4. Provide a GUI for all requirements under this section.
  4. Use recorded data to generate more data
    1. This requirement is to be fleshed out. The general idea, however, is to use the recorded data as a template to generate more data, either for functional tests or for load testing.
  5. Documentation
    1. Provide documentation for end users.
    2. Demonstrate use of the tool in sample applications and include the source as part of the distribution:
      1. For standalone apps
      2. For web applications
      3. For enterprise containers
  6. Non-functional
    1. The tool should not be invasive:
      1. It shouldn't require code changes to record data.
      2. It shouldn't require major changes to the build system.
    2. The tool shouldn't majorly impact the performance of the system. It is understood that the record phase will be a non-production run of the application, for the specific purpose of recording data.
    3. Data storage must be in a format amenable to both human and tool manipulation; i.e., it must enable both humans and tools to easily read and modify the data.
    4. The tool should support JDK versions 1.3 onwards.
    5. The tool could provide a JDK 1.5-specific version.
    6. The configuration of the tool must follow the DRY principle.

Project Roadmap

Version        | Features
1.0 (Basic)    | Requirements 1.1-1.3, 2.1-2.3, 3.1, by satisfying 6.3; implicitly also 5.1, 5.2.1-5.2.3, 6.1, 6.2, 6.4
1.1 (DRY)      | Requirement 6.6
2.0 (JDK 1.5)  | Requirements 1.4, 5.2.3, 6.5
3.0 (GUI)      | Requirements 3.1-3.4; verify 6.2
4.0 (DataGen)  | Requirement 4

Status

  1. 10/6/2007: All requirements except #4 fleshed out.

Analysis

The implementation will need solutions to the following sub-problems:

  • how to record without changing source too much or at all
  • how to record enough information of the original context for use in the test
  • how to view and modify the stored information to suit the same test context or a different one, or to create bulk test data from the single recorded data
  • how to use the stored information in a test case

Each is detailed below:

  • how to record without changing source too much or at all
    • some form of aop
      • annotations: might not work as we need the object values passed from the running context to the annotation processor (AP)
      • any around advice aspect would work, but to make it generic, we'd need a declarative way of defining the variable(s) that the aspect would be tracing.
  • how to record enough information of the original context for use in the test
    • i.e., how to record which method, line of code etc. for tracing and reporting purposes
      • could probably use annotations here, but can this be passed to the AP?
  • how to view and modify the stored information to suit the same test context or a different one, or to create bulk test data from the single recorded data
    • sub-problems:
      • how to store for easy readability and configurability
        • store as serialized objects; write an app to read them back and provide a config UI
        • store as XML so standard XML editors etc. can be used
        • store as XML that is also Spring config
        • store as JSON, YAML or a similar readable format
      • how to view and modify
        • use existing test generator frameworks/UIs
        • which? are there any? research this
        • create a custom webapp
  • how to use the stored information in a test case
    • this is pretty straightforward - we need a way to refer uniquely to the test data needed in a JUnit test case. After that it's standard JUnit testing.

Issues

  1. The current statement of requirements and its implementation are premised on the assumption that it is sufficient to store the input and output data of method calls for the purpose outlined in the Why section. Need to see if there are real-world scenarios different from this.
  2. Is the DataGen requirement valid?

Status

10/6/2007

  • Analysis complete for Version 1.
  • todo: Resolve issues and assign releases to potential code impact.

Architecture/Design

Since the tool is intended for minimally invasive use, the Spring IoC container is chosen as the base framework. Spring also brings the advantage of a large installed base of applications and declarative configuration. The tool itself clearly should comprise two components:

  • A Recorder implemented as an around aspect. It shall intercept all calls from which the user wants to record input and/or output data, and record that data.
  • A Reuse API that allows access to the stored data from within a JUnit test case.

In the future, other Spring features such as MVC and WebFlow can be used to create an admin UI.

JSON is chosen as the storage data format since it is lightweight, easy to read, and support for reading and writing it is readily available.

Design

Package Design

All jrunr code will be created under the org.vinodkd.jrunr package. Subpackages aspect and data will contain the Recorder Aspect and the Data Access code respectively.

A dummy application is currently bundled in the source under the com.acompany.sut package with a single class called CUT (for Class under Test).

Test code for Jrunr will be created in the same package structure and namespace, but in a different source file tree.

The Recorder Aspect

The Recorder Aspect will be implemented in the org.vinodkd.jrunr.aspect.Recorder class. It will be a normal Spring bean with properties injected via Spring config. It will support configuring the methods to record and an output directory for the generated JSON files. Class prototype:

	public class Recorder
	{
		// methods to intercept, and what to record for each (ARGS | RETURN | BOTH)
		public void setRecordMap(Properties whatToRecord);
		// directory into which the generated json files are written
		public void setOutputDir(Resource outputDir);
		// around advice: records the configured values, proceeds with the
		// intercepted call and returns its return value to the caller
		public Object recordValues(ProceedingJoinPoint jp) throws Throwable;
	}

Setting the recordMap attribute should supply Recorder with the list of methods to intercept and record. Setting the outputDir attribute should set up the directory that recordValues() will use to output JSON files.

recordValues() will internally use the DataAccess API to write a Java Object as a json file.
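
As an illustration only (assuming the DataAccess classes described in the next section, that recordMap and outputDir are held as fields, and omitting the ARGS/RETURN/BOTH handling for brevity), recordValues() might look like:

	public Object recordValues(ProceedingJoinPoint jp) throws Throwable
	{
		// run the intercepted method first so its return value can be recorded
		Object retVal = jp.proceed();

		MethodExecutionRecord mer = new MethodExecutionRecord();
		mer.setMethodName(jp.getSignature().getDeclaringTypeName()
				+ "." + jp.getSignature().getName());
		mer.setArgs(jp.getArgs());
		mer.setRetVal(retVal);

		// one json file per method, keyed on the fully qualified method name
		File out = new File(outputDir.getFile(), mer.getMethodName() + ".json");
		JrunrStore.store(out, mer);

		// hand the real return value back to the caller
		return retVal;
	}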

The DataAccess API

The DataAccess API consists of two classes: JrunrStore and MethodExecutionRecord.

MethodExecutionRecord is a value object and will contain the method name, and its input and output values. Like so:

	public class MethodExecutionRecord
	{
		private String methodName; // the fully qualified method name
		private Object args[];	// the arguments
		private Object retVal;	// the return value
		
		// setters and getters
	}

JrunrStore will provide convenient load and store operations using MethodExecutionRecord objects, like so:

	public class JrunrStore
	{
		// serialize a record to the given json file
		public static void store(File f, MethodExecutionRecord mer);
		// read a record back from the given json file
		public static MethodExecutionRecord load(File f);
	}
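
For example, a JUnit test might use the API like this (a hedged sketch: the json file name follows the "keyed on the fully qualified method name" convention above, and the signature of the dummy app's methodToTest is an illustrative assumption):

	import java.io.File;
	import junit.framework.TestCase;
	import org.vinodkd.jrunr.data.JrunrStore;
	import org.vinodkd.jrunr.data.MethodExecutionRecord;
	import com.acompany.sut.CUT;

	public class CUTTest extends TestCase
	{
		public void testMethodToTest() throws Exception
		{
			// load a previously recorded execution of the method under test
			MethodExecutionRecord mer = JrunrStore.load(
					new File("out/com.acompany.sut.CUT.methodToTest.json"));

			// replay the recorded arguments against the class under test
			// (assumes methodToTest(String) purely for illustration)
			CUT cut = new CUT();
			Object actual = cut.methodToTest((String) mer.getArgs()[0]);

			// the recorded return value becomes the expected value
			assertEquals(mer.getRetVal(), actual);
		}
	}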

Configuration

User Configuration

Users of jrunr will configure it to define the methods they're interested in recording like so:

<fully qualified class name>.<method name> = ARGS | RETURN | BOTH

Setting a method to ARGS will record only its input args, RETURN will record only its return values, and BOTH will record both.

This configuration should be in a file called jrunr.properties.
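
For example, to record both the arguments and the return value of the dummy app's method (an illustrative entry; any configured method works the same way):

	com.acompany.sut.CUT.methodToTest = BOTH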

Internal Configuration

Internally, the tool will use Spring configuration. The Recorder aspect and DataAccess will be configured like so:

	   <bean id="recorder" class="org.vinodkd.jrunr.aspect.Recorder">
	   	<property name="recordMap">
		    <util:properties location="classpath:jrunr.properties"/>
	   	</property>
	   	<property name="dataStore" ref="dataStore"/>
	   </bean>
	
	   <aop:config>
	      <aop:aspect ref="recorder">
	         <aop:pointcut id="theExecutionOfAMethodUnderTest"
	                    expression=
	                    "	execution(* com.acompany.sut.*.methodToTest(..)) 
	                    or	execution(* com.acomp.sut.*.testMethod(..))

	                    "
	          />
	         <aop:around pointcut-ref="theExecutionOfAMethodUnderTest"
	                  method="recordValues"/>
	      </aop:aspect>
	   </aop:config>
	   
	   <bean id="dataStore" class="org.vinodkd.jrunr.data.JrunrStore">
	   	<property name="location" value="file:E:/vinod/projects/JRUNR/out"/>
	   </bean>

Tools/libraries used

  • Spring 2.0, specifically Spring AOP
  • AspectJ libraries used internally by Spring
  • XStream with Jettison (see the sketch below)
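
As a sketch of how JrunrStore's store and load operations might combine these libraries (illustrative only; the error handling and exact signatures are assumptions, not the project's actual code):

	import java.io.File;
	import java.io.FileReader;
	import java.io.FileWriter;

	import com.thoughtworks.xstream.XStream;
	import com.thoughtworks.xstream.io.json.JettisonMappedXmlDriver;

	public class JrunrStore
	{
		// the jettison driver makes xstream read and write json instead of xml
		private static final XStream XSTREAM = new XStream(new JettisonMappedXmlDriver());

		public static void store(File f, MethodExecutionRecord mer) throws Exception
		{
			FileWriter w = new FileWriter(f);
			try { XSTREAM.toXML(mer, w); } finally { w.close(); }
		}

		public static MethodExecutionRecord load(File f) throws Exception
		{
			FileReader r = new FileReader(f);
			try { return (MethodExecutionRecord) XSTREAM.fromXML(r); } finally { r.close(); }
		}
	}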

Issues/todos

  1. The schema-based Spring AOP doesn't allow configuring both the recording advice and the Recorder aspect from the same list of method names. The current implementation therefore requires the user to configure this list in two places, which is against the DRY (todo: link here) principle. Reporting this to the Spring forum brought the suggestion of writing a custom pointcut implementation. Consider this, or consider using the regex-based Spring Advisor, which should do the same thing.

Status

10/6/2007 todo: finish design doc

Construction

Issues/todos

  1. Externalize user configuration to a props file.
  2. Create DataAccess API objects.
  3. Correct Boot and Recorder to use DataAccess.
  4. Try out the RegexAdvisor.

Status

10/8/2007: Working code exists for the Recorder and DataAccess, but the latter is still not refactored into separate code. Need to try it on real code.

Project Management

Issues/Todos

  1. Setup SCM - done
  2. Add use of savant for dependent libraries
  3. Create sf project

Status

FAQ

  1. Why not JDK 1.5?
  2. Why not AspectJ?

Colophon

This document uses the Markdown (todo: link here) syntax.
