
Introduction

This document describes the Batik test infrastructure whose goals are to:

  • Make it easy to detect regressions
  • Make it easy to run test suites
  • Make it easy to write new tests and add them to an existing test suite

The intent is for the test infrastructure to grow along with Batik and to keep monitoring the health of the code base.

While the test suites in the infrastructure will be run every day by build/test machines, they are also intended to help committers and developers gain confidence that their code modifications did not introduce regressions.

This document describes:

  • The test infrastructure
  • Managing test suites
  • regard, the Batik regression test suite
  • Writing new tests

The Test infrastructure
High-Level Interfaces

The following are the high-level interfaces in the infrastructure (a simplified sketch follows the list):

  • A Test performs whatever check is needed in its run method; each run produces a TestReport
  • A TestReport describes whether a Test run passed or failed and describes a failure in terms of an error code (unique in the context of a given Test) and a set of key/value pairs
  • A TestSuite is a Test aggregation which can run a set of Test instances
  • A TestReportProcessor is used to analyze a TestReport. A specific implementation can choose to create graphs, send an email or write an HTML file
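
Here is a minimal sketch of how these interfaces relate, with simplified signatures inferred from the descriptions above. It is an approximation, not the exact org.apache.batik.test API; see the Javadoc for the real signatures.

// Simplified sketch only: approximate shapes, not the real Batik API.
interface Test {
    TestReport run();                    // performs the check, produces a report
}

interface TestReport {
    boolean hasPassed();                 // whether the Test run passed
    String getErrorCode();               // unique in the context of a given Test
    // ... plus key/value pairs describing a failure
}

interface TestSuite extends Test {       // a TestSuite is itself a Test
    void addTest(Test test);             // aggregates Test instances
}

interface TestReportProcessor {
    void processReport(TestReport report) throws Exception;
}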

Default Implementations

The test infrastructure comes with a number of default implementations of the interfaces described above (a hypothetical wiring example follows the list). Specifically:

  • AbstractTest. This implementation of the Test interface is intended to make it easier to write a 'safe' Test implementation. See "Writing New Tests" for a description of how to use that class.
  • DefaultTestReport provides a simple implementation of the TestReport interface that most Test implementations will be able to use. See "Writing New Tests" for more details.
  • DefaultTestSuite provides an implementation of the TestSuite interface and makes it easy to aggregate Test instances.
  • SimpleTestReportProcessor is a sample TestReportProcessor implementation that simply traces the content of a TestReport to an output stream.
  • TestReportMailer is another implementation of the TestReportProcessor interface that emails a test report to a list of destination addresses.
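
As an illustration, the following hypothetical snippet wires the default implementations together in Java. The method names used (addTest, runImpl, setPassed, processReport) and the DefaultTestReport constructor are assumptions inferred from the descriptions above; check the API Javadoc for the exact signatures.

import org.apache.batik.test.AbstractTest;
import org.apache.batik.test.DefaultTestReport;
import org.apache.batik.test.DefaultTestSuite;
import org.apache.batik.test.SimpleTestReportProcessor;
import org.apache.batik.test.TestReport;

// A trivial Test derived from AbstractTest; it always passes.
class AlwaysPassesTest extends AbstractTest {
    public TestReport runImpl() throws Exception {
        DefaultTestReport report = new DefaultTestReport(this); // assumed constructor
        report.setPassed(true);
        return report;
    }
}

public class SuiteWiringSketch {
    public static void main(String[] args) throws Exception {
        DefaultTestSuite suite = new DefaultTestSuite();
        suite.addTest(new AlwaysPassesTest());          // aggregate Test instances
        TestReport report = suite.run();                // a TestSuite is itself a Test
        new SimpleTestReportProcessor().processReport(report); // trace to an output stream
    }
}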

XML Implementations

The test infrastructure uses XML as its preferred output format for test reports (and as an input format too; see "Running a Test Suite"). The XMLTestReportProcessor implementation of the TestReportProcessor interface outputs reports in XML in a configurable directory.

The XMLTestReportProcessor can notify an XMLReportConsumer when it has created a new report. One implementation of that interface, XSLXMLReportConsumer, is provided by default; it can run an XSL stylesheet on the XML report (e.g., to generate an HTML report). This is used by the 'regard' rule in the Batik build to produce an HTML report for the default regression test suite.



Managing Test Suites

The infrastructure tries to make it easy to create, update and modify test suites. This section describes how to describe a set of tests to be run and how to actually run that test suite.

Describing a Test Suite

Test suites can be described in XML (the XML input format referred to earlier in this document). The general format for describing a test suite is:

<testSuite id="testSuiteA" name="MyFavoriteTestSuite">
    <!-- =================================== -->
    <!-- Set of tests to be run              -->
    <!-- =================================== -->
    <test id="t1" class="myFavoriteTestClassA" />
    <test id="t2" class="myFavoriteTestClassB" />
    <test id="t3" class="myFavoriteTestClassC" />
</testSuite>
            

Note that tests can be grouped in <testGroup> elements, which can have their own id and class attributes. This is useful because it allows developers to run specific tests or test groups by specifying their ids. In addition, because the class attribute is inherited by <test> elements from their parents, <testGroup> elements allow developers to group tests which use the same class and specify that class once on the group.

The format simply lists the Test instances that compose a given test suite. For example:

<testSuite id="sampleTestSuite" name="SAMPLE TEST SUITE">
    <!-- ========================================================================== -->
    <!-- Validates that the SVGRenderingAccuracyTest class is operating as expected -->
    <!-- ========================================================================== -->
    <test id="renderingAccuracyTest" class="org.apache.batik.test.svg.SVGRenderingAccuracyTestValidator" />

    <!-- ========================================================================== -->
    <!-- Rendering regression tests                                                 -->
    <!-- ========================================================================== -->
    <test id="anne.svg" class="org.apache.batik.test.svg.SVGRenderingAccuracyTest">
        <arg class="java.net.URL" 
                value="file:samples/anne.svg" />
        <arg class="java.net.URL" 
                value="file:test-references/samples/solaris/anne.png" />
        <property name="VariationURL" 
                     class="java.net.URL" 
                     value="file:test-references/samples/variation/anne.png" />
        <property name="SaveVariation" 
                     class="java.io.File" 
                     value="test-references/samples/variation-candidate/anne.png" />
    </test>

</testSuite>
            

Running a Test Suite

Yet another XML file describes which tests to run and how to process the generated test reports. The general syntax is:

<testRun id="regard" name="Test Run Name Here">
    <!-- =================================== -->
    <!-- Descriptions of processors that     -->
    <!-- will process the results of the     -->
    <!-- test suite                          -->
    <!-- =================================== -->
    <testReportProcessor class="myFavoriteReportProcessorA" />
    <testReportProcessor class="myFavoriteReportProcessorB" />

    <!-- =================================== -->
    <!-- Set of test suites to run. They    -->
    <!-- will produce TestReports.          -->
    <!-- =================================== -->
    <testSuite href="http://url.to.my.first.test.suite"/>
    <testSuite href="http://url.to.my.second.test.suite" />

</testRun>

<testRun> elements can be nested. In a nutshell, you specify a set of TestReportProcessors which should process the TestReports generated by the TestSuites built from the lists of Test instances described in the referenced <testSuite> files. For example:

<testRun name="Batik Standard Regression Test Run">
    <testRun name="REGARD">
        <testReportProcessor class="org.apache.batik.test.xml.XMLTestReportProcessor" > 
            <arg class="org.apache.batik.test.xml.XSLXMLReportConsumer">
                <!-- Stylesheet -->
                <arg class="java.lang.String" value="file:test-resources/org/apache/batik/test/svg/HTMLReport.xsl" />
                <!-- Output Directory -->
                <arg class="java.lang.String" value="test-reports/html" />
                <!-- Output file prefix -->
                <arg class="java.lang.String" value="RegardResult" />
                <!-- Output file suffix -->
                <arg class="java.lang.String" value=".html" />
            </arg>
        </testReportProcessor>

        <testSuite href="file:test-resources/org/apache/batik/test/samplesRendering.xml" /> 
        <testSuite href="file:test-resources/org/apache/batik/svggen/regsvggen.xml" />
        <testSuite href="file:test-resources/org/apache/batik/test/unitTesting.xml" /> 
    </testRun>

</testRun>
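
The <arg> elements above are mapped to constructor arguments when the test run is instantiated. For illustration, here is a hedged sketch of the equivalent construction in Java, assuming the constructors take exactly the parameters listed above:

import org.apache.batik.test.xml.XMLTestReportProcessor;
import org.apache.batik.test.xml.XSLXMLReportConsumer;

// Hedged sketch: builds the same report-processing chain as the <arg>
// elements above, assuming each <arg> maps to a constructor parameter.
public class ReportChainSketch {
    public static void main(String[] args) throws Exception {
        XSLXMLReportConsumer consumer = new XSLXMLReportConsumer(
            "file:test-resources/org/apache/batik/test/svg/HTMLReport.xsl", // stylesheet
            "test-reports/html",    // output directory
            "RegardResult",         // output file prefix
            ".html");               // output file suffix
        XMLTestReportProcessor processor = new XMLTestReportProcessor(consumer);
        // processor would then be handed the TestReports produced by the run
    }
}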

There is now a rule in our build.xml to run a test suite defined in an XML file such as the one above. At the command line, type the following:

build runtestsuite path/to/my/newly/created/testSuite.xml

In addition, the regard rule runs a specific set of tests by default, so you do not need to pass any testRun file argument.

regard is the project's safeguard against regressions.



regard: the Batik regression test suite

The regard test suite contains all the regression tests for the Batik project. The regard tool is driven by a specific test suite description, regard.xml (which you can find in the test-resources/org/apache/batik/test directory). That file references the set of test suite files which should be run.

The following describes how to use the regard tool and some of the most important tests in the regard test suite.

Running regard

The regard tool lets you run either all the tests or any specific test you want in the test suite. To run all the tests in the regard test suite, type the following at the command line:

build.sh regard

To run a specific test in the test suite, type the qualified test id or any sub-portion of that id:

build.sh regard <id list>

For example:

build.sh regard unitTesting.ts batikFX.svg

will run all the tests whose id contains unitTesting.ts (i.e., the unit tests described in test-resources/org/apache/batik/gvt/unitTesting.xml) and the rendering accuracy test on batikFX.svg (because it is the only test with batikFX.svg in its id).


Rendering Accuracy Tests

There is a Test implementation, SVGRenderingAccuracyTest, which checks that Batik's rendering of SVG documents stays accurate. It compares the rendering Batik produces with reference images and reports any discrepancy.


The SVGRenderingAccuracyTest configuration

An SVGRenderingAccuracyTest's constructor configuration consists of:

  • The URL to the SVG it should render
  • The URL to a reference PNG file

The default behavior for the test is to render the SVG into a PNG file and compare it with the reference image. If there is no difference, the test passes. Otherwise, it fails.

In addition to this default behavior, the SVGRenderingAccuracyTest can take an optional configuration parameter, an image URL defined as an 'accepted' variation around the reference image. If such a variation image is specified, then the test will pass if:

  • The rasterized SVG is equal to the reference image
  • Or, the difference between the rasterized SVG and the reference image is exactly the same as the accepted variation image

Finally, to ease the process of creating 'accepted' variation images, SVGRenderingAccuracyTest can take an optional file name (called 'saveVariation') describing where the variation between the rasterized SVG and the reference image should be stored when the rasterized SVG differs from the reference image and the difference is not equal to the accepted variation image, if any was defined. That way, it becomes possible to run a test and, if it fails, review the saveVariation image, decide whether it is an acceptable variation, and use it in subsequent test runs as the 'accepted' variation image, which will allow the test to pass as long as that exact same variation remains constant. The overall decision logic is sketched below.
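
Here is a hedged sketch of that decision logic in Java. It is not Batik's actual implementation, and the helper methods (renderToPNG, readBytes, imageDiff, saveBytes) are hypothetical:

import java.util.Arrays;

// Hedged sketch of the pass/fail logic described above; all helpers
// are hypothetical placeholders, not Batik API.
abstract class RenderingAccuracyLogic {
    abstract byte[] renderToPNG(String svgURL) throws Exception;  // rasterize the SVG
    abstract byte[] readBytes(String url) throws Exception;       // load reference/variation
    abstract byte[] imageDiff(byte[] a, byte[] b);                // pixel-wise difference
    abstract void saveBytes(String file, byte[] data) throws Exception;

    boolean runCheck(String svgURL, String refURL,
                     String variationURL, String saveVariationFile) throws Exception {
        byte[] rendered  = renderToPNG(svgURL);
        byte[] reference = readBytes(refURL);
        if (Arrays.equals(rendered, reference)) {
            return true;                         // exact match with the reference: pass
        }
        byte[] diff = imageDiff(rendered, reference);
        if (variationURL != null
                && Arrays.equals(diff, readBytes(variationURL))) {
            return true;                         // difference equals the accepted variation: pass
        }
        if (saveVariationFile != null) {
            saveBytes(saveVariationFile, diff);  // store candidate variation for review
        }
        return false;                            // unexplained difference: fail
    }
}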


Day to day use of regard

Initial set up

To set up the test environment the first time, you need to:

  • Check out the latest version of the code, including the test-xx directories (sources, resources and references) and the build.xml file
  • Run the regard test suite once: build regard

This will generate an HTML test report (report.html) in the test-reports/yyyy.mm.dd-HHhMMmSSs/html/html directory. Depending on how different text rendering is between your work environment and the environment used to create the reference images, more or fewer tests will fail, because of differences in the way text is rendered on various platforms and because some fonts are not available on all platforms. For example, running the tests on a Windows 2000 laptop against images generated on the Solaris platform caused 16 tests out of 71 to fail.

Review the HTML report to make sure that the differences are really due to text variations. This will usually be the case, and you can verify it by clicking on the diff images contained in the report to see them at full scale. You can then turn the 'candidate' variations generated by the test into 'accepted' variations by moving the files from one directory to the other:

mv test-references/samples/candidate-variations/*.png test-references/samples/accepted-variations/

mv test-references/samples/tests/candidate-variations/*.png test-references/samples/tests/accepted-variations/

You can now run the test again:

build regard

Check the newly generated HTML report in the test-reports/html directory: there should no longer be any test failures.

Daily usage

Once the initial set-up has been done, you can use regard by simply updating your SVN copy, including the test-references. If no change occurs, your tests will keep passing with your reference images. If a test fails (e.g., because someone checks in a new reference image from a platform different from the one you are using), you will have to check whether it is due to system-specific reasons or whether there is a bigger problem.


SVG Generator tests

regard contains over 100 tests for checking regressions in the SVG Generator. If you pass 'svggen' as an argument to regard (build.sh regard svggen), all the SVG Generator tests will be run (because regard.xml points to test-resources/org/apache/batik/svggen/regsvggen.xml, which is a test suite description for the SVG Generator whose root <testSuite> element has the 'svggen' id).



Writing new Tests

Writing a new Test involves either configuring a new test or writing a new Test class. In both cases, you will need to add an entry to a test suite's XML description. This section uses two test suites as an example: the 'regard' test suite to show how to configure a new test and the 'unitTests' test suite to show how to add a new Test implementation.

Adding a new test configuration

Imagine that you add a cool new test case to the samples directory, such as linkingViewBox.svg. In order to check for regressions on that file, you can add the following entry:

        <arg class="java.net.URL" 
                value="file:samples/tests/linkingViewBox.svg" />
        <arg class="java.net.URL" 
                value="file:test-references/samples/tests/solaris/linkingViewBox.png" />
        <property name="VariationURL" 
                     class="java.net.URL" 
                     value="file:test-references/samples/tests/variation/linkingViewBox.png" />
        <property name="SaveVariation" 
                     class="java.io.File" 
                     value="test-references/samples/tests/variation-candidate/linkingViewBox.png" />
    </test>

to the test-resources/org/apache/batik/test/samplesRendering.xml test suite description, which is part of the regard test suite. If you have access to the build machine where the reference images are typically generated, you can check in the reference image in test-references/samples/tests. Otherwise (and this is OK), you can let the test fail the first time it is run on the build/test machine; that will be a reminder for whoever is responsible for that machine that a valid reference image should be checked in.


Writing a new test

Imagine you want to validate some aspect of your code; let's take bridge error handling as an example. You could create a new class in the test-sources area, in the org/apache/batik/bridge directory in our example; let's call it ErrorHandlingTest. To simplify the implementation of the Test interface, you can choose to derive from the AbstractTest class and generate a DefaultTestReport, as sketched below.
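
Here is a hedged sketch of what such a class could look like. The runImpl hook and the DefaultTestReport methods used below are assumptions about the org.apache.batik.test API; consult the Javadoc for the exact signatures. The constructor mirrors the <arg> list in the XML fragment below.

import java.net.URL;

import org.apache.batik.test.AbstractTest;
import org.apache.batik.test.DefaultTestReport;
import org.apache.batik.test.TestReport;

// Hedged sketch only; not actual Batik code.
public class ErrorHandlingTest extends AbstractTest {
    private final String expectedErrorCode;
    private final URL svgURL;
    private final String targetId;
    private final String attribute;
    private final String value;

    // Mirrors the <arg> list in the XML description below.
    public ErrorHandlingTest(String expectedErrorCode, URL svgURL,
                             String targetId, String attribute, String value) {
        this.expectedErrorCode = expectedErrorCode;
        this.svgURL = svgURL;
        this.targetId = targetId;
        this.attribute = attribute;
        this.value = value;
    }

    public TestReport runImpl() throws Exception {
        DefaultTestReport report = new DefaultTestReport(this); // assumed constructor

        // Load svgURL, set 'attribute' to 'value' on the element with id
        // 'targetId', build the tree and capture the error the bridge reports.
        String actualErrorCode = /* exercise the bridge here */ null;

        if (expectedErrorCode.equals(actualErrorCode)) {
            report.setPassed(true);
        } else {
            report.setErrorCode(expectedErrorCode); // error codes are Test-specific
            report.setPassed(false);
        }
        return report;
    }
}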

While writing the Test, you may want to use your own XML file containing just your test, for example:

    <testReportProcessor class="org.apache.batik.test.SimpleTestReportProcessor" /> 

    <test class="org.apache.batik.bridge.ErrorHandlingTest">
        <!-- Expected error code -->
        <arg class="java.lang.String" value="expected.error.code" />
        <!-- Input SVG that this test manipulates to generate error conditions -->
        <arg class="java.net.URL" value="file:test-resources/org/apache/batik/bridge/ErrorHandlingBase.svg" />
        <!-- Id of the element to test -->
        <arg class="java.lang.String value="rectangle6" />
        <!-- Attribute to test -->
        <arg class="java.lang.String value="x" />
        <!-- Value to test on the attribute -->
        <arg class="java.lang.String value="abcd" />
    </test>

This is just an example and does not pretend to be the right way to implement or specify this specific type of test. Once done tuning the test, one or more configurations for the test can be added to the relevant test suite's XML description. In some cases, it may be worthwhile to create a separate test suite.




Copyright © 2000-2005 The Apache Software Foundation. All Rights Reserved.