Apache Accumulo Functional Tests

These scripts run a series of tests against a small local accumulo instance.
To run these scripts, you must have hadoop and zookeeper installed and
running. You will also need a functioning C compiler to build a shared
library needed for one of the tests.

The test suite is known to run on Red Hat Enterprise Linux 5 and Mac OS X
10.5.

The tests are shown as being run from the ACCUMULO_HOME directory, but they
should run from any directory. Make sure to create "logs" and "walogs"
directories in ACCUMULO_HOME. Also, ensure that accumulo-env.sh sets
ACCUMULO_LOG_DIR in the following way:

  test -z "$ACCUMULO_LOG_DIR" && export ACCUMULO_LOG_DIR=$ACCUMULO_HOME/logs

Running

  $ ./test/system/auto/run.py -l

will list all the test names. You can run the whole suite like this:

  $ ./test/system/auto/run.py

You can select tests using a case-insensitive regular expression:

  $ ./test/system/auto/run.py -t simple
  $ ./test/system/auto/run.py -t SunnyDay

If you are attempting to debug what is causing a test to fail, you can run
the tests in "verbose" mode:

  $ python test/system/auto/run.py -t SunnyDay -v 10

If a test is failing and you would like to examine the logs from the run,
you can run the test in "dirty" mode, which keeps the test from cleaning up
all the logs at the end of the run:

  $ ./test/system/auto/run.py -t some.failing.test -d

If the test suite hangs and you would like to re-run the tests starting with
the last test that failed:

  $ ./test/system/auto/run.py -s start.over.test

The full test suite can take nearly an hour. If you have a larger hadoop
cluster at your disposal, you can run the tests as a map-reduce job:

  $ python test/system/auto/run.py -l > tests
  $ hadoop fs -put tests /user/hadoop/tests
  $ ./bin/accumulo org.apache.accumulo.server.test.functional.RunTests /user/hadoop/tests /user/hadoop/results
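
Once the map-reduce job finishes, its output should appear under the second
path given above (/user/hadoop/results). A minimal sketch for inspecting it
from HDFS follows; the part-* file names are an assumption based on standard
map-reduce output naming and may differ:

  $ hadoop fs -ls /user/hadoop/results
  $ hadoop fs -cat /user/hadoop/results/part-*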