
When I run JUnit tests, failures produce very large stack traces that don't really provide much information. I would like to see just the failing assert statement, so that each failure takes up at most 2 or 3 lines instead of 20.

I am not using Ant to call JUnit; instead I am running the tests from a command line in very large batches. The extra output just makes the results hard to parse through.

Edit: This testing is for student assignments in a programming course, so there will be errors, and there are quite a few programs to be tested.

+3  A: 

Given that this is a relatively rare situation (running JUnit from the command line in large batches, rather than using either an IDE or a tool like Ant), I wouldn't be surprised if there turned out to be no way of working around it within JUnit itself.

Why not write the result to a file, and then run that through a small parser which recognises the start of a JUnit failure, prints the next few lines, and then skips to the start of the next one?
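
A minimal sketch of such a filter, assuming the output of JUnit's plain text runner, where each failure block starts with a line like "1) testName(ClassName)"; the class name FailureFilter and the number of lines kept per failure are arbitrary choices:

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    // Hypothetical filter: reads captured JUnit console output and keeps
    // only the first few lines of each failure block.
    public class FailureFilter {
        // failure header + assertion message + first "at ..." location line
        private static final int LINES_PER_FAILURE = 3;

        public static void main(String[] args) throws IOException {
            try (BufferedReader in = new BufferedReader(new FileReader(args[0]))) {
                String line;
                int remaining = 0;
                while ((line = in.readLine()) != null) {
                    // The text runner starts each failure with "N) test(Class)"
                    if (line.matches("\\d+\\) .*")) {
                        remaining = LINES_PER_FAILURE;
                    }
                    if (remaining > 0) {
                        System.out.println(line);
                        remaining--;
                    }
                }
            }
        }
    }

Redirect the batch run into a file and then, for example: java FailureFilter results.txt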

Jon Skeet
+2  A: 

I'd say your JUnit tests will be nice and quiet once you fix them all. Leave the "noisy" stack traces as an incentive to eliminate them and get back to the green bar.

"...I am not using ANT to call JUnit..." - why not? What is running them in batch buying you? If you use Ant, you'll not only get the tests but the HTML report task as well. This will "quiet" the output, presenting it nicely without throwing away the stack trace detail.

Sounds like you're using submitted assignments as 3rd party JARs in a JUnit test harness that you've written. Nice approach. Each result should be up or down, red or green. You're the QA department now - the stack traces are the students' responsibility.

You could give them the test classes and tell them to make their stuff pass while running it. Put the responsibility on them, where it belongs.

Students should learn the value of unit testing early. I'd provide them with JUnit and Ant and make them provide running JUnit tests as a prerequisite for a passing assignment.

Here's a sample Ant build.xml that you're free to modify as you see fit. Take special note of the test target:

    <?xml version="1.0" encoding="UTF-8"?>
    <project name="spring-finance" basedir="." default="package">

        <property name="version" value="1.6"/>
        <property name="haltonfailure" value="no"/>

        <property name="out" value="out"/>

        <property name="production.src" value="src/main/java"/>
        <property name="production.lib" value="src/main/webapp/WEB-INF/lib"/>
        <property name="production.resources" value="src/main/resources"/>
        <property name="production.classes" value="${out}/production/${ant.project.name}"/>

        <property name="test.src" value="src/test/java"/>
        <property name="test.lib" value="src/test/lib"/>
        <property name="test.resources" value="src/test/resources"/>
        <property name="test.classes" value="${out}/test/${ant.project.name}"/>

        <property name="exploded" value="out/exploded/${ant.project.name}"/>
        <property name="exploded.classes" value="${exploded}/WEB-INF/classes"/>
        <property name="exploded.lib" value="${exploded}/WEB-INF/lib"/>

        <property name="reports.out" value="${out}/reports"/>
        <property name="junit.out" value="${reports.out}/junit"/>

        <property name="web.src" value="src/main/webapp"/>
        <property name="web.lib" value="${web.src}/WEB-INF/lib"/>
        <property name="web.classes" value="${web.src}/WEB-INF/classes"/>

        <path id="production.class.path">
            <pathelement location="${production.classes}"/>
            <pathelement location="${production.resources}"/>
            <fileset dir="${production.lib}">
                <include name="**/*.jar"/>
                <exclude name="**/junit*.jar"/>
                <exclude name="**/*test*.jar"/>
            </fileset>
        </path>

        <path id="test.class.path">
            <path refid="production.class.path"/>
            <pathelement location="${test.classes}"/>
            <pathelement location="${test.resources}"/>
            <fileset dir="${test.lib}">
                <include name="**/junit*.jar"/>
                <include name="**/*test*.jar"/>
            </fileset>
        </path>

        <available file="${out}" property="outputExists"/>

        <target name="clean" description="remove all generated artifacts" if="outputExists">
            <delete dir="${out}" includeEmptyDirs="true"/>
        </target>

        <target name="create" description="create the output directories" unless="outputExists">
            <mkdir dir="${production.classes}"/>
            <mkdir dir="${test.classes}"/>
            <mkdir dir="${junit.out}"/>
            <mkdir dir="${exploded.classes}"/>
            <mkdir dir="${exploded.lib}"/>
        </target>

        <target name="compile" description="compile all .java source files" depends="create">
    <!-- Debug output
            <property name="production.class.path" refid="production.class.path"/>
            <echo message="${production.class.path}"/>
    -->
            <javac srcdir="src" destdir="${out}/production/${ant.project.name}" debug="on" source="${version}">
                <classpath refid="production.class.path"/>
                <include name="**/*.java"/>
                <exclude name="**/*Test.java"/>
            </javac>
            <javac srcdir="${test.src}" destdir="${out}/test/${ant.project.name}" debug="on" source="${version}">
                <classpath refid="test.class.path"/>
                <include name="**/*Test.java"/>
            </javac>
        </target>

        <target name="test" description="run all unit tests" depends="compile">
    <!-- Debug output
            <property name="test.class.path" refid="test.class.path"/>
            <echo message="${test.class.path}"/>
    -->
            <junit printsummary="yes" haltonfailure="${haltonfailure}">
                <classpath refid="test.class.path"/>
                <formatter type="xml"/>
                <batchtest fork="yes" todir="${junit.out}">
                    <fileset dir="${test.src}">
                        <include name="**/*Test.java"/>
                    </fileset>
                </batchtest>
            </junit>
            <junitreport todir="${junit.out}">
                <fileset dir="${junit.out}">
                    <include name="TEST-*.xml"/>
                </fileset>
                <report todir="${junit.out}" format="frames"/>
            </junitreport>
        </target>

        <target name="exploded" description="create exploded deployment" depends="test">
            <copy todir="${exploded}">
                <fileset dir="${web.src}"/>
            </copy>
            <copy todir="${exploded}/WEB-INF">
                <fileset dir="${web.src}/WEB-INF"/>
            </copy>
            <copy todir="${exploded.classes}">
                <fileset dir="${production.classes}"/>
            </copy>
            <copy todir="${exploded.lib}">
                <fileset dir="${production.lib}"/>
            </copy>
        </target>

        <target name="jar" description="create jar file" depends="test">
            <jar destfile="${out}/${ant.project.name}.jar" basedir="${production.classes}" includes="**/*.class"/>
        </target>

        <target name="war" description="create war file" depends="exploded">
            <war basedir="${exploded}" webxml="${exploded}/WEB-INF/web.xml" destfile="${out}/${ant.project.name}.war"/>
        </target>

        <target name="package" description="create package for deployment" depends="test">
            <antcall target="war"/>   
        </target>

    </project>
duffymo
While that would be nice, the code being tested is actually assignments submitted by students, so it will have failures. The only reason we are not using Ant is that we have not yet invested the time to switch over to it, though it might warrant some investigation.
Mike Cooper
+1  A: 

This forum post seems to describe a way to filter what JUnit reports when you run the tests from the command line. The approach is to create an alternative entry point class (i.e. a class with a "main" method) that registers a custom RunListener to report failures the way you want.

Warning: this would involve a bit of Java coding, trawling through the JUnit javadocs, and (maybe) looking at the JUnit source code for ideas.
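
As a rough illustration of that idea, assuming JUnit 4 - the class name QuietRunner and the one-line output format are illustrative only, and the test class names are taken from the command line:

    import org.junit.runner.JUnitCore;
    import org.junit.runner.Result;
    import org.junit.runner.notification.Failure;
    import org.junit.runner.notification.RunListener;

    // Hypothetical entry point: runs the given test classes and reports
    // each failure on a single line instead of printing the stack trace.
    public class QuietRunner {
        public static void main(String[] args) throws ClassNotFoundException {
            JUnitCore core = new JUnitCore();
            core.addListener(new RunListener() {
                @Override
                public void testFailure(Failure failure) {
                    // getDescription() identifies the test method;
                    // getMessage() is the assertion message, no stack trace.
                    System.out.println("FAILED " + failure.getDescription()
                            + ": " + failure.getMessage());
                }
            });
            Class<?>[] classes = new Class<?>[args.length];
            for (int i = 0; i < args.length; i++) {
                classes[i] = Class.forName(args[i]);
            }
            Result result = core.run(classes);
            System.out.println(result.getRunCount() + " tests, "
                    + result.getFailureCount() + " failed");
        }
    }

Compile it against junit.jar and invoke it in place of the normal runner, passing the fully qualified names of the test classes as arguments.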

Stephen C
A: 

Failing JUnit test-cases are supposed to be noisy. I like them to scream at me ;)

If a test case breaks constantly and annoys you even though everything is actually fine (a false positive), think about changing the test case instead. Also, rather than running test cases on your own, wrap the JUnit test runner.

manuel aldana