I have (jUnit) unit tests.
My project manager would like a human-friendly list of test cases and scenarios (think: a spreadsheet or report for general distribution to a less technical audience).
What is an effective way to bridge this gap?
Unit tests are for testing software components.
A less technical audience is (in my opinion) more interested in functionality than in implementation, and in features rather than in components.
That's the gap you need to bridge: the gap between a component and a feature.
The audience may be more interested in system test results than in unit test results; see also Should one test internal implementation, or only test public behaviour?
People might be interested in lists of tests if the tests a) test something interesting, and b) have names that reflect what they're testing. For example, there's a list of my test case names here, and if you're interested in that functionality I think you can tell from the test names roughly what's being tested (and, conversely, what functionality has been broken if a regression test ever fails).
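To illustrate that naming idea, here is a minimal sketch, assuming JUnit 5 (the question doesn't say which version) and a made-up ShoppingCart class included only so the example compiles. The point is that the method names and @DisplayName strings already read as a plain-English list of behaviours, which is close to the kind of report a project manager could skim.

    import static org.junit.jupiter.api.Assertions.assertEquals;

    import java.util.ArrayList;
    import java.util.List;

    import org.junit.jupiter.api.DisplayName;
    import org.junit.jupiter.api.Test;

    class ShoppingCartTest {

        // Minimal hypothetical domain class, included only so the sketch is self-contained.
        static class ShoppingCart {
            private final List<String> items = new ArrayList<>();
            void add(String item) { items.add(item); }
            void remove(String item) { items.remove(item); }
            int itemCount() { return items.size(); }
        }

        @Test
        @DisplayName("A new cart starts out empty")
        void newCartStartsOutEmpty() {
            assertEquals(0, new ShoppingCart().itemCount());
        }

        @Test
        @DisplayName("Removing the last item leaves the cart empty")
        void removingTheLastItemLeavesTheCartEmpty() {
            ShoppingCart cart = new ShoppingCart();
            cart.add("book");
            cart.remove("book");
            assertEquals(0, cart.itemCount());
        }
    }

A report generator (or even a simple grep over the display names) can then turn those annotations into a spreadsheet-friendly list of scenarios without anyone reading the test bodies.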
You might want to check out StoryTeller. I haven't used it, but as I understand it, it's a way to write acceptance tests in non-technical language, or at least in a domain-specific language (DSL).
This interview with Jeremy Miller, the creator, gives a pretty good description.