For example, Reductio (for Java/Scala) and QuickCheck (for Haskell). The kind of framework I'm thinking of would provide "generators" for built-in data types and allow the programmer to define new generators. Then, the programmer would define a test method that asserts some property, taking variables of the appropriate types as parameters. The framework then generates a bunch of random data for the parameters, and runs hundreds of tests of that method.
For example, if I implemented a Vector class, and it had an add() method, I might want to check that my addition commutes. So I would write something like (in pseudocode):
boolean testAddCommutes(Vector v1, Vector v2) {
    return v1.add(v2).equals(v2.add(v1));
}
I could run testAddCommutes() on two particular vectors to see if that addition commutes. But instead of writing a few invocations of testAddCommutes by hand, I write a procedure that generates arbitrary Vectors. Given that generator, the framework can run testAddCommutes on hundreds of different inputs.
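To make the idea concrete, here is a minimal sketch of what such a framework does under the hood. The Vector record, the arbitraryVector() generator, and checkProperty() are all hypothetical names for illustration; this is not the API of Reductio, QuickCheck, or any real library:

```java
import java.util.Random;
import java.util.function.BiPredicate;

public class PropertyTestSketch {
    // Toy 2-D vector with an add() method (hypothetical, for illustration).
    record Vector(double x, double y) {
        Vector add(Vector other) {
            return new Vector(x + other.x, y + other.y);
        }
    }

    static final Random RNG = new Random(42);

    // A "generator": produces an arbitrary Vector for each test run.
    static Vector arbitraryVector() {
        return new Vector(RNG.nextDouble() * 200 - 100,
                          RNG.nextDouble() * 200 - 100);
    }

    // The framework's job: feed random inputs to the property many times
    // and report the first counterexample, if any.
    static void checkProperty(BiPredicate<Vector, Vector> property, int runs) {
        for (int i = 0; i < runs; i++) {
            Vector v1 = arbitraryVector();
            Vector v2 = arbitraryVector();
            if (!property.test(v1, v2)) {
                throw new AssertionError("Falsified with " + v1 + " and " + v2);
            }
        }
        System.out.println("Property held for " + runs + " runs");
    }

    public static void main(String[] args) {
        // testAddCommutes expressed as a property: v1 + v2 == v2 + v1
        checkProperty((v1, v2) -> v1.add(v2).equals(v2.add(v1)), 100);
    }
}
```

Real frameworks add more on top of this loop, such as shrinking a failing input down to a minimal counterexample, but the generator-plus-property shape is the core of it.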
Does this ring a bell for anyone?