views: 39
answers: 3
How can I test that a shared resource (a file-based cache for a Perl webapp's output) behaves sanely under concurrent access?

I wrote a simple file-based cache in Perl that uses locking to serialize write access, i.e. so that only one process (re)generates a given cache entry. The cache is meant for caching the output of a Perl webapp (gitweb), if that matters.

I'd like to test that said cache behaves sanely under concurrent access: for example, that only one process runs the subroutine used to generate a cache entry ($cache->compute($key, sub { ... })), that all processes get the generated data, that if the process writing a cache entry dies it doesn't deadlock the processes waiting for the entry to be (re)generated, etc.

How should I do it? Is there a ready-made Perl module I can use?
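For reference, the compute interface described above might look roughly like this. This is a minimal sketch assuming flock-based locking; the package name and file layout are illustrative, not the actual gitweb cache code:

    package SimpleFileCache;

    use strict;
    use warnings;
    use Fcntl qw(:flock);
    use File::Spec;

    sub new {
        my ($class, %opts) = @_;
        my $self = { dir => $opts{dir} || '/tmp/cache' };
        mkdir $self->{dir} unless -d $self->{dir};
        return bless $self, $class;
    }

    # Return cached data for $key, or generate it with $code under an
    # exclusive lock, so that only one process runs the generator.
    sub compute {
        my ($self, $key, $code) = @_;
        my $file = File::Spec->catfile($self->{dir}, $key);

        open my $lock_fh, '>', "$file.lock" or die "open lock: $!";
        flock $lock_fh, LOCK_EX or die "flock: $!";

        my $data;
        if (-e $file) {
            # Another process generated the entry while we waited.
            open my $fh, '<', $file or die "read cache: $!";
            local $/;
            $data = <$fh>;
            close $fh;
        } else {
            $data = $code->();
            open my $fh, '>', $file or die "write cache: $!";
            print {$fh} $data;
            close $fh;
        }

        close $lock_fh;    # closing the handle also releases the flock
        return $data;
    }

    1;

Note that flock locks are released automatically when the holding process exits, which matters for the crash/deadlock scenario in the question.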

A: 

Have two processes, each of which does the following:

  • Write out the time before access.
  • Attempt access.
  • Sleep for 5 seconds while holding the lock.
  • Release the lock and write out the time.

One process should take roughly twice as long as the other.
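A minimal sketch of that timing test, assuming a flock-based lock file (the lock path and the 5-second hold are placeholders):

    use strict;
    use warnings;
    use Fcntl qw(:flock);
    use Time::HiRes qw(time sleep);

    my $lock = '/tmp/cache-test.lock';

    for my $child (1 .. 2) {
        my $pid = fork;
        die "fork failed: $!" unless defined $pid;
        next if $pid;    # parent keeps forking

        # Child: record the time before access, hold the lock for
        # 5 seconds, release it, and record the time again.
        my $before = time;
        open my $fh, '>', $lock or die "open: $!";
        flock $fh, LOCK_EX or die "flock: $!";
        sleep 5;
        close $fh;       # releases the lock
        printf "child %d: %.1fs from start to release\n",
            $child, time - $before;
        exit 0;
    }

    wait for 1 .. 2;

The child that gets the lock first should report about 5 seconds; the other waits out the first one and reports about 10.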

As for testing whether it cleans up when a process dies: die while holding the lock instead. Or, if this is fairly black box, start a thread that calls exit when you expect the process to hold the lock.

But I'm not sure how you'd cause the whole process to sleep from a single thread.
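A sketch of the die-with-the-lock variant, using alarm as a watchdog so a deadlock fails the test instead of hanging it. The path and timeout are placeholders; note that flock locks are released by the kernel when the holder dies, so this mainly catches schemes that rely on a lock file's existence:

    use strict;
    use warnings;
    use Fcntl qw(:flock);

    my $lock = '/tmp/cache-test.lock';

    my $pid = fork;
    die "fork failed: $!" unless defined $pid;

    if ($pid == 0) {
        # Child: take the lock and die without releasing it.
        open my $fh, '>', $lock or die "open: $!";
        flock $fh, LOCK_EX or die "flock: $!";
        die "simulated crash while holding the lock\n";
    }

    waitpid $pid, 0;    # child is dead by now

    # Parent: if cleanup works this acquires at once; otherwise the
    # alarm aborts the test after 10 seconds instead of deadlocking.
    $SIG{ALRM} = sub { die "deadlock: lock not released\n" };
    alarm 10;
    open my $fh, '>', $lock or die "open: $!";
    flock $fh, LOCK_EX or die "flock: $!";
    alarm 0;
    print "ok - lock released when holder died\n";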

Axeman
Everything is white box: it is my own code. I can always fork from the test, but the problem is gathering data from the children.
Jakub Narębski
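One way to gather data from forked children is to have each child report its result over a shared pipe that the parent collects before making assertions; CPAN modules such as Test::SharedFork, which makes Test::More's test counter fork-safe, are another option. A minimal sketch of the pipe approach (the cache call is a placeholder):

    use strict;
    use warnings;

    pipe my $read, my $write or die "pipe: $!";

    my @pids;
    for my $n (1 .. 3) {
        my $pid = fork;
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {
            close $read;
            # ... exercise the cache here ...
            print {$write} "child $n: got data\n";
            close $write;
            exit 0;
        }
        push @pids, $pid;
    }

    close $write;             # parent keeps only the read end
    my @results = <$read>;    # EOF once every child closes its end
    waitpid $_, 0 for @pids;

    print scalar(@results) == 3 ? "ok" : "not ok",
        " - all children reported\n";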