I develop a network-based, multi-component software system that's designed to run on an arbitrary number of machines. I'm assuming a typical deployment of 1 to 4 machines.
I want to be serious about testing the system, and I have set up a network of virtual machines on a powerful PC to simulate the interaction between nodes. However, that setup is not quite sufficient.
For example, because of the virtualization (I use QEMU), each node always runs on a single core, so I can't test the performance of code designed for multicore use. It would be nice if I could sometimes run any one of the virtual machines on the whole PC, to see what difference it makes, especially performance-wise, but also to check for multithreading issues.
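To be concrete, what I would occasionally like to do is hand a single guest all of the host's cores, along these lines (just a sketch of the idea; KVM availability, the image names, and the core and memory counts are assumptions about my setup, not my actual configuration):

```
# Sketch: boot one node with all four host cores, assuming KVM is
# available on the host and node1.img is that node's disk image.
qemu-system-x86_64 -enable-kvm -smp 4 -m 4096 \
    -drive file=node1.img,format=qcow2

# In the normal multi-node setup, each guest would get a single core instead:
# qemu-system-x86_64 -enable-kvm -smp 1 -m 1024 -drive file=node2.img,format=qcow2
```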
Buying more boxes and using hard disk partitions instead of virtual disk images are options, but are there more elegant approaches? I'm bootstrapping a company here and can't really afford loads of hardware yet, and dealing with real physical media for each machine, rather than just moving image files around, is certainly more work.
ETA: The application is a middleware system and does not really have a user interface. Testing is done through dummy clients that feed data into the system on one end and extract it again on the other. It's not a website, and it is usually used inside the local network rather than over the Internet. UI and Internet-facing interfaces will be provided at a later stage by some of the components hooked up to the system.
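For illustration, a dummy client is conceptually about this simple; the node addresses, port numbers, and line-based payload below are made-up stand-ins, not the system's actual protocol:

```python
import socket

INGEST_NODE = ("192.168.56.10", 9000)   # made-up address of a node that accepts test data
EGRESS_NODE = ("192.168.56.11", 9001)   # made-up address of the node where data comes out

payload = b"test-payload-001\n"

# Feed a known payload into the system on one end...
with socket.create_connection(INGEST_NODE) as sock:
    sock.sendall(payload)

# ...then pull it out on the other end and check that it survived the trip.
with socket.create_connection(EGRESS_NODE) as sock:
    received = sock.recv(4096)
    assert received == payload, "payload was altered or lost in transit"
```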