Does anyone know of one? I need to test some upload/download scripts and need some really large files generated. I was going to integrate the test utility with my debug script.
To start, you could try something like this:
function generate_file($file_name, $size_in_bytes)
{
    // Repeat a single random digit until the string reaches the requested size
    $data = str_repeat((string) rand(0, 9), $size_in_bytes);
    file_put_contents($file_name, $data); // write $data to the file
}
This creates a file filled with a single random digit (0-9), repeated until it reaches the requested size.
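For example, to generate a 10 MB test file (the path here is just an illustration):
generate_file('/tmp/test_10mb.txt', 10 * 1024 * 1024);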
Do you really need so much variation in filesize that you need a PHP script? I'd just create test files of varying sizes via the command line and use them in my unit tests. Unless the filesize itself is likely to cause a bug, it would seem you're over-engineering here...
To create a file in Windows (the size argument is in bytes, so 1048576 is 1 MB):
fsutil file createnew d:\filepath\filename.txt 1048576
In Linux:
dd if=/dev/zero of=filepath/filename.txt bs=10000000 count=1
if is the input file (/dev/zero, an endless stream of zero bytes), of is the output file, bs is the block size, and count is the number of blocks to copy, so the resulting file size is bs × count (10,000,000 bytes here).
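If you would rather drive these commands from your PHP debug script instead of running them by hand, a rough sketch (the function name and platform check are just placeholders) could be:

function make_test_file($path, $size_in_bytes)
{
    if (stripos(PHP_OS, 'WIN') === 0) {
        // Windows: fsutil creates a file of the requested size in bytes
        exec('fsutil file createnew ' . escapeshellarg($path) . ' ' . (int) $size_in_bytes);
    } else {
        // Linux: copy one block of $size_in_bytes zero bytes from /dev/zero
        exec('dd if=/dev/zero of=' . escapeshellarg($path) . ' bs=' . (int) $size_in_bytes . ' count=1');
    }
}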
Why not have a script that streams out random data? The script can take parameters for file size, type, etc.
That way you can simulate many scenarios, for example bandwidth throttling or a premature end of file; see the sketch below.
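A minimal sketch of such a script, assuming it is called with a size query parameter (the parameter name and chunk size are just illustrative):

// stream.php - streams the requested number of random bytes to the client
$size = isset($_GET['size']) ? (int) $_GET['size'] : 1048576; // default 1 MB

header('Content-Type: application/octet-stream');
header('Content-Length: ' . $size);

$chunk = 8192;
while ($size > 0) {
    $bytes = min($chunk, $size);
    // build one chunk of random bytes and push it straight out
    $data = '';
    for ($i = 0; $i < $bytes; $i++) {
        $data .= chr(rand(0, 255));
    }
    echo $data;
    flush();
    // a usleep() here would simulate bandwidth throttling
    $size -= $bytes;
}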
Does the file really need to be random? If so, just read from /dev/urandom on a Linux system:
dd if=/dev/urandom of=yourfile bs=4096 count=1024
# for a 4MB file.
If it doesn't really need to be random, just find some files you have lying around that are the appropriate size, or (alternatively) use tar and make some tarballs of various sizes.
There's no reason this needs to be done in a PHP script: ordinary shell tools are perfectly sufficient to generate the files you need.
If you want really random data you might want to try this:
$data = '';
while ($byteSize-- > 0) {
    $data .= chr(rand(0, 255)); // append one random byte
}
Might take a while, though, if you want large file sizes (as with any random data).
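If memory use becomes a problem for very large files, a variation on the same idea (just a sketch) is to write the random data to disk in chunks instead of building one huge string:

function generate_random_file($file_name, $size_in_bytes, $chunk_size = 8192)
{
    $handle = fopen($file_name, 'wb');
    while ($size_in_bytes > 0) {
        $bytes = min($chunk_size, $size_in_bytes);
        $data = '';
        for ($i = 0; $i < $bytes; $i++) {
            $data .= chr(rand(0, 255)); // one random byte
        }
        fwrite($handle, $data); // flush this chunk to disk
        $size_in_bytes -= $bytes;
    }
    fclose($handle);
}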