Hmm... a lot?
data.php will be huge, so just reading it will take quite a while due to disk I/O alone. The parsed data then has to stay in memory, so even if you don't hit any memory limits it will still affect performance.
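If you want to see the actual cost, PHP can measure it directly. A minimal sketch, assuming data.php builds the big array when included:

$start = microtime(true);
require './data.php'; // assumed to build the big array on include
printf('Load time: %.2fs, peak memory: %.1f MB', microtime(true) - $start, memory_get_peak_usage(true) / 1048576);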
Another bottleneck to consider is the max_execution_time limit.
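For reference, you can inspect or raise that limit at runtime (a minimal sketch; both are standard PHP calls):

echo ini_get('max_execution_time'); // current limit in seconds, "0" = unlimited
set_time_limit(300); // restart the timer and allow up to 300 more seconds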
You're most probably doing something wrong if you need 1GB of data in memory... Have you considered storing the raw (padded) data, one fixed-width record per element, and then reading only the specific bytes you need from that file instead?
Example (Write):
$values = array
(
    0 => '127.0.0.1',       // 9 chars
    1 => '127.0.0.2',       // 9 chars
    2 => '...',             // 3 chars
    3 => '255.255.255.255', // 15 chars - the longest value in our set
);

foreach ($values as $key => $value)
{
    // let's left-pad each value to exactly 15 bytes
    $values[$key] = str_pad($value, 15, ' ', STR_PAD_LEFT);
}

file_put_contents('./test.data', implode('', $values), LOCK_EX);
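Since every record has the same fixed width, appending values later is cheap and never requires rewriting the file. A minimal sketch under the same 15-byte assumption (10.0.0.1 is just an example value):

// append one more padded record; FILE_APPEND keeps the existing data intact
file_put_contents('./test.data', str_pad('10.0.0.1', 15, ' ', STR_PAD_LEFT), FILE_APPEND | LOCK_EX);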
Example (Read):
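// file_get_contents($filename, $use_include_path, $context, $offset, $maxlen)
// reads only $maxlen bytes starting at byte $offset, so record N lives at N * 15;
// ltrim() strips the left padding that str_pad() added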
echo ltrim(file_get_contents('./test.data', false, null, 0 * 15, 15)); // 127.0.0.1
echo '<hr />';
echo ltrim(file_get_contents('./test.data', false, null, 1 * 15, 15)); // 127.0.0.2
echo '<hr />';
echo ltrim(file_get_contents('./test.data', false, null, 2 * 15, 15)); // ...
echo '<hr />';
echo ltrim(file_get_contents('./test.data', false, null, 3 * 15, 15)); // 255.255.255.255
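If you do many lookups per request, it's cheaper to open the file once and seek around instead of calling file_get_contents() repeatedly. A minimal sketch under the same assumptions (record_at is a hypothetical helper name, 15-byte records):

function record_at($handle, $index, $width = 15)
{
    // jump straight to the record's byte offset and read exactly one record
    fseek($handle, $index * $width);
    return ltrim(fread($handle, $width));
}

$handle = fopen('./test.data', 'rb');
echo record_at($handle, 0); // 127.0.0.1
echo '<hr />';
echo record_at($handle, 3); // 255.255.255.255
fclose($handle);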