You've not provided enough information here to give a sensible answer. By far the biggest benefit is going to come from effective caching of content, but you probably need to look at more than just the headers you are sending: you probably also need to start tweaking filenames (or the query part of the URL) so the browser fetches new content when it changes, instead of reusing a cached (and not yet expired) copy. There's a sketch of that idea below.
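A minimal sketch of URL versioning, assuming your static files live under the web root (the helper name and document root here are my own, not anything you've shown us):

<?php
// Hypothetical helper: append the file's modification time to the URL so the
// browser re-fetches an asset only when it actually changes. You can then send
// far-future expiry headers on everything.
function versioned_url($path, $docroot = '/var/www/html') {
    $v = @filemtime($docroot . $path);   // changes whenever the file does
    return $v ? $path . '?v=' . $v : $path;
}
// In a template: echo '<script src="' . versioned_url('/js/app.js') . '"></script>';
?>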
I have a series of files that are requested every page load. On average there are 20 files.
Are these all PHP scripts? Are they all referenced by the HTML page?
Each file must be read and parsed if they have changed
Do you mean they must be read and parsed on the server? Why? Read this post for details on how to identify and supply new versions of cacheable static content.
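I can't reproduce that post here, but the gist is to send far-future caching headers for static content and change the URL whenever the content changes. A rough sketch (the file path is just an example):

<?php
// Rough sketch: tell the browser (and any intermediate proxies) to cache this
// response for a year. Combined with versioned URLs, nothing gets re-requested
// until it actually changes.
$max_age = 31536000;   // one year, in seconds
header('Cache-Control: public, max-age=' . $max_age);
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $max_age) . ' GMT');
readfile('/var/www/html/js/app.js');   // example path only
?>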
Have you looked at server-side caching? It's not that hard:
<?php
$cache_dir  = '/tmp/cache/';                            // writable directory for cached pages
$cache_file = $cache_dir . md5($_SERVER['REQUEST_URI']);
$lastmod    = time() - (int) @filemtime($cache_file);   // age of the cached copy, in seconds

// Serve the cached copy if it exists and is less than 60 seconds old
if (file_exists($cache_file) && ($lastmod < 60)) {
    print file_get_contents($cache_file);
    exit;
} else {
    ob_start();
    // ... do slow expensive stuff
    $output = ob_get_contents();
    ob_end_clean();
    print $output;
    file_put_contents($cache_file, $output);            // filename first, then the data
}
?>
Note that server-side caching is not nearly as effective as client-side caching.
Once you've got the caching optimal, you then need to look at serving your content from multiple hostnames, even if they all point to the same server: a browser limits the number of connections it makes to each hostname, so requests are effectively queued up when they could be running in parallel.
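Something along these lines would do it (the hostnames are purely illustrative; the hash just keeps each asset pinned to one hostname so it stays cacheable):

<?php
// Illustrative sketch: spread asset requests across a couple of hostnames that
// all resolve to the same server, so the browser opens more parallel connections.
function sharded_url($path) {
    $hosts = array('static1.example.com', 'static2.example.com');
    $host  = $hosts[abs(crc32($path)) % count($hosts)];
    return 'http://' . $host . $path;
}
// e.g. echo '<img src="' . sharded_url('/img/logo.png') . '">';
?>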
To take this any further, we'd need to know a lot more about your application.
C.