On my site I use output buffering to grab all the output and run it through a processing function before sending it to the browser (I don't replace anything, I just break it into more manageable pieces). In this particular case there is a massive amount of output, because the page lists a label for every country in the database (around 240 countries). The problem is that on the full output, my preg_match_all call seems to get skipped over: it does absolutely nothing and returns no matches. However, if I remove parts of the labels (no particular part, just random pieces to reduce the character count), then it works again. It doesn't seem to matter what I remove from the labels, only that I remove enough characters. Is there some sort of cap on what the preg functions can handle, or do they time out if there is too much data to scan?
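For context, a stripped-down version of that setup looks roughly like this (the names here are illustrative, not my actual code):

    // Buffer everything the page echoes, then post-process the whole
    // string before it reaches the browser.
    ob_start();

    require "page.php";                 // hypothetical: the page prints its markup

    $output = ob_get_clean();           // stop buffering and grab the output
    $pieces = $template->boom($output); // split it into labelled pieces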
Edit: Here is the function the output is run through.
public function boom($data) {
    // Find every <!-- name:start --> ... <!-- name:stop --> block; the \2
    // back-reference forces the stop label to match the start label.
    $number = preg_match_all("/(<!-- ([\w]+):start -->)\n?(.*?)\n?(<!-- \\2:stop -->)/s", $data, $matches, PREG_SET_ORDER);
    if ($number == 0) return array("content" => $data); // no labels found: hand back the whole string

    $data = array(); // rebuild $data keyed by label name
    foreach ($matches as $item):
        // $item[2] is the label name, $item[3] is the captured body
        //$item[3] = preg_replace("/\n/", "", $item[3], 1);
        if (preg_match("/<!-- breaker -->/", $item[3])) $data[$item[2]] = explode("<!-- breaker -->", $item[3]);
        else $data[$item[2]] = $item[3];
    endforeach;
    //die(var_dump($data));
    return $data;
}
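To illustrate what the function normally produces, here is a minimal made-up input and the array it should return (the label names are examples, not from the real page):

    <!-- header:start -->
    <h1>Hello</h1>
    <!-- header:stop -->
    <!-- body:start -->
    part one<!-- breaker -->part two
    <!-- body:stop -->

    // boom() should return:
    // array(
    //     "header" => "<h1>Hello</h1>",
    //     "body"   => array("part one", "part two"),
    // )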
And here is the unprocessed output that is sent to the page. I have determined that it is the preg_match_all at the beginning that returns 0 matches, so the function simply throws the entire string it received into $data['content'] and skips everything else.
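In case it is relevant, here is a check I could add right after the call ($pattern standing in for the regex above; preg_last_error() is the standard way to tell whether PCRE aborted rather than genuinely found nothing, though I haven't confirmed that this is what happens here):

    $number = preg_match_all($pattern, $data, $matches, PREG_SET_ORDER);
    if ($number === false || preg_last_error() !== PREG_NO_ERROR) {
        // preg_match_all returns false when PCRE gives up; the error code
        // distinguishes backtrack/recursion limits from other failures.
        error_log("preg error code: " . preg_last_error());
    }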
I've tried putting the labels on new lines and collapsing them together; nothing seems to work. But as explained above, if I remove random pieces of the output, it goes through fine. The function works perfectly on every other page of normal length.