I'm crawling a website using PHP. Since I don't know much about turning PHP into a CGI and running it from the command line, the method I've chosen is, after one successful iteration, to redirect back to the same PHP file so it runs again. (It's not an infinite loop: I use both a cookie and a timestamp check to make sure it ends within a set time or after a few hundred iterations.)
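To make the setup concrete, here's a stripped-down sketch of the loop I'm describing. The cookie names, the limits, and the `crawlOnePage()` stub are placeholders, not my actual crawler code:

```php
<?php
// Sketch of the self-redirecting crawl loop (limits and names are placeholders).
const MAX_ITERATIONS = 300;   // "a few hundred iterations"
const MAX_RUNTIME    = 600;   // seconds; the timestamp check

function crawlOnePage(int $i): void
{
    // Hypothetical placeholder for the real crawling work.
    file_get_contents('http://example.com/page/' . $i);
}

$count = isset($_COOKIE['crawl_count']) ? (int) $_COOKIE['crawl_count'] : 0;
$start = isset($_COOKIE['crawl_start']) ? (int) $_COOKIE['crawl_start'] : time();

// Stop after the iteration cap or the time budget, whichever comes first.
if ($count >= MAX_ITERATIONS || time() - $start >= MAX_RUNTIME) {
    exit('Crawl finished.');
}

crawlOnePage($count);

// Persist the loop state in cookies, then redirect back to this same script.
setcookie('crawl_count', (string) ($count + 1));
setcookie('crawl_start', (string) $start);
header('Location: ' . $_SERVER['PHP_SELF']);
exit;
```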
The problem is that after about 20 iterations I keep getting the following error message:
Redirect Loop
Redirection limit for this URL exceeded. Unable to load the requested page. This may be caused by cookies that are blocked.
How can I stop Apache/PHP (I'm using XAMPP on my localhost) from preventing me from looping as long as I want?