If you are writing PHP scripts that need to run for several minutes or even hours, such as scrapers, I found an easy way to stop the script from timing out.
You probably don't want to raise the default timeout (typically 30 seconds) globally, because you still want your other, buggier scripts to time out in a reasonable time frame.
But in your PHP code, you simply add a line like this to reset the time-out clock:

set_time_limit(20); // reset the clock for another 20 seconds before timing out

And if your script is going to run for a long time, you want a way to check that it is still running OK, such as writing files to your server, or at least saving a timestamp to a text file.
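For the heartbeat idea above, a minimal sketch could look like this (the file path is just an example; use any location your script can write to):

```php
<?php
// Append a timestamp to a log file so you can confirm the script is alive.
// The path below is only an example.
$heartbeatFile = '/tmp/scraper_heartbeat.txt';
file_put_contents($heartbeatFile, date('Y-m-d H:i:s') . "\n", FILE_APPEND);
```

Tail the file (or check its modification time) from another terminal to see that the script is still making progress.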
And in the loop of your script, to randomize its web-access timing, a simple trick is to add a random delay such as:

usleep(rand(50000, 300000)); // pause for a random 0.05 to 0.3 seconds

And if you are concerned about raising a red flag because your IP address accesses the same site too many times, then (a) get drowned in the noise of the site's other traffic (by not visiting too often), and (b) rotate through a selection of proxy servers (though maybe this could be detected?).
In case you are interested: in my case, I was compiling the destination URLs for the ClickBank redirects, since these are not published in their XML database of products. So I think this is a legitimate need to probe the same site many times via a script.
p.s. if anyone has tips on using cURL with a list of proxies in PHP scripting, it would be good to learn about them. Or maybe post links to good tutorials on the topic?
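I can't claim expertise here, but the basic cURL options for this in PHP are CURLOPT_PROXY (and, if needed, CURLOPT_PROXYUSERPWD for authentication). A sketch, with a made-up proxy address; the request itself is not executed here:

```php
<?php
// Build a cURL handle configured to go through a proxy.
// The proxy address passed in below is purely illustrative.
function make_proxied_handle(string $url, string $proxy)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_PROXY, $proxy);        // e.g. "203.0.113.10:8080"
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);          // don't hang forever on a dead proxy
    return $ch;
}

$ch = make_proxied_handle('http://example.com/', '203.0.113.10:8080');
// curl_exec($ch) would perform the request; pick a different $proxy from
// your list on each loop iteration to rotate IP addresses.
```

Rotating the proxy per request, combined with the random delay above, is the usual way scripts spread their traffic across addresses.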