On an HTTP website, there are lots of CSV files stored with a chronological naming scheme. I wrote a PHP program (running on localhost) to programmatically generate each file's name using date functions, fetch it with file_get_contents(), and write it locally.
I have tested with a limited range of dates and am able to get the files (each around 1.3 MB). However, over a large period (say, 3 years, with one file per weekday), would this cause a timeout? Or does it not count as a timeout because the response never actually stops?
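For context, here is a minimal sketch of the workaround I had in mind if the execution limit is the problem. My assumption is that under a web server PHP's `max_execution_time` defaults to 30 seconds (under the CLI it defaults to 0, i.e. unlimited), and that `set_time_limit(0)` lifts the limit; the loop below just stands in for the long download:

```php
<?php
// Under a web server, max_execution_time typically defaults to 30 seconds,
// which a multi-year download loop would exceed. set_time_limit(0) removes
// the limit for the current script. (Under the CLI the default is already 0.)
set_time_limit(0);

// Illustrative long-running loop standing in for the per-file downloads:
for ($i = 0; $i < 3; $i++) {
    sleep(1); // each real download could take far longer than this
}
echo "done\n";
```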
Here's the code for reference:
<?php
$start_date = '08SEP2009';
$check_date = $start_date;
$end_date = '14SEP2009';
function getNextDate() {
    global $check_date;
    $check_date = date("dMY", strtotime("+1 day", strtotime($check_date))); // advance to the next date
    return $check_date;
}

function downloadFiles() {
    global $check_date, $end_date; // $end_date was used without a global declaration before
    $cur_date = $check_date; // start from the first date instead of skipping it
    $url = "http://nse-india.com/content/historical/DERIVATIVES/YYYY/MMM/foDDMMMYYYYbhav.csv"; // naming scheme for the CSVs
    while (strcasecmp($cur_date, $end_date) !== 0) {
        $timestamp = strtotime($cur_date);
        $year  = date("Y", $timestamp);             // year, e.g. 2009
        $month = strtoupper(date("M", $timestamp)); // month, e.g. SEP
        $day   = date("d", $timestamp);             // day of month, e.g. 08
        $filename = str_replace(array('DD', 'MMM', 'YYYY'), array($day, $month, $year), $url); // build the URL for this date
        $content = @file_get_contents($filename);
        if ($content !== false) { // skip dates where the download failed
            $localfile = basename($filename); // local name, e.g. fo08SEP2009bhav.csv
            $handle = fopen($localfile, "w");
            fwrite($handle, $content); // save the file locally
            fclose($handle);
        }
        $cur_date = getNextDate(); // advance the date, otherwise the loop never terminates
    }
}
downloadFiles();
?>
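As an aside, since each file is ~1.3 MB, I also tried a streaming variant so the whole CSV is never held in memory: copy the remote stream to the local file in chunks with `stream_copy_to_stream()`. The sketch below uses a local temp file as the source so it runs offline; in the real script `$src` would be the generated `http://` URL:

```php
<?php
// Streaming alternative to file_get_contents() + fwrite(): copy the source
// stream to the destination in chunks, never buffering the whole file.
// $src is a local stand-in here; for the real script it would be the URL.
$src = tempnam(sys_get_temp_dir(), "csv");
file_put_contents($src, "a,b,c\n1,2,3\n"); // stand-in for the remote CSV
$dst = tempnam(sys_get_temp_dir(), "out");

$in  = fopen($src, "rb"); // for a URL this would be fopen($url, "rb")
$out = fopen($dst, "wb");
stream_copy_to_stream($in, $out); // chunked copy, constant memory use
fclose($in);
fclose($out);

echo file_get_contents($dst); // prints the copied CSV
```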