I'm pulling in search query results using cURL, then iterating through a database to load additional queries and storing the results back in the database. I'm running into problems with PHP's maximum execution time. I've tried raising the limit, which I don't think is working on my host, using this:
ini_set('max_execution_time', 600);
in the file that is run by cron, so it only changes the time limit for the importing process.
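As a quick sanity check (a minimal sketch; the 600-second value is just the one from above), you can verify whether the host actually accepted the override, since some shared hosts block ini_set() for this directive:

<?php
// Attempt to raise the limit, then confirm it actually changed.
// ini_set() returns false on failure, and on some hosts it silently
// has no effect for max_execution_time.
$result = ini_set('max_execution_time', 600);

if ($result === false || (int) ini_get('max_execution_time') !== 600) {
    // Log it so the cron run leaves a trace of why it may time out.
    error_log('max_execution_time override was not applied; host may block ini_set()');
}

// set_time_limit(600) is an alternative worth trying: it restarts the
// timer from the point it is called rather than setting a total.

If even that is blocked, the per-script override may simply not be available on this host.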
The question is: would it be more efficient to store the result of each cURL connection in the database and then have a secondary function that pulls the database results and sorts them into the relevant tables, run hypothetically every 10 minutes? Or is it more efficient to pull the file and insert the sorted records in one go? A rough sketch of the first option is below.
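For reference, here's roughly what I mean by the two-stage option. All names here (raw_responses table, processed flag, connection details) are hypothetical, and the parsing step is just a placeholder:

<?php
// Stage 1 (run by cron): fetch each query's result and dump the raw
// body into a staging table, deferring all parsing and sorting.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

function fetchAndStage(PDO $pdo, string $url): void
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $body = curl_exec($ch);
    curl_close($ch);

    $stmt = $pdo->prepare(
        'INSERT INTO raw_responses (url, body, processed) VALUES (?, ?, 0)'
    );
    $stmt->execute([$url, $body]);
}

// Stage 2 (run every 10 minutes): pull unprocessed rows, sort them
// into the relevant tables, and mark them done.
function processStaged(PDO $pdo): void
{
    $rows = $pdo->query(
        'SELECT id, body FROM raw_responses WHERE processed = 0 LIMIT 100'
    );
    foreach ($rows as $row) {
        // ... parse $row['body'] and insert into the relevant tables ...
        $pdo->prepare('UPDATE raw_responses SET processed = 1 WHERE id = ?')
            ->execute([$row['id']]);
    }
}

The one-go alternative would just parse and insert inside fetchAndStage() directly, with no staging table.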