views: 37
answers: 1

I have a report generation feature that exports to CSV or TXT. For each month there will be around 25,000 records, each row with 55 columns. For a yearly report it will be more than 300,000! I tried raising the memory limit, but I don't think that's a good solution. Anyway, right now it is 128M.

My plan

I will split the date range selected by the user into chunks of 25 or 30 days. I will fetch the data for the first 25 days, write it to the CSV, then fetch the next chunk and write that, and so on.

How can I achieve this?
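To be concrete, this is roughly the loop I have in mind (passing a start and end date to fetchRecords() is just my guess at how I would change that function, and the dates are examples):

$start = new DateTime('2012-01-01');   // date range selected by the user (example dates)
$end   = new DateTime('2012-12-31');

$chunkStart = clone $start;
while ($chunkStart <= $end) {
    $chunkEnd = clone $chunkStart;
    $chunkEnd->modify('+24 days');     // 25-day chunk
    if ($chunkEnd > $end) {
        $chunkEnd = clone $end;
    }

    // my guess: change fetchRecords() to take a date range
    $result = fetchRecords($chunkStart->format('Y-m-d'), $chunkEnd->format('Y-m-d'));

    // ??? here I need to append $result to the CSV,
    // instead of handing everything to the view at once

    $chunkStart = clone $chunkEnd;
    $chunkStart->modify('+1 day');
}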

For fetching I am using a function: $result = fetchRecords();

For writing the CSV, I am passing this $result array to the view page and printing it by looping over it and separating the values with commas.

So in the controller it will be $template->records = $result;

If I do this in a for loop:

for ($i = 0; $i < $numChunks; $i++) {
    $result = fetchRecords();           // fetch one chunk
    $template->records = $result;       // this just overwrites the previous chunk
}

I don't think this will work.

How do I do this, i.e. fetch, write, then fetch again, then write again?

Can you please suggest a better way to implement this in PHP while staying within the memory limit?

A: 

I was fetching a huge amount of data from the DB into an array, then looping over that array again to write the CSV or text file. That was really killing the memory.

Now, as I fetch each row from the DB, I write it to the file in the same loop. No array at all. It makes a huge difference.
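Roughly, the export loop now looks like this; the PDO connection string and the table/column names here are only placeholders for illustration:

$pdo = new PDO('mysql:host=localhost;dbname=reports', 'user', 'pass');
// for MySQL, unbuffered queries keep the driver from loading the whole result set
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt = $pdo->prepare('SELECT * FROM report_rows WHERE created_at BETWEEN ? AND ?');
$stmt->execute(array('2012-01-01', '2012-12-31'));

$fp = fopen('report.csv', 'w');            // or php://output to stream it as a download
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    fputcsv($fp, $row);                    // write each row as it arrives, no array
}
fclose($fp);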

PHP can export to CSV or text just fine this way. PHP is really a great language.

Now I am downloading a 25 MB CSV. :) I even break Excel, since it exceeds the old 65,536-row limit. :)

More than enough for me.

So the lesson learned is: don't buffer this kind of huge data set in arrays.
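If anyone wants to see the difference themselves, a quick test like the two snippets below shows it. The row and column counts are made up, and each variant should be run as its own script, since memory_get_peak_usage() reports the peak for the whole process. (Variant A will most likely just hit the memory limit, which is exactly the point.)

// Variant A: buffer every row in an array first (my old approach).
$data = array();
for ($i = 0; $i < 300000; $i++) {
    $data[] = array_fill(0, 55, 'x');      // 55 dummy columns per row
}
$fp = fopen('report_a.csv', 'w');
foreach ($data as $row) {
    fputcsv($fp, $row);
}
fclose($fp);
echo memory_get_peak_usage(true), " bytes peak\n";

// Variant B: write each row as it is produced (my new approach).
$fp = fopen('report_b.csv', 'w');
for ($i = 0; $i < 300000; $i++) {
    fputcsv($fp, array_fill(0, 55, 'x'));  // only one row in memory at a time
}
fclose($fp);
echo memory_get_peak_usage(true), " bytes peak\n";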

zod