Hi, I'm exporting data from my database into an Excel file using PHP and the PEAR extension libraries. It works fine when I download a small quantity of data, but when the data is large the Excel sheet gets corrupted (usually all the URLs get written into the same cell). The data I'm exporting is of type string, URL, or date.

Why is the Excel sheet getting corrupted?

A: 

This is probably due to the maximum execution time of the PHP script (the default is 30 seconds, I think). That results in a partial file being downloaded (the end of the Excel file never gets written). Try set_time_limit() if your web host allows it.
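For reference, a minimal sketch of lifting the limit at the top of the export script (whether this takes effect depends on the host configuration):

    <?php
    // Remove the 30-second default so a long export is not cut off mid-file.
    // Some shared hosts ignore this; check the error log if the file is still truncated.
    set_time_limit(0);

    // ... build and send the Excel file as before ...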

Daff
No, I have changed the execution time to a very large number, and I'm not getting any script error. It's as if there is an export character limit set somewhere.
Well, if it always stops the download after a certain file size, you may well be hitting your memory limit.
Daff
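If memory is the suspect, one hedged way to check is to raise the limit and log the peak usage once the export has run (ini_set() may be disabled on shared hosting):

    <?php
    // Give the export more headroom, then record how much it actually needed.
    ini_set('memory_limit', '256M');

    // ... run the export ...

    error_log('Peak memory used: ' . memory_get_peak_usage(true) . ' bytes');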
Yeah, but I haven't set any file size, and the file gets corrupted... it's not incomplete... or maybe it is incomplete, I don't know for sure. All that happens is that all the URLs get corrupted after a certain number.
Can you give us some idea of the number of rows when this works, and when it doesn't?
Mark Baker
The last time I tried this the workbook was 31 KB. It had 3 worksheets: worksheet 1 has at most 15 rows and 2 columns {priority, url} (always the same); worksheet 2 has 36 rows and 3 columns {url, description, media} (rows and columns vary); worksheet 3 is supposed to have 46 rows (with some columns missing) and 3 columns {url, description, priority}.
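The question doesn't show the export code, but if it uses PEAR's Spreadsheet_Excel_Writer, one thing worth ruling out is the default BIFF5 output, which limits a cell string to 255 characters; long URLs are an easy way to hit that and can leave the sheet looking corrupted. A sketch, assuming that package and a pre-fetched $rows array:

    <?php
    // Assumed: PEAR Spreadsheet_Excel_Writer; $rows already fetched from the database.
    require_once 'Spreadsheet/Excel/Writer.php';

    $workbook = new Spreadsheet_Excel_Writer();
    $workbook->setVersion(8);          // BIFF8 (Excel 97+): cell strings may exceed 255 chars
    $workbook->send('export.xls');     // stream the file to the browser

    $sheet = $workbook->addWorksheet('urls');
    $r = 0;
    foreach ($rows as $row) {
        $sheet->writeString($r, 0, $row['priority']);
        $sheet->writeString($r, 1, $row['url']);
        $r++;
    }

    // close() must run to completion, otherwise the downloaded file is truncated and unreadable.
    $workbook->close();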