I want to read the content from a website and save it into a CSV file in PHP. Can anyone please tell me how I can do this?
A:
How do you want to save a website's content as a CSV file? CSV means comma-separated values, and it's really easy to save data as CSV - but a website's content?
You say "the content from a website" - normally you'd start by reading a site's full content, which includes HTML markup, scripts and styles. Or do you only want the text content, or some metadata?
If your server supports opening URLs via fopen (php.ini option: allow_url_fopen), I'd try that first - otherwise you'll have to use cURL or something similar.
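A minimal sketch of that fetch logic, assuming allow_url_fopen or the php-curl extension is available; `fetch_page` is just an illustrative helper name, not a built-in:

```php
<?php
// Fetch a page's HTML: prefer file_get_contents() when the
// allow_url_fopen ini option is enabled, fall back to cURL otherwise.
function fetch_page(string $url): string
{
    if (ini_get('allow_url_fopen')) {
        $html = file_get_contents($url);
        if ($html !== false) {
            return $html;
        }
    }
    // Fallback: cURL (requires the php-curl extension to be installed).
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $html = curl_exec($ch);
    curl_close($ch);
    if ($html === false) {
        throw new RuntimeException("Could not fetch $url");
    }
    return $html;
}
```

file_get_contents() also accepts local paths and other stream wrappers, which makes the helper easy to test without network access.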
There's plenty more information about reading websites in PHP available online. Regarding storing websites as CSV, I think you should be more precise about what you want to achieve.
Regards, Daniel
dhh
2010-08-31 07:46:58
I want the website content only. The websites have some tables etc., so I want to get the table data into a CSV file.
Manoj Singhal
2010-08-31 07:51:12
@Manoj scraping data from undefined table structures is a hugely complex issue. There is no general solution.
Pekka
2010-08-31 07:55:58
http://www.cricinfo.com/rankings/content/page/211271.html - this website has 2 tables; I want to save the table data into a CSV file.
Manoj Singhal
2010-08-31 08:05:00
Hm, you should first get the website's content as described in my post above. Then you'll have to parse the HTML markup with some kind of regular expression (you'll find dozens of examples all over the internet - have a look at [this][1]). [1]: http://blog.mspace.fm/2009/10/14/parse-an-html-table-with-php/
dhh
2010-08-31 08:11:24
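As an alternative to the regex approach suggested above, PHP's bundled DOMDocument class can pull rows out of messy real-world markup more reliably. A rough sketch, assuming the fetched HTML is already in a string; `table_to_csv` is an illustrative name, and the actual structure of the cricinfo tables may need extra handling:

```php
<?php
// Extract all <tr> rows from an HTML string and write their <td>/<th>
// cell text to a CSV file using fputcsv().
function table_to_csv(string $html, string $csvPath): int
{
    $doc = new DOMDocument();
    // Suppress warnings: real pages rarely contain perfectly valid HTML.
    @$doc->loadHTML($html);

    $out = fopen($csvPath, 'w');
    $rows = 0;
    foreach ($doc->getElementsByTagName('tr') as $tr) {
        $cells = [];
        foreach ($tr->childNodes as $node) {
            if (in_array($node->nodeName, ['td', 'th'], true)) {
                $cells[] = trim($node->textContent);
            }
        }
        if ($cells) {
            fputcsv($out, $cells);
            $rows++;
        }
    }
    fclose($out);
    return $rows; // number of CSV rows written
}
```

With two tables on the page, this flattens both into one file; to keep them separate you'd first select each `<table>` element and process its rows individually.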