views: 25
answers: 2
I want to read the content from a website and save it into a CSV file in PHP. Can anyone please tell me how I can do this?

+2  A: 

How do you want to save a website's content as a CSV file? CSV means comma-separated values; saving tabular data as CSV is easy enough, but a whole website's content?

You say "the content from a website" - normally you'll start reading one sites content, which includes html markup, scripts and styles. Or do you only want to get the text contents or some meta data?

If your server supports opening URLs via fopen (php.ini option: allow_url_fopen), I'd try that first - otherwise you'll have to use cURL or something similar.
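For example, a minimal sketch (the URL is just a placeholder, and real code would want more error handling):

    <?php
    // Fetch a page with file_get_contents() if allow_url_fopen is
    // enabled, otherwise fall back to the cURL extension.
    $url = 'http://www.example.com/';

    if (ini_get('allow_url_fopen')) {
        $html = file_get_contents($url);
    } else {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
        $html = curl_exec($ch);
        curl_close($ch);
    }

    if ($html === false) {
        die('Could not fetch ' . $url);
    }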

Here's some more information about reading websites in PHP. Regarding storing websites as CSV, I think you should be more precise about what you want to achieve.

Regards, Daniel

dhh
I want the website content only. The websites have some tables etc., so I want to get the table data into a CSV file.
Manoj Singhal
@Manoj scraping data from undefined table structures is a hugely complex issue. There is no general solution.
Pekka
http://www.cricinfo.com/rankings/content/page/211271.html - this website has 2 tables; I want to save the table data into a CSV file.
Manoj Singhal
Hm, you should first get the website's content as described in my post above. Then you'll have to parse the HTML markup with some kind of regular expression (you'll find dozens of examples all over the internet - have a look at [this][1]). [1]: http://blog.mspace.fm/2009/10/14/parse-an-html-table-with-php/
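For illustration, a rough sketch of the regex approach (assuming $html holds the fetched page source; regexes like these break easily on messy real-world markup, so the DOM approach in the answer below is more robust):

    <?php
    // Pull every table row out of $html, then every cell out of each row.
    preg_match_all('#<tr[^>]*>(.*?)</tr>#si', $html, $rowMatches);

    $rows = array();
    foreach ($rowMatches[1] as $rowHtml) {
        preg_match_all('#<t[dh][^>]*>(.*?)</t[dh]>#si', $rowHtml, $cellMatches);
        // strip any remaining tags so only the cell text is left
        $rows[] = array_map('strip_tags', $cellMatches[1]);
    }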
dhh
+2  A: 

There's no instant-magic answer to your question. We (you) need to know which website is in question and how the table is presented. Once you know everything about your scenario, you should use PHP's DOM functions to parse the table and then export it to CSV.
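A minimal sketch of that approach (assuming $html holds the fetched page source; output.csv is just an example filename):

    <?php
    // Load the HTML into a DOMDocument, walk the rows of the first
    // table, and write each row to a CSV file with fputcsv().
    $dom = new DOMDocument();
    libxml_use_internal_errors(true); // real-world HTML is rarely valid, so suppress parse warnings
    $dom->loadHTML($html);
    libxml_clear_errors();

    $fp = fopen('output.csv', 'w');
    $tables = $dom->getElementsByTagName('table');
    if ($tables->length > 0) {
        foreach ($tables->item(0)->getElementsByTagName('tr') as $tr) {
            $row = array();
            foreach ($tr->childNodes as $cell) {
                // keep only <td> and <th> element nodes
                if ($cell->nodeType === XML_ELEMENT_NODE
                        && in_array($cell->nodeName, array('td', 'th'))) {
                    $row[] = trim($cell->textContent);
                }
            }
            if ($row !== array()) {
                fputcsv($fp, $row);
            }
        }
    }
    fclose($fp);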

fabrik