views:

50

answers:

2

If I want to build a complex website like Google News, which gathers data from other websites (data mining, crawling), in which language should I build the website?

Currently I know only PHP. Can I do that in PHP?

A: 

It sounds like you need to build two apps: something to crawl the web and store the data in a database, then a website to display the collected data. I would use Perl to crawl the web for its good string-manipulation features.
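The two-app split above can be sketched as a shared database that the crawler writes to and the website reads from. This is a minimal illustration in Python (the language the later comments converge on), with a hypothetical schema and helper names; a real setup would use a file-backed or server database rather than in-memory SQLite:

```python
import sqlite3

# Hypothetical shared store: the crawler inserts pages, the website queries them.
conn = sqlite3.connect(":memory:")  # use a file path in practice
conn.execute("CREATE TABLE pages (url TEXT PRIMARY KEY, title TEXT, body TEXT)")

def store_page(url, title, body):
    """Called by the crawler for each page it fetches."""
    conn.execute("INSERT OR REPLACE INTO pages VALUES (?, ?, ?)",
                 (url, title, body))
    conn.commit()

def latest_pages(limit=10):
    """Called by the website to display the collected data."""
    return conn.execute("SELECT url, title FROM pages LIMIT ?",
                        (limit,)).fetchall()

store_page("http://example.com/news", "Example headline", "Story text...")
print(latest_pages())  # [('http://example.com/news', 'Example headline')]
```

The point is only that the crawler and the display site are independent programs coupled through the database, so each can be written, run, and scaled separately.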

BG100
Can I do that in Python as well? I am confused whether to opt for Python or Perl, because I am also thinking of applying some intelligent algorithms using Python, like Google does.
Mirage
One thing also: so it means it is not necessary to have one script or application doing everything? I can have a combination of many?
Mirage
If you're new to either language, I'd go with Python. It has a much simpler syntax, and has a larger community and collection of libraries. You'll find Perl's syntax is much more complicated without adding any extra value, and its userbase is shrinking because people are switching to newer languages like Python or Ruby.
Chris S
+1  A: 

Python is a great language for both of these tasks. I can't easily name all the available packages out there, but the first that come to mind for web crawling are Mechanize and BeautifulSoup. Orange and NLTK implement several data mining algorithms.
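To make the crawling half concrete without depending on the third-party packages named above, here is a sketch of the core parsing step using only Python's standard-library `html.parser` (BeautifulSoup and Mechanize wrap this kind of work in a friendlier API); the sample HTML string is invented for illustration:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags -- the heart of a simple crawler."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

html = '<p><a href="http://example.com/a">A</a> <a href="/b">B</a></p>'
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # ['http://example.com/a', '/b']
```

A crawler would fetch each discovered URL in turn (e.g. with `urllib.request`), feed the response through a parser like this, and store the results, which is exactly the loop BeautifulSoup and Mechanize help streamline.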

Chris S