I'm thinking of making a little web tool for analyzing the search engine optimization (SEO) and web accessibility of a whole website.

First of all, this is just a private tool for now. Crawling a whole website takes up a lot of resources and time. I've found out that wget is the best option for downloading the markup for a whole site.
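
For reference, a typical mirroring invocation looks something like this (all flags are documented in the wget manual; --wait is there just to be polite to the server):

    wget --mirror --page-requisites --adjust-extension \
         --convert-links --no-parent --wait=1 \
         --directory-prefix=crawl/ http://example.com/

That fetches each page plus the assets it needs, rewrites links so the mirror can be browsed locally, and stays within the starting path.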

I plan on using PHP/MySQL (maybe even CodeIgniter), but I'm not quite sure that's the right way to do it. There's always someone who recommends Python, Ruby or Perl, but I only know PHP and a little bit of Rails.

I've also found a great HTML DOM parser class written in PHP on SourceForge.
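
If that's the PHP Simple HTML DOM Parser (a guess based on the SourceForge mention), a first accessibility check, flagging images without alt text per WCAG 1.1.1, could look roughly like this. The include path and the page path under crawl/ are placeholders:

    <?php
    // Sketch assuming PHP Simple HTML DOM Parser.
    include 'simple_html_dom.php';

    // Parse one page saved by the wget mirror.
    $html = file_get_html('crawl/example.com/index.html');

    // WCAG 1.1.1: every <img> needs a text alternative.
    foreach ($html->find('img') as $img) {
        if (!$img->hasAttribute('alt')) {
            echo "Missing alt attribute: {$img->src}\n";
        }
    }

The same find() selector approach should extend to most of the static checks this kind of tool needs.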

But the thing is, I need some feedback on what I should and shouldn't do: everything from how I should design the crawl process to what I should be checking for with regard to SEO and WCAG.
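
To make that concrete, here's the sort of per-page checklist I have in mind, sketched with the same parser (the length threshold is my own placeholder, not an official rule):

    <?php
    // Sketch: returns a list of SEO/WCAG issues for one parsed page.
    function check_page($html) {
        $issues = array();

        // SEO: a single, reasonably short <title>.
        $title = $html->find('title', 0);
        if (!$title) {
            $issues[] = 'No <title> element';
        } elseif (strlen(trim($title->plaintext)) > 70) {
            $issues[] = '<title> longer than ~70 characters';
        }

        // SEO: a meta description should be present.
        if (!$html->find('meta[name=description]', 0)) {
            $issues[] = 'No meta description';
        }

        // SEO/accessibility: heading structure should start with one <h1>.
        if (count($html->find('h1')) != 1) {
            $issues[] = 'Page does not have exactly one <h1>';
        }

        return $issues;
    }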

So, what comes to your mind when you hear this?

+1  A: 
skyflyer