views: 390
answers: 5

I am working on a course leaflet system for the college I work at. Leaflets are stored in a database with the primary key course_code. Ideally, I would like the leaflets to get indexed by Google. How would I achieve this, assuming I develop the system in ASP.NET 2.0?

I understand that part of getting it indexed is to pass the variables in the link (in my case the course_code), which also conveniently allows bookmarking of course leaflets. What are the specifics of getting the Googlebot to crawl the system best?

+3  A: 

Look into Google's Webmaster Tools.

chills42
I wasn't aware of this; it is very helpful.
PeteT
A: 

If the Google bot is able to crawl your page and get everywhere on your site just by following links, without filling in any forms or running any JavaScript, you should be good to go.

(Disclaimer: Although I work for Google, I haven't looked at what the crawler does, and know little beyond public knowledge.)

Jon Skeet
A: 

One big thing is to use a URL-rewriting scheme, if you can, to avoid URLs like

http://www.yoursite.com/default.aspx?course_code=CIS612

With rewriting you could turn that into something like

http://www.yoursite.com/courses/CIS612.aspx

That kind of URL really helps, as query strings are not ideal for crawlers.

UrlRewriting.net is a good place to start with rewriters.
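In ASP.NET 2.0 the rewrite itself is typically done in an HttpModule or in Application_BeginRequest by calling Context.RewritePath; the mapping is just a pattern substitution. Here is a minimal, language-agnostic sketch of that mapping in Python (the /courses/ path shape and the course_code parameter name are taken from the example URLs above; the function name is hypothetical):

```python
import re

# Map a friendly URL like /courses/CIS612.aspx back to the real
# query-string URL the application actually serves internally.
COURSE_PATTERN = re.compile(r"^/courses/(?P<code>[A-Za-z0-9]+)\.aspx$")

def rewrite(path: str) -> str:
    """Return the internal URL for a friendly course URL,
    or the path unchanged if it does not match."""
    match = COURSE_PATTERN.match(path)
    if match:
        return f"/default.aspx?course_code={match.group('code')}"
    return path
```

The crawler and your visitors only ever see the friendly form; the application keeps serving the same page it did before.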

Mitchel Sellers
A: 

Add a sitemap.xml file; it helps a lot.
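A sitemap follows the sitemaps.org protocol: a urlset of url entries, each with at least a loc. A minimal sketch for one course leaflet (the URL is the hypothetical rewritten form from the earlier answer; changefreq is optional):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/courses/CIS612.aspx</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Since the leaflets live in a database, you would generate one url entry per course_code. You can submit the sitemap in Google's Webmaster Tools or point to it from robots.txt with a Sitemap: line.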

The Alpha Nerd
A: 

Use wget to crawl your site:

wget -r www.example.com

If wget does not reach some of your URLs, then Google is unlikely to reach them.

Liam