I have blog posts stored in an Access database. They are displayed dynamically when ASP pages are loaded, so there are no publicly accessible files containing the bodies of the blog posts except at the moment a user requests a page. When search engines index my site, how can I ensure that the blog content is indexed and kept up to date?

A: 

You could submit a sitemap to each search engine you want to index your site. A sitemap explicitly tells the search engine each URL you want it to look at.

  • Sitemaps on Wikipedia. For example, using Google Webmaster Tools you can manually resubmit the sitemap so Google knows about new content, although it will also re-download the sitemap periodically on its own. A sketch of generating one follows below.
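
Since the posts already live in the Access database, the sitemap itself can be generated dynamically too. Here is a minimal classic-ASP sketch; the database path, the table name (Posts), the column names (PostID, LastModified) and the example.com URL are all assumptions you would adapt to your own schema:

    <%@ Language=VBScript %>
    <%
    ' sitemap.asp - minimal sketch; file path, table and column names are assumed
    Response.ContentType = "text/xml"

    Dim conn, rs, d
    Set conn = Server.CreateObject("ADODB.Connection")
    conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
              "Data Source=" & Server.MapPath("blog.mdb")
    Set rs = conn.Execute("SELECT PostID, LastModified FROM Posts")

    Response.Write "<?xml version=""1.0"" encoding=""UTF-8""?>" & vbCrLf
    Response.Write "<urlset xmlns=""http://www.sitemaps.org/schemas/sitemap/0.9"">" & vbCrLf

    Do While Not rs.EOF
        d = rs("LastModified")
        Response.Write "  <url>" & vbCrLf
        ' One unique URL per post
        Response.Write "    <loc>http://www.example.com/post.asp?id=" & rs("PostID") & "</loc>" & vbCrLf
        ' <lastmod> uses the W3C date format: YYYY-MM-DD
        Response.Write "    <lastmod>" & Year(d) & "-" & Right("0" & Month(d), 2) & "-" & Right("0" & Day(d), 2) & "</lastmod>" & vbCrLf
        Response.Write "  </url>" & vbCrLf
        rs.MoveNext
    Loop

    Response.Write "</urlset>"
    rs.Close : conn.Close
    %>

You could then submit http://www.example.com/sitemap.asp as the sitemap URL, so a fresh list of posts is produced every time the search engine fetches it.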

When a search engine bot accesses a page, it sees exactly what a user's browser would receive, i.e. whatever the web server sent back. It doesn't matter whether that response was generated dynamically or not.

Binary Nerd
+1  A: 

Search engines essentially see what a user would see when they access the page (minus styles, images, etc.).

So as long as the URL for each blog post is unique, and there are links pointing to that URL, search engines will be able to index the posts; a sketch of such an index page follows below.
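
As a concrete illustration, here is a minimal classic-ASP sketch of an index page that gives every post its own crawlable link (the database path, the table name Posts, and the column names PostID and Title are assumptions):

    <%@ Language=VBScript %>
    <%
    ' index.asp - minimal sketch; table and column names are assumed
    Dim conn, rs
    Set conn = Server.CreateObject("ADODB.Connection")
    conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
              "Data Source=" & Server.MapPath("blog.mdb")
    Set rs = conn.Execute("SELECT PostID, Title FROM Posts ORDER BY PostID DESC")
    %>
    <html>
    <body>
      <h1>Blog</h1>
      <ul>
      <% Do While Not rs.EOF %>
        <!-- A unique, stable URL per post is what the crawler indexes -->
        <li><a href="post.asp?id=<%= rs("PostID") %>"><%= Server.HTMLEncode(rs("Title")) %></a></li>
      <%   rs.MoveNext
         Loop
         rs.Close : conn.Close %>
      </ul>
    </body>
    </html>

A crawler that knows about this page will follow each link and index every post it finds.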

One thing you can do is sign up for Google's Webmaster Tools and use the "Fetch as Googlebot" feature to see exactly what would be returned to the Google crawler.

Luke Lowrey
Cool, didn't know about the webmaster tools.
harpo
+1  A: 

The fact that the pages are dynamically generated isn't something that a search engine could tell just by looking at your page. For all Google knows, it could be a static page and you have an army of interns updating the "Recent Comments" section of the sidebar. Google doesn't care where the content comes from as long as it's served up as (X)HTML with a sensible URL and some other page Google knows about links to it.

Chuck