Hi there!

I have this Drupal website that revolves around a document database. By design you can only find these documents by searching the site. But I want all the documents to be indexed by Googlebot and other crawlers, so I was thinking: what if I make a page that lists all the documents, and then tell the robots to visit that page so every document gets indexed?

Is this possible, or is there a better way to do it?

+3  A: 

Create an XML sitemap and submit it to Google. Other search engines support this as well.
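
For reference, a minimal sitemap file following the sitemaps.org format looks roughly like this (the example.com URLs below are placeholders for your own document pages):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://example.com/document/123</loc>
        <lastmod>2010-05-01</lastmod>
        <changefreq>monthly</changefreq>
      </url>
      <url>
        <loc>http://example.com/document/124</loc>
      </url>
    </urlset>

You then submit the sitemap's URL through Google Webmaster Tools (the other engines have their own submission forms).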

Fabian
Beat me to it :)
Jeriko
+3  A: 

Perhaps a Sitemap

Google introduced Google Sitemaps so web developers can publish lists of links from across their sites. The basic premise is that some sites have a large number of dynamic pages that are only available through the use of forms and user entries. The Sitemap files can then be used to indicate to a web crawler how such pages can be found. Google, Bing, Yahoo and Ask now jointly support the Sitemaps protocol.
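
Besides submitting the file directly, the Sitemaps protocol also supports autodiscovery: crawlers will pick up the sitemap if you reference it from your robots.txt (the URL below is a placeholder):

    Sitemap: http://example.com/sitemap.xml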

Don
Perfect, thanks. :)
Ace
+1  A: 

In fact, you might be interested in a Drupal sitemap builder module for this.

Jeriko
+3  A: 

You should make an XML sitemap. There is a module for this in Drupal called XML sitemap (it is used by more than 42,000 Drupal sites):

From the module:

The XML sitemap module creates a sitemap that conforms to the sitemaps.org specification. This helps search engines to more intelligently crawl a website and keep their results up to date. The sitemap created by the module can be automatically submitted to Ask, Google, Bing (formerly Windows Live Search), and Yahoo! search engines. The module also comes with several submodules that can add sitemap links for content, menu items, taxonomy terms, and user profiles.
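
If you ever wanted to generate the file yourself rather than through the module, a rough sketch could look like the following (Python, with placeholder URLs and output path; inside Drupal the XML sitemap module builds and updates this for you):

    # Rough sketch: build a sitemaps.org-style sitemap.xml from a list of
    # document URLs. The URLs and output filename are placeholders.
    import xml.etree.ElementTree as ET

    SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

    def build_sitemap(urls, outfile="sitemap.xml"):
        urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
        for loc in urls:
            url = ET.SubElement(urlset, "url")
            ET.SubElement(url, "loc").text = loc
            ET.SubElement(url, "changefreq").text = "monthly"
        ET.ElementTree(urlset).write(outfile, encoding="utf-8",
                                     xml_declaration=True)

    # Hypothetical document URLs pulled from your document database.
    build_sitemap(["http://example.com/document/%d" % i for i in range(1, 4)])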

googletorp