You're really asking about Search Engine Optimization (SEO), which has little to do with how you store your content on the server. Whether it's static HTML or a DB-driven application, search engines will still index your pages by crawling from link to link.
Some factors that do affect search engines' ability to index your site:
- Over-dependency on JavaScript to drive dynamic content. If certain blocks of information can't be rendered on the page without executing JavaScript, that's a problem: search engines typically don't execute the JS on your page; they just take the content as-is.
- Not using the proper HTML tags to represent different classes of data. An `<h1>` tag is given more emphasis by search engines than a `<p>` tag. Basically, you just need a good grasp of which HTML element to tag your content with.
- URLs. Strictly speaking, I don't think a complicated dynamic URL is a problem for search engines by itself. However, I've seen some weird content management systems that expose several different URL mappings all pointing to the same content. Search engines may well treat each of those URLs as a separate page, which can dilute your ranking.
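To make the last point concrete, here's a minimal sketch of why several URL variants can look like separate pages, and how normalizing them collapses the duplicates. The function name and sample URLs are illustrative, not from any particular CMS:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def normalize_url(url):
    """Collapse common URL variants (case, default port, trailing slash,
    query-parameter order) into one canonical form."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    netloc = netloc.lower().removesuffix(":80")
    path = path.rstrip("/") or "/"
    # Sort query parameters so ?a=1&b=2 and ?b=2&a=1 compare equal.
    query = urlencode(sorted(parse_qsl(query)))
    return urlunsplit((scheme.lower(), netloc, path, query, ""))

variants = [
    "http://Example.com:80/articles/?id=42&ref=home",
    "http://example.com/articles?ref=home&id=42",
]
print({normalize_url(u) for u in variants})  # one entry, not two
```

In practice you'd fix this on the server side (one canonical URL per page, or a `rel="canonical"` link) rather than hoping the search engine normalizes for you.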
There are other factors. I suggest searching for "accessible web content" on Google to dig further.
As for flat files vs DB-driven content, think about how you're going to manage the system. At the end of the day, it's your own labor (or your subordinates'). I, for one, sure don't want to spend my time managing content manually. So, a convenient content management system is pretty much mandatory. I know that there are a couple of Wiki implementations that write directly to flat files. As long as the management part of it is good enough, I'm sure they'd be fine for your purposes.
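For a sense of how simple the flat-file side can be, here's a minimal sketch: one file per page under a content directory, read and written directly. The directory name and slug scheme are assumptions for illustration, not how any particular wiki implementation does it:

```python
from pathlib import Path

CONTENT_DIR = Path("content")  # assumed layout: one file per page slug

def save_page(slug, body):
    """Write a page's HTML straight to disk."""
    CONTENT_DIR.mkdir(exist_ok=True)
    (CONTENT_DIR / f"{slug}.html").write_text(body, encoding="utf-8")

def load_page(slug):
    """Read a page's HTML back from disk."""
    return (CONTENT_DIR / f"{slug}.html").read_text(encoding="utf-8")

save_page("about", "<h1>About us</h1><p>Plain static content.</p>")
print(load_page("about"))
```

The storage layer really is that trivial; the hard part, as above, is the editing, versioning, and permissions layer a proper CMS or wiki wraps around it.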