Submit a sitemap to Google. Use Google Webmaster Tools to add your site and submit a compressed sitemap.xml. This tells Google about all the URLs on your site so it can crawl them. You can also monitor how often Google crawls your site and whether it encounters any errors doing so.
EDIT: If you're worried about the sitemap being too large, you can generate a sitemap with a single URL pointing to a master index page. That index page can be regenerated once a day or on demand, and can be segmented however you like; it simply acts as the starting point for a Google crawl. For example, it could present the characters A, B, C, D, E, ..., Z as links to pages that each list all pages starting with that character. The exact scheme doesn't matter; do whatever best conserves your database resources.
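As a sketch of the single-URL approach, the sitemap could contain nothing but the master index page (the domain and path here are hypothetical placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/master-index</loc>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```

Google follows the links it finds on that index page, so everything reachable from it gets discovered even though only one URL is declared.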
The key is to get a sitemap.xml into Google's system so they know when and how often to crawl you. There are all sorts of intricacies to generating a sitemap. The above approach with one URL is crude, but it can work. Ideally you'd generate a sitemap with every URL in your system sorted by priority, but that isn't required.
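The fuller approach, every URL with a priority, is easy to sketch. Here's one way it might look in Python; the URLs and priority values are made-up examples, and in practice the entries would come from your database:

```python
from xml.sax.saxutils import escape

def build_sitemap(entries):
    """Build a sitemap.xml string from (url, priority) pairs.

    Priority is a relative hint between 0.0 and 1.0, so we sort
    highest-priority URLs first.
    """
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url, priority in sorted(entries, key=lambda e: e[1], reverse=True):
        lines.append('  <url>')
        lines.append(f'    <loc>{escape(url)}</loc>')       # escape &, <, >
        lines.append(f'    <priority>{priority:.1f}</priority>')
        lines.append('  </url>')
    lines.append('</urlset>')
    return '\n'.join(lines)

# Hypothetical entries; replace with a query over your own pages.
sitemap = build_sitemap([
    ("http://example.com/pages/a", 0.5),
    ("http://example.com/", 1.0),
])
print(sitemap)
```

You'd write the result out (gzipped, if it's large) to wherever Webmaster Tools is pointed.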
Look at the sitemap specification for more information. If you just want to seed Google, use the one-URL approach to get going.