Sitemaps must be no larger than 10MB and list no more than 50,000 URLs, so you'll need to break yours up somehow. That means some kind of sharding strategy. I don't know what your data looks like, so for now let's say that every time you create a page entity, you assign it a random integer between 1 and 500.
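A sketch of that assignment step (the field name `sitemap_shard` and the dict-based page record are stand-ins for whatever your real datastore model looks like):

```python
import random

NUM_SHARDS = 500  # number of sitemap shards, matching the 1-500 range above

def new_page(url):
    """Create a page record, assigning a random shard at creation time.

    The dict stands in for your real datastore entity; 'sitemap_shard'
    is a made-up field name.
    """
    return {
        "url": url,
        "sitemap_shard": random.randint(1, NUM_SHARDS),  # inclusive on both ends
    }
```

Since you'll filter on `sitemap_shard` at query time, make sure that property is indexed.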
Next, create a sitemap index and emit one sitemap link per shard value:
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://example.appspot.com/sitemap?random=1</loc>
  </sitemap>
  <sitemap>
    <loc>http://example.appspot.com/sitemap?random=2</loc>
  </sitemap>
  ...
  <sitemap>
    <loc>http://example.appspot.com/sitemap?random=500</loc>
  </sitemap>
</sitemapindex>
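Rather than hand-writing 500 entries, you'd render the index with a loop. A minimal Python sketch, assuming plain string building (the helper name is mine; swap in your templating of choice):

```python
def build_sitemap_index(base_url, num_shards):
    """Render a sitemap index that points at one sitemap URL per shard."""
    parts = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for shard in range(1, num_shards + 1):
        parts.append('  <sitemap>')
        parts.append('    <loc>%s/sitemap?random=%d</loc>' % (base_url, shard))
        parts.append('  </sitemap>')
    parts.append('</sitemapindex>')
    return "\n".join(parts)

index_xml = build_sitemap_index("http://example.appspot.com", 500)
```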
Finally, on each sitemap page, query for pages filtered by that shard value. With 100,000 pages spread over 500 shards, each sitemap will contain roughly 200 URLs.
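The core of that handler might look like the sketch below, again with a dict-based page record standing in for your real entities (the function and field names are my own):

```python
def render_sitemap(pages, shard):
    """Render the <urlset> for one shard by filtering on the random index."""
    urls = [p["url"] for p in pages if p["sitemap_shard"] == shard]
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append('  <url><loc>%s</loc></url>' % url)
    lines.append('</urlset>')
    return "\n".join(lines)
```

In a real App Engine handler you'd replace the in-memory filter with a datastore query on the shard property rather than loading every page.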
A slightly different strategy would be to give each page an auto-incrementing numeric ID. For that you need a counter entity that is read and incremented inside a transaction each time a new page is created. The downside is that you can't parallelize page creation, since every insert contends on the same counter. The upside is more control over how your pages are laid out: your first sitemap could be pages 1-1000, the next 1001-2000, and so on.
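To illustrate why the counter serializes creation, here is an in-memory analog with a lock standing in for the datastore transaction (the class name is mine; on App Engine the count would live in a single entity updated inside a transaction):

```python
import threading

class PageCounter:
    """In-memory analog of a transactionally updated counter entity.

    On App Engine you'd store the count in one datastore entity and
    increment it inside a transaction; the lock here plays the same
    role, so every page creation must pass through it one at a time.
    """
    def __init__(self):
        self._lock = threading.Lock()
        self._count = 0

    def next_id(self):
        with self._lock:  # all creators contend here, hence no parallelism
            self._count += 1
            return self._count

counter = PageCounter()
first_page_id = counter.next_id()  # IDs come out as 1, 2, 3, ...
```

Because the IDs are dense and sequential, the sitemap for "pages 1-1000" is just a range query rather than a filter on a random shard value.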