What Python frameworks/libraries would you use to create a data-centric REST application? The application will have the following traits:
- read-biased in the number of individual ad-hoc requests
- write-biased in the number of records, which arrive in batch feeds
- scalable (think virtual appliance, cloud)
- variety of data formats such as CSV files, XML, Excel, SQL RDBMS, JSON - think of a uniform shell over disparate storage formats
- multiple RDBMS instances accessible at the same time
- caching - I am biased toward an out-of-the-box HTTP cache like Varnish, but if someone could comment on the trade-offs vs. memcached that would be helpful
- search - the non-Python Apache Solr (Lucene) is what I have in mind, but I am curious to find out about other options
- the application will generally have no GUI, but I would like to be able to create web-based usage demos with ease
- interoperability with SOAP-based web services (not a high priority)
- URL inference from the model - I am not sure how feasible this will be; after all, a URL is not a query language, but it would be nice if simple filtering did not require stating URL patterns explicitly
For the main framework I have my eye on web.py, Werkzeug and Pylons. I used Django in the past and like how well supported it is. Some projects such as django-storage could be a great shortcut, but I am uncertain whether its main focus on display/web GUI apps makes it the right choice for me.