I am trying to build a Django service to which numerous clients will send data. Each client will represent an authenticated user who may or may not be connected to the internet, so the client will aggregate the data and send it when a connection is available. The data should also be persisted locally so that it can be accessed quickly without hitting the server.

The nature of the data is simple. It has to do with game achievements, so each user will have a collection of achievements they have earned. As a consequence, there are no consistency issues: each user will be sending only their own achievement stats, and no user will edit someone else's data.

I am trying to find the most suitable medium for this. My first thought was HTTP POST requests, which the Django server would handle. A Python client would log in and send data by performing these requests. Can anyone suggest better alternatives, or give me reasons why this setup is or isn't suitable?

I'd also like to know what format you would suggest for the data sent from the client side. I was thinking JSON or YAML. Something like the sketch below is roughly what I have in mind.
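(A rough sketch only; the endpoint URL, function names, and login mechanism here are made up for illustration.)

```python
import json
import urllib.request

def send_achievements(base_url, session_cookie, achievements):
    # POST the locally aggregated achievements to the server as a JSON payload.
    body = json.dumps({'achievements': achievements}).encode('utf-8')
    request = urllib.request.Request(
        base_url + '/api/achievements/',            # hypothetical endpoint
        data=body,
        headers={
            'Content-Type': 'application/json',
            'Cookie': session_cookie,               # assumes a session cookie from a prior login
        },
    )
    with urllib.request.urlopen(request) as response:
        return response.status == 201
```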

EDIT 2: This question has been revamped after S.Lott's recommendation.

A: 

I believe XML-RPC would be a valid solution for this. Here is an example: http://code.djangoproject.com/wiki/XML-RPC We have used it at work and it works pretty well, as our server also provides some services.
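A minimal sketch of wiring the standard library's dispatcher into a Django view (the exposed method and its body are made up; see the linked page for a fuller version):

```python
# Expose one XML-RPC method from a Django view using the stdlib dispatcher
# (Python 3 module path shown; Python 2 used SimpleXMLRPCServer).
from xmlrpc.server import SimpleXMLRPCDispatcher
from django.http import HttpResponse
from django.views.decorators.csrf import csrf_exempt

dispatcher = SimpleXMLRPCDispatcher(allow_none=False, encoding=None)

def submit_achievement(user_id, achievement_id):
    # Hypothetical handler: persist the achievement for this user here,
    # e.g. Achievement.objects.create(user_id=user_id, achievement_id=achievement_id)
    return True

dispatcher.register_function(submit_achievement, 'submit_achievement')

@csrf_exempt
def rpc_endpoint(request):
    # XML-RPC calls arrive as POSTed XML; the dispatcher routes them by method name.
    if request.method == 'POST':
        return HttpResponse(dispatcher._marshaled_dispatch(request.body),
                            content_type='text/xml')
    return HttpResponse('This endpoint accepts XML-RPC POST requests.')
```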

gruszczy
Hm. I mainly need to pass data between the server and clients, not make remote procedure calls. This seems a bit like using a wrench as a hammer. I might be wrong.
FrontierPsycho
Why do you want to use Django then? Why not simply write your own server in Python?
gruszczy
I have a database and also want to have a website exposing the data to the world. This will be a kind of social networking site.
FrontierPsycho
+2  A: 

Many folks like Piston for this.

We rolled our own (Piston hadn't been published yet). Yes, you can trivially handle a RESTful POST request with a JSON payload in Django. However, handling REST in general is a pain in the neck, because dispatching to a view function based on method (GET, POST, PUT or DELETE) isn't part of Django. You can roll your own method-based dispatcher, but in the long run you'll be unhappy with a roll-your-own solution. A hand-rolled version of the "trivial" case is sketched below.
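For what it's worth, the hand-rolled version looks something like this (the URL, status codes, and model call are made up for illustration); it's exactly this per-method branching that Piston takes off your hands:

```python
import json
from django.http import HttpResponse, HttpResponseNotAllowed
from django.views.decorators.csrf import csrf_exempt

@csrf_exempt
def achievements(request):
    if request.method == 'POST':
        payload = json.loads(request.body)   # e.g. {"achievement_id": 42}
        # Hypothetical model call: record the achievement for request.user, e.g.
        # Achievement.objects.create(user=request.user, **payload)
        return HttpResponse(json.dumps({'ok': True}),
                            content_type='application/json', status=201)
    elif request.method == 'GET':
        # Return the caller's achievements as JSON (model query omitted here).
        return HttpResponse(json.dumps([]), content_type='application/json')
    return HttpResponseNotAllowed(['GET', 'POST'])
```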

Piston is not "extra overhead". What you're describing is not actually simple. Piston is the right amount of overhead for this.


"a client that aggregates these calls in a local database and then syncs them with the server." Too much complexity.

If these events happen at an unthinkably huge rate (hundreds per second), then you'll need a multi-process Apache front end running several Django back ends through mod_wsgi.

But having clients that synchronize to a central database will be too complex. There are too many open questions about clients collecting some data and then crashing, or clients syncing twice because someone ran the application twice when they shouldn't have. Too many issues with "database synchronization". Avoid it.

S.Lott
Piston looks nice, but do I really need the extra overhead for these simple operations?
FrontierPsycho
This is very interesting, thank you. However, this app, as I visualize it, must work offline as well, so the aggregation is necessary. Also, because of the nature of the application, no two clients will ever want to write to the same row, so there probably won't be any consistency issues. Each user will have an account, and their client will write data pertaining only to them (with their own user_id). I feel my English has failed me a bit, but I hope I made myself clear. However, I will definitely look into Piston. Thank you.
FrontierPsycho
@FrontierPsycho: Please update your question with all these new facts. You've fundamentally changed the nature of what you're asking for.
S.Lott