I am designing a web interface to a certain hardware appliance that provides its own custom API. The web interface can manage multiple appliances at once. The data is retrieved from the appliance by polling with the custom API, so it would be preferable to make that part asynchronous.

The most obvious approach is a poller thread that polls for data and saves it into a process-wide singleton guarded by semaphores; the web server threads then read from that singleton and render it. I'm not a huge fan of singletons or mashed-together designs, so I was thinking of separating the poller data source from the web server, looping it back over the local interface, and using something like XML-RPC to consume the data.
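For reference, here is a minimal sketch of that "obvious" design: a background thread polling each appliance and publishing into a lock-protected, process-wide cache that the web handlers read. `ApplianceClient` and its `poll()` method are hypothetical stand-ins for the appliance's custom API.

    import threading
    import time


    class ApplianceCache:
        """Process-wide store shared between the poller and the web threads."""

        def __init__(self):
            self._lock = threading.Lock()
            self._data = {}

        def update(self, appliance_id, payload):
            with self._lock:
                self._data[appliance_id] = payload

        def get(self, appliance_id):
            with self._lock:
                return self._data.get(appliance_id)


    def poll_forever(cache, clients, interval=10):
        """Poll each appliance in turn and publish the results into the cache."""
        while True:
            for appliance_id, client in clients.items():
                cache.update(appliance_id, client.poll())  # hypothetical custom-API call
            time.sleep(interval)


    cache = ApplianceCache()
    # clients = {"rack-1": ApplianceClient("10.0.0.5"), ...}
    # threading.Thread(target=poll_forever, args=(cache, clients), daemon=True).start()

This is the design the question is trying to avoid, since the polling logic and the web-serving logic end up coupled through that shared object.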

The application doesn't really need to be 'enterprisey' or scalable, since it will be accessed by at most a couple of people at a time, but I'd rather make it robust by not mixing the two kinds of logic together. There's a current implementation in Python using CherryPy, and it's the biggest mishmash of terrible design I've ever seen. I feel that if I go with the most obvious design, I'll just end up reimplementing the same horrible thing in my own way.

+1  A: 

If you use Django and Celery, you can create a Django project to be the web interface and a Celery task to run in the background and do the polling. In that task you can import your Django models, so saving the polling results is straightforward.
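A hedged sketch of what that might look like: a periodic Celery task polls each appliance and saves the result through a Django model. `Appliance`, `Reading`, and `poll_appliance()` are hypothetical names standing in for your own models and custom-API client, and the broker URL is just an assumption.

    from celery import Celery

    app = Celery("appliance_poller", broker="redis://localhost:6379/0")

    # Run the polling task every 30 seconds via celery beat.
    app.conf.beat_schedule = {
        "poll-appliances": {
            "task": "tasks.poll_all_appliances",
            "schedule": 30.0,
        },
    }


    @app.task(name="tasks.poll_all_appliances")
    def poll_all_appliances():
        # Importing the Django models inside the task body; these models are
        # assumptions, not a given schema.
        from appliances.models import Appliance, Reading

        for appliance in Appliance.objects.all():
            data = poll_appliance(appliance.address)  # hypothetical custom-API call
            Reading.objects.create(appliance=appliance, payload=data)


    def poll_appliance(address):
        """Placeholder for the vendor's custom polling API."""
        raise NotImplementedError

The Django views then just query `Reading` like any other model, so the web interface and the poller stay in separate processes without a hand-rolled XML-RPC layer.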

Hank Gay
Celery looks like the high-level solution to my problem, thanks a lot :)
Novikov