Hello, I would like to share small amounts of data (< 1K) between Python processes. The data is physical PC/104 IO data which changes rapidly and often (24x7x365). There will be a single "server" writing the data and multiple clients reading portions of it. The system this will run on uses flash memory (a CF card) rather than a hard drive, so I'm worried about wearing out the flash memory with a file-based scheme. I'd also like to use less power (processor time) as we are 100% solar powered.
- Is this a valid worry? We could possibly change the CF card to an SSD.
- Does changing a value via mmap physically write the data to disk, or is this a virtual file?
- We will be running on Debian, so perhaps the POSIX IPC for Python module (posix_ipc) is the best solution. Has anyone used it? (A rough sketch of what I have in mind follows this list.)
- Has anyone tried the Python Object Sharing (POSH) module? It looks promising at first glance, but it is in "Alpha" and doesn't seem to be under active development.
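For reference, here is roughly the sort of thing I'm considering with posix_ipc plus mmap: a shared memory segment guarded by a semaphore, which (as I understand it) lives in RAM rather than on the CF card. The segment/semaphore names, sizes, and offsets are just placeholders, and I haven't verified every detail of the API:

```python
import mmap
import struct

import posix_ipc  # third-party module

SHM_NAME = "/io_data"      # placeholder name for the shared segment
SEM_NAME = "/io_data_sem"  # placeholder name for the guarding semaphore
SIZE = 1024                # our data is < 1K

# Server side: create the segment and semaphore, then map the segment.
shm = posix_ipc.SharedMemory(SHM_NAME, posix_ipc.O_CREAT, size=SIZE)
sem = posix_ipc.Semaphore(SEM_NAME, posix_ipc.O_CREAT, initial_value=1)
buf = mmap.mmap(shm.fd, shm.size)
shm.close_fd()  # the mapping stays valid after the fd is closed

def write_value(offset, value):
    """Pack one double into the shared segment under the semaphore."""
    sem.acquire()
    try:
        struct.pack_into("d", buf, offset, value)
    finally:
        sem.release()

# A client would open the same names without O_CREAT and use
# struct.unpack_from() to read just the portion it cares about.
```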
Thank You
UPDATE: We slowed down the maximum data update rate to about 10 Hz, but more typically 1 Hz. Clients will only be notified when a value changes rather than at a constant update rate. We have gone to a multiple servers/multiple clients model where each server specializes in a certain type of instrument or function. Since it turned out that most of the programming was going to be done by Java programmers, we ended up using JSON-RPC over TCP. The servers will be written in Java, but I still hope to write the main client in Python and am investigating JSON-RPC implementations.
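For the Python client I'm picturing something along these lines: a plain socket sending JSON-RPC 2.0 requests. The host/port, the newline-delimited framing, and the "subscribe" method with its parameters are all placeholders, not the real service:

```python
import json
import socket

HOST, PORT = "localhost", 4000  # placeholder address for one of the Java servers

def call(sock, method, params, request_id):
    """Send one newline-delimited JSON-RPC 2.0 request and read the reply."""
    request = {"jsonrpc": "2.0", "method": method, "params": params, "id": request_id}
    sock.sendall((json.dumps(request) + "\n").encode("utf-8"))
    # Assumes the server also terminates each response with a newline.
    response = b""
    while not response.endswith(b"\n"):
        chunk = sock.recv(4096)
        if not chunk:
            raise ConnectionError("server closed the connection")
        response += chunk
    return json.loads(response)

with socket.create_connection((HOST, PORT)) as sock:
    # e.g. ask to be notified when one instrument's value changes
    print(call(sock, "subscribe", {"instrument": "battery_voltage"}, 1))
```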