I am trying to use RabbitMQ for a distributed system that would work something like:

  • a producer puts a JSON-formatted list of order ids into a queue
  • several consumers pull from that queue, run the business logic on those order ids and put the result (also JSON-formatted) into another queue
  • from the second queue, another consumer takes the data and passes it back to the caller
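
Very roughly, this is what I have in mind (a sketch with the pika Python client; the queue and field names are just placeholders, I haven't tested it, and the producer and workers would of course be separate processes):

    import json
    import pika

    connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
    channel = connection.channel()
    channel.queue_declare(queue='order_requests')
    channel.queue_declare(queue='order_results')

    # producer: push a batch of order ids onto the work queue
    channel.basic_publish(exchange='',
                          routing_key='order_requests',
                          body=json.dumps({'order_ids': [1, 2, 3]}))

    # worker: consume order ids, run the business logic, publish the result
    def handle_request(ch, method, properties, body):
        order_ids = json.loads(body)['order_ids']
        result = {'order_ids': order_ids, 'status': 'processed'}  # business logic goes here
        ch.basic_publish(exchange='',
                         routing_key='order_results',
                         body=json.dumps(result))
        ch.basic_ack(delivery_tag=method.delivery_tag)

    channel.basic_consume(queue='order_requests', on_message_callback=handle_request)
    channel.start_consuming()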

I am still very new to RabbitMQ and I am wondering if this model is the right approach, given that the data should come back as fast as possible (sometimes in a matter of seconds, five at most), so there are real-time requirements. Also, how large can a message passed to a queue be? The JSON that the producer gets back will be fairly large, depending on what the consumer does.

Thanks for any ideas!

+1  A: 

There's nothing wrong with the design you suggested.

The slight wrinkle is that enforcing "real time requirements" isn't straightforward. For instance, it's not currently possible to expire messages within a queue, so this would need to be handled by the clients when consuming messages.
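
One way to do that, for example, is to have the producer stamp each message with a send time and have the consumers discard anything that is already stale. A rough, untested sketch (assuming the pika Python client; the 'sent_at' field and queue name are made up for the example):

    import json
    import time
    import pika

    STALE_AFTER = 5  # seconds, to match the "max 5 seconds" requirement

    def handle_request(ch, method, properties, body):
        message = json.loads(body)
        # drop messages that have already missed their deadline instead of processing them
        if time.time() - message['sent_at'] > STALE_AFTER:
            ch.basic_ack(delivery_tag=method.delivery_tag)
            return
        # ... do the business logic and publish the result here ...
        ch.basic_ack(delivery_tag=method.delivery_tag)

    connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
    channel = connection.channel()
    channel.queue_declare(queue='order_requests')
    channel.basic_consume(queue='order_requests', on_message_callback=handle_request)
    channel.start_consuming()

The producer would then publish with body=json.dumps({'order_ids': [...], 'sent_at': time.time()}).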

The total size of messages in RabbitMQ <= 1.8.1 was bounded by the amount of available RAM. As of 2.0.0, it's bounded by the amount of available disk space (i.e. rabbit will page messages to disk if it's running low on memory). Individual message sizes are recorded as 32-bit integers (IIRC), so a single message cannot be larger than ~4GB; if that's a problem, consider saving the JSONs to network storage and passing references to them in the messages. Other than this, there aren't any constraints.
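
The reference-passing variant could look roughly like this (untested sketch; the shared directory is just an illustration, any storage all consumers can reach would do):

    import json
    import os
    import uuid

    SHARED_DIR = '/mnt/shared/results'

    def publish_large_result(channel, result):
        # write the large JSON to shared storage and send only a small reference over the queue
        result_id = str(uuid.uuid4())
        with open(os.path.join(SHARED_DIR, result_id + '.json'), 'w') as f:
            json.dump(result, f)
        channel.basic_publish(exchange='',
                              routing_key='order_results',
                              body=json.dumps({'result_id': result_id}))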

scvalex
Thanks, so is it the total of all messages that shouldn't exceed 4GB, or each individual message? If the latter, there's no problem ... I won't reach that amount ... the biggest message will probably be a couple of MB. Thanks again.
hyperboreean
Each message cannot exceed 4GB. The total size of the messages in queues cannot exceed the available disk space.
scvalex
I would also suggest considering a simpler stream-based binary format for faster processing. Encoding/decoding JSON takes time, and some JSON libraries can have problems depending on how they detect and convert data types.
Miguel Morales
+2  A: 

See page 47 in this presentation (InfoQ) for a great comparison of different messaging formats.

oluies