From a comment on the announcement blog post:

Regarding JSON: JSON is structured similarly to Protocol Buffers, but protocol buffer binary format is still smaller and faster to encode. JSON makes a great text encoding for protocol buffers, though -- it's trivial to write an encoder/decoder that converts arbitrary protocol messages to and from JSON, using protobuf reflection. This is a good way to communicate with AJAX apps, since making the user download a full protobuf decoder when they visit your page might be too much.

It may be trivial to cook up a mapping, but is there a single "obvious" mapping between the two that any two separate dev teams would naturally settle on? If two products supported PB data and could interoperate because they shared the same .proto spec, I wonder if they would still be able to interoperate if they independently introduced a JSON reflection of the same spec. There might be some arbitrary decisions to be made, e.g. should enum values be represented by a string (to be human-readable a la typical JSON) or by their integer value?
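To make the ambiguity concrete, here is a rough sketch (not taken from any particular project) of the kind of reflection-based printer the quoted comment describes. NaiveJsonPrinter is a made-up name, and it deliberately punts on repeated fields, bytes and string escaping, but it shows exactly where the enum decision has to be made:

    import com.google.protobuf.Descriptors.EnumValueDescriptor;
    import com.google.protobuf.Descriptors.FieldDescriptor;
    import com.google.protobuf.Message;

    import java.util.Map;

    // Deliberately incomplete protobuf-to-JSON printer built on protobuf
    // reflection. It skips repeated fields, bytes and string escaping --
    // just enough to show where the arbitrary mapping decisions creep in.
    public final class NaiveJsonPrinter {

        public static String print(Message msg) {
            StringBuilder sb = new StringBuilder("{");
            boolean first = true;
            for (Map.Entry<FieldDescriptor, Object> e : msg.getAllFields().entrySet()) {
                FieldDescriptor fd = e.getKey();
                if (fd.isRepeated()) {
                    continue; // left out to keep the sketch short
                }
                if (!first) {
                    sb.append(",");
                }
                first = false;
                sb.append("\"").append(fd.getName()).append("\":");
                Object value = e.getValue();
                switch (fd.getJavaType()) {
                    case ENUM:
                        // The arbitrary decision: emit the symbolic name here;
                        // another team might just as reasonably emit
                        // ((EnumValueDescriptor) value).getNumber() instead.
                        sb.append("\"").append(((EnumValueDescriptor) value).getName()).append("\"");
                        break;
                    case MESSAGE:
                        sb.append(print((Message) value)); // recurse into sub-messages
                        break;
                    case STRING:
                        sb.append("\"").append(value).append("\""); // no escaping, illustration only
                        break;
                    default:
                        sb.append(value); // numbers and booleans
                }
            }
            return sb.append("}").toString();
        }
    }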

So is there an established mapping, and any open source implementations for generating JSON encoder/decoders from .proto specs?

+1  A: 

Maybe this is helpful: http://code.google.com/p/protobuf-java-format/
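For illustration, a minimal sketch of using that library, assuming its JsonFormat class exposes the static printToString and merge methods described on the project page (exact signatures may differ between versions):

    import com.google.protobuf.Message;
    import com.googlecode.protobuf.format.JsonFormat;

    // Sketch of converting any generated protobuf message to and from JSON
    // with protobuf-java-format. The JsonFormat method names follow the
    // project's documentation and may differ between library versions.
    public final class ProtoJson {

        public static String toJson(Message message) {
            return JsonFormat.printToString(message); // Message -> JSON text
        }

        public static void fromJson(String json, Message.Builder builder) throws Exception {
            JsonFormat.merge(json, builder); // JSON text -> populated builder
        }
    }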

Pangea
A: 

From what I have seen, Protostuff is the project to use for any PB work on Java, including serializing it as JSON based on the protocol definition. I have not used it myself, just heard good things about it.
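For reference, a short sketch of what Protostuff's JSON support looks like, based on its documented JsonIOUtil and RuntimeSchema API (package names have moved between releases, so treat the imports as indicative); Person here is just a placeholder POJO:

    import io.protostuff.JsonIOUtil;
    import io.protostuff.Schema;
    import io.protostuff.runtime.RuntimeSchema;

    // Sketch of JSON serialization with Protostuff. A runtime schema is
    // derived from a plain POJO; generated Protostuff messages expose an
    // equivalent schema directly.
    public class ProtostuffJsonExample {

        static class Person {
            String name;
            int id;
        }

        public static void main(String[] args) throws Exception {
            Schema<Person> schema = RuntimeSchema.getSchema(Person.class);

            Person p = new Person();
            p.name = "Ada";
            p.id = 1;

            // 'false' = use field names as JSON keys; 'true' would use field numbers.
            byte[] json = JsonIOUtil.toByteArray(p, schema, false);
            System.out.println(new String(json, "UTF-8"));

            Person copy = schema.newMessage();
            JsonIOUtil.mergeFrom(json, copy, schema, false);
        }
    }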

StaxMan
