In the Google Protocol Buffers encoding overview, they introduce something called "ZigZag encoding", which takes signed integers of small magnitude and maps them to unsigned integers of small magnitude.
For example:
Encoded => Plain
0 => 0
1 => -1
2 => 1
3 => -2
4 => 2
5 => -3
6 => 3
And so on. The encoding function they give for this is rather clever:
(n << 1) ^ (n >> 31) // for a 32-bit integer
I understand how this works; however, I cannot for the life of me figure out how to reverse it and decode the result back into signed 32-bit integers.
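For reference, here is a minimal sketch of the encode side in C that reproduces the table above. The function name zigzag_encode32 is my own (it is not from the protobuf library), and the cast to unsigned before the left shift is just to sidestep signed-overflow issues:

#include <stdio.h>
#include <stdint.h>

/* ZigZag-encode a signed 32-bit value using the formula from the docs.
   Assumes n >> 31 is an arithmetic (sign-extending) right shift. */
static uint32_t zigzag_encode32(int32_t n)
{
    return ((uint32_t)n << 1) ^ (uint32_t)(n >> 31);
}

int main(void)
{
    /* Prints plain => encoded pairs: 0 => 0, -1 => 1, 1 => 2, -2 => 3, ... */
    int32_t samples[] = { 0, -1, 1, -2, 2, -3, 3 };
    for (int i = 0; i < 7; i++)
        printf("%d => %u\n", samples[i], zigzag_encode32(samples[i]));
    return 0;
}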