How do I invert a color mapped image?

I have a 2D image which plots data on a colormap. I'd like to read the image in and 'reverse' the color map, that is, look up a specific RGB value, and turn it into a float.

For example: using this image: http://matplotlib.sourceforge.net/_images/mri_demo.png

I should be able to get back a 440x360 matrix of floats, knowing the colormap was cm.jet.

from pylab import imread
import matplotlib.cm as cm

a = imread('mri_demo.png')
b = colormap2float(a, cm.jet)  # <- tricky part
+4  A: 

There may be better ways to do this; I'm not sure. If you read help(cm.jet) you will see the algorithm used to map values in the interval [0,1] to RGB 3-tuples. You could, with a little paper and pencil, work out formulas to invert the piecewise-linear functions which define the mapping.
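
If you want to look at those piecewise-linear segments yourself, one way (on the matplotlib versions I'm familiar with) is to inspect the colormap's segment data; _segmentdata is a private attribute, so treat this only as an inspection aid:

import matplotlib.cm as cm

# Each channel ('red', 'green', 'blue') is defined by a list of
# (x, y0, y1) anchor points; between anchors the channel is linear.
for channel, segments in cm.jet._segmentdata.items():
    print(channel, segments)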

However, there are a number of issues which make the paper and pencil solution somewhat unappealing:

  1. It's a lot of laborious algebra, and the solution is specific to cm.jet. You'd have to do all this work again if you changed the color map. How to automate the solving of these algebraic equations is an interesting problem, but not one I know how to solve.

  2. In general, the color map may not be invertible (more than one value may be mapped to the same color). In the case of cm.jet, values between 0.11 and 0.125 are all mapped to the RGB 3-tuple (0,0,1), for example. So if your image contains a pure blue pixel, there is really no way to tell if it came from a value of 0.11 or a value of, say, 0.125 (a quick check of this is sketched just after this list).

  3. The mapping from [0,1] to 3-tuples is a curve in 3-space. The colors in your image may not lie perfectly on this curve. There might be round-off error, for example. So any practical solution has to be able to interpolate or somehow project points in 3-space onto the curve.
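
As a quick check of the non-uniqueness point in item 2 (a sketch, assuming the default 256-entry lookup table behind cm.jet):

import numpy as np
import matplotlib.cm as cm

# Two different scalars from the flat blue region of cm.jet.
# Both come back as the same pure-blue RGBA tuple, so the mapping
# cannot be inverted exactly there.
print(cm.jet(0.115))   # (0.0, 0.0, 1.0, 1.0)
print(cm.jet(0.120))   # (0.0, 0.0, 1.0, 1.0)
print(np.allclose(cm.jet(0.115), cm.jet(0.120)))   # True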

Due to the non-uniqueness issue and the projection/interpolation issue, there can be many possible solutions to the problem you pose. Here is one way to resolve both:

Create a gradient which acts as a "code book": an array of RGBA 4-tuples in the cm.jet color map, whose colors correspond to values from 0 to 1.

Then use scipy's vector quantization function scipy.cluster.vq.vq to map all the colors in your image, mri_demo.png, onto the nearest color in gradient. Since a color map may use the same color for many values, the gradient may contain duplicate colors; I leave it up to scipy.cluster.vq.vq to decide which (possibly non-unique) code book index to associate with a particular color.

import pylab
import matplotlib.cm as cm
import numpy as np
import scipy.cluster.vq as scv

def colormap2arr(arr, cmap):
    # http://stackoverflow.com/questions/3720840/how-to-reverse-color-map-image-to-scalar-values/3722674#3722674
    # The code book: 100 RGBA 4-tuples sampled evenly from the colormap.
    gradient = cmap(np.linspace(0.0, 1.0, 100))

    # Reshape arr to (height*width, 4) -- one long list of RGBA 4-tuples.
    arr2 = arr.reshape((arr.shape[0]*arr.shape[1], arr.shape[2]))

    # Use vector quantization to map each pixel in arr2 to the nearest point
    # in the code book (gradient).
    code, dist = scv.vq(arr2, gradient)

    # code has length height*width and holds the code book index for each
    # observation (the rows of arr2 are the "observations").
    # Divide by (number of code book entries - 1) so the values span 0 to 1.
    values = code.astype('float')/(gradient.shape[0]-1)

    # Reshape values back to (height, width), and flip vertically since
    # imread stores rows top-to-bottom while origin='lower' plots them
    # bottom-to-top.
    values = values.reshape(arr.shape[0], arr.shape[1])
    values = values[::-1]
    return values

arr = pylab.imread('mri_demo.png')
values = colormap2arr(arr, cm.jet)
# Proof that it works:
pylab.imshow(values, interpolation='bilinear', cmap=cm.jet,
             origin='lower', extent=[-3, 3, -3, 3])
pylab.show()

The image you see should be close to reproducing mri_demo.png:

[image: output of the script above]

(The original mri_demo.png had a white border. Since white is not a color in cm.jet, scipy.cluster.vq.vq maps white to the closest point in the gradient code book, which happens to be a pale green color.)
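
If the white border is a problem, one option (a sketch, not part of the answer above; colormap2arr_masked and the 0.1 threshold are my own names/choices) is to use the second value returned by scipy.cluster.vq.vq, the distortion, to mask pixels that are far from every code book color:

import numpy as np
import numpy.ma as ma
import scipy.cluster.vq as scv

def colormap2arr_masked(arr, cmap, threshold=0.1):
    # Same idea as colormap2arr above, but also masks pixels (such as the
    # white border) whose nearest code book color is farther than `threshold`
    # in RGBA space. The threshold is a guess; tune it for your image.
    gradient = cmap(np.linspace(0.0, 1.0, 100))
    arr2 = arr.reshape((arr.shape[0]*arr.shape[1], arr.shape[2]))
    code, dist = scv.vq(arr2, gradient)
    values = code.astype('float')/(gradient.shape[0]-1)
    values = values.reshape(arr.shape[0], arr.shape[1])[::-1]
    dist = dist.reshape(arr.shape[0], arr.shape[1])[::-1]
    return ma.masked_where(dist > threshold, values)

With a masked array, pylab.imshow should leave the border blank (it is drawn with the colormap's "bad" color, transparent by default) instead of pale green.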

unutbu
yes, this is essentially what I thought was possible. Your initial solution included reading a line from an image with the same color map, which may be helpful to people who, say, scan a figure in and want to do their own numerical analysis. I was getting stuck on the vector quantization: initially, it seemed like the best bet would be to cycle through each possible color in the LUT and calculate a 3D distance from the actual pixel value, which I couldn't see how to do quickly without looping. Thanks!
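
For what it's worth, the loop-free distance computation mentioned in the last comment can also be written directly with NumPy broadcasting, which is roughly what scipy.cluster.vq.vq does for you. A minimal sketch with made-up inputs:

import numpy as np
import matplotlib.cm as cm

# Made-up inputs: a 100-entry code book and three RGBA "pixels".
gradient = cm.jet(np.linspace(0.0, 1.0, 100))   # shape (100, 4)
pixels = cm.jet(np.array([0.2, 0.55, 0.9]))     # shape (3, 4)

# Broadcasting builds an (n_pixels, n_codes) table of squared distances
# in one step; note it needs n_pixels*n_codes*4 floats of memory.
d2 = ((pixels[:, None, :] - gradient[None, :, :])**2).sum(axis=-1)
code = d2.argmin(axis=1)                  # nearest code book index per pixel
values = code/(gradient.shape[0] - 1.0)   # scale back to [0, 1]
print(values)                             # approximately [0.2, 0.55, 0.9]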