I'm writing a 2D plotter, and along the X- and Y-axes I have markers with printed values. My question is: how do I find suitable distances between these markers?
So far, my method (for a numeric axis) is:
- I know the height of the axis on screen, in pixels.
- Decide on an optimal distance between markers in pixels (whatever looks good on screen), for example 32 pixels.
- Convert that pixel distance to an axis value. (If the axis is 320 pixels long and its range is 0-40 °C, for example, the value is 32 * (40/320) = 4, so the optimal distance is 4 °C.)
- Start with a distance of 1 °C. If this distance is bigger than the optimal distance, divide it by two until it is less than the optimal distance; likewise, if it is less than the optimal distance, multiply it by two until it is larger than the optimal distance. (Sketched in code below.)
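A minimal sketch of that procedure, assuming Python; the name `current_tick_spacing` and the parameters `axis_px`, `value_min`, `value_max`, `optimal_px` are made up for illustration:

```python
def current_tick_spacing(axis_px, value_min, value_max, optimal_px=32):
    """Power-of-two search for a marker distance, as in the steps above."""
    # Optimal spacing expressed in axis units, e.g. 32 * (40 / 320) = 4 °C.
    optimal = optimal_px * (value_max - value_min) / axis_px

    spacing = 1.0
    if spacing > optimal:
        # Too coarse: halve until we drop below the optimal spacing.
        while spacing > optimal:
            spacing /= 2
    else:
        # Too fine: double until we exceed the optimal spacing.
        while spacing < optimal:
            spacing *= 2
    return spacing
```

With the example numbers above (32-pixel target, 320-pixel axis) this gives 4 for a 0-40 °C range, but 128 for a 0-1000 range.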
This works, but it does not give me the same distances I would have selected by hand. For example, if the range is 0-1000, I would choose one of the following distances:
- 1000 (0, 1000)
- 500 (0, 500, 1000)
- 250 (0, 250, 500, 750, 1000)
- 200 (0, 200, 400, 600, 800, 1000)
- 100 (0, 100, ..., 900, 1000)
- 50 (0, 50, ..., 950, 1000)
and so on ...
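Put differently, every distance I would pick by hand seems to be 1, 2, 2.5, or 5 times a power of ten. A rough sketch of that pattern (again Python, made-up names, not code I actually have):

```python
import math

def hand_picked_candidates(span, decades=3):
    """Distances I would accept by hand for a given axis span:
    1, 2, 2.5 or 5 times a power of ten, no larger than the span,
    in descending order."""
    exponent = math.floor(math.log10(span))
    candidates = (
        factor * 10.0 ** exp
        for exp in range(exponent, exponent - decades, -1)
        for factor in (1.0, 2.0, 2.5, 5.0)
    )
    return sorted((c for c in candidates if c <= span), reverse=True)

print(hand_picked_candidates(1000))
# [1000.0, 500.0, 250.0, 200.0, 100.0, 50.0, 25.0, 20.0, 10.0]
```

What I am missing is a clean rule for picking one of these candidates based on the optimal pixel distance, instead of the power-of-two result above.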
Do you have any clever ideas?