You can skip to the bottom line if you don't care about the background:
I have the following code in Python:
ratio = (point.threshold - self.points[0].value) / (self.points[1].value - self.points[0].value)
Which is giving me wrong values. For instance, for:
threshold: 25.0
self.points[0].value: 46
self.points[1].value: 21
I got:
ratio: -0.000320556853048
Which is wrong.
Looking into it, I realized that `self.points[0].value` and `self.points[1].value` are of type `numpy.uint16`, so the subtraction wraps around:

21 - 46 = 65511

I never defined a type for `point.threshold`; I just assigned it, so I imagine it got a plain vanilla `int`.
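
To take my class out of the picture, here is a stripped-down snippet with the values from the example above hard-coded, which reproduces what I'm seeing:

```python
import numpy as np

a = np.uint16(46)   # self.points[0].value
b = np.uint16(21)   # self.points[1].value
threshold = 25.0    # point.threshold, assigned as a plain Python number

# The uint16 subtraction wraps around instead of going negative
# (depending on the NumPy version this may also emit an overflow RuntimeWarning).
print(b - a)        # 65511 instead of -25

ratio = (threshold - a) / (b - a)
print(ratio)        # roughly -0.000320556853048 instead of the expected 0.84
```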
The Bottom Line
How can I force the subtraction of two `uint`s to be signed?
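
For reference, the kind of explicit cast I could do looks roughly like this, either with plain `int()` on the scalars or `astype()` on the array, but I'm hoping there is a cleaner way:

```python
import numpy as np

a = np.uint16(46)
b = np.uint16(21)

# Cast the scalars to plain Python ints before subtracting
print(int(b) - int(a))        # -25

# Or, for whole arrays, cast to a signed dtype first
values = np.array([46, 21], dtype=np.uint16)
signed = values.astype(np.int32)
print(signed[1] - signed[0])  # -25
```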