I have a variable that very rarely gets an incorrect value. Since the system is quite complex, I'm having trouble tracing all the code paths that the value goes through - there are multiple threads involved, it can be saved to and then loaded from a DB, and so on. I'm going to try a code graph generator to see if I can spot the problem by looking at the ways the setter can be called, but maybe there's some other technique. Perhaps wrapping the value in a class that traces the places it passes through and the changes made to it? I'm not sure the question is clear enough, but I'd appreciate input from somebody who has encountered such a situation.
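For example, something along these lines (just a sketch, assuming Java, which is where the `long`/`int` types come from; the class and method names are made up):

```java
import java.util.concurrent.atomic.AtomicLong;

// Sketch of a tracing wrapper: every change goes through set(), which can log
// the stack trace of the caller so I can see which code path wrote the value.
public final class TracedTimestamp {
    private final AtomicLong value = new AtomicLong();

    public long get() {
        return value.get();
    }

    public void set(long newValue) {
        long old = value.getAndSet(newValue);
        // Only bother logging when the new value looks like it lost its top 32 bits.
        boolean lostHighBits = (old >>> 32) != 0 && (newValue >>> 32) == 0;
        if (lostHighBits) {
            System.err.println("Suspicious change: " + old + " -> " + newValue);
            Thread.dumpStack(); // prints the call path that performed this set
        }
    }
}
```

The idea would be to temporarily replace the raw field with a wrapper like this and grep the logs for the suspicious-change message.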
[Edit] The problem is not easily reproducible and I can't catch it in a debugger. I'm looking for a static analysis or logging technique to help track down the issue.
[Edit 2] Just to make things clearer, the value I'm talking about is a timestamp, represented as the number of milliseconds from the Unix epoch (01/01/1970) in a 64-bit `long` variable. At some unknown point, the top 32 bits of the value are truncated, generating completely incorrect (and unrecoverable) timestamps.
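Just to illustrate what that truncation looks like in practice (a quick sketch, assuming Java):

```java
public class TruncationDemo {
    public static void main(String[] args) {
        long millis = System.currentTimeMillis();  // current epoch millis, needs ~41 bits
        long lowBitsOnly = millis & 0xFFFFFFFFL;   // what survives if the top 32 bits are dropped

        System.out.println(millis);       // a sane, current timestamp
        System.out.println(lowBitsOnly);  // at most ~49.7 days worth of millis, i.e. early 1970
    }
}
```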
[Edit 3] OK, thanks to some of your suggestions and to a couple of hours of poring over the code, I found the culprit. The millisecond-based timestamp was converted into a second-based timestamp by dividing it by 1000 and stored in an `int` variable. At a later point in the code, the second-based timestamp (an `int`) was multiplied by 1000 and stored in a new `long` variable. Since both 1000 and the second-based timestamp were `int` values, the result of the multiplication was truncated before being converted to `long`. This was a subtle one - thanks to everyone who helped.
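For anyone hitting the same thing, here is a condensed reproduction of the bug and the fix (a sketch in Java; variable names are invented):

```java
public class TimestampOverflowDemo {
    public static void main(String[] args) {
        long millis = System.currentTimeMillis();

        // Step 1: store the timestamp as whole seconds in an int (fits until 2038).
        int seconds = (int) (millis / 1000);

        // Step 2: convert back to millis.
        long wrong = seconds * 1000;   // int * int overflows BEFORE the widening to long
        long right = seconds * 1000L;  // one long operand makes the multiplication happen in 64 bits

        System.out.println(millis);    // original value
        System.out.println(wrong);     // truncated garbage
        System.out.println(right);     // millis rounded down to the whole second
    }
}
```

The fix was simply to make one of the operands a `long` so the multiplication is done in 64 bits.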