views:

187

answers:

4

For an assignment I wrote a little program that solves 49151 sudokus within an hour, and we had to time it. I thought I'd just let it run and then check the execution time, but it reports -1536.087 s. I'm guessing this has to do with the timer being some signed data type, but I have no idea what data type the console uses for its timer (the Code::Blocks console; I'm not sure whether this is actually a separate console or just a runner that launches the local operating system's terminal), so I can't work out what the real time was. I'd rather not run it again with a coded timer inside my program, since I'd like to be able to use my PC again now. Does anybody have an idea what this time could be? It should be somewhere between 40 and 50 minutes, so between 2400 and 3000 seconds.

+2  A: 

If you're running on Linux, use `time`:

$ time ./your_program
real    0m0.003s
user    0m0.004s
sys     0m0.000s
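
For completeness, here is a minimal sketch of the kind of in-program timer the question mentions, using std::chrono. This is illustrative only and not part of the original answer; the solver call is a hypothetical placeholder.

#include <chrono>
#include <iostream>

int main() {
    using clock = std::chrono::steady_clock;

    const auto start = clock::now();

    // solve_all_sudokus();  // hypothetical placeholder for the actual workload

    const auto end = clock::now();
    const auto elapsed =
        std::chrono::duration_cast<std::chrono::milliseconds>(end - start);

    // A 64-bit millisecond count will not overflow in any realistic run.
    std::cout << "elapsed: " << elapsed.count() / 1000.0 << " s\n";
}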
Stephen
-1, "I'd rather not run this again"
Steve Jessop
A: 

I would guess 42 minutes, to be exact (ignoring the decimals and assuming a 12-bit signed data type counting seconds: -1536 + 2^12 = 2560 s, which is about 42.7 minutes)... But that is a silly guess, since you haven't included any information about how the execution time is being measured.

Kingdom of Fish
I'm not sure how you got this result - or why you think a 12-bit datatype is at all likely (good luck declaring one in C++).
Jefromi
@Jefromi: `struct int12 { int val:12; /* many, many operator overloads */ };` ;-)
Steve Jessop
@Jefromi: he's obviously making a ridiculous guess to demonstrate that more information is needed to give a correct response. We don't know the datatype... however, we can make a very good guess: given that the OP said "less than an hour", we can assume it overflowed at less than that. Given the values provided, a 32-bit signed time representation is most likely.
Evan Teran
@Steve Jessop: Ah, of course, why didn't I think of it! @Evan Teran: You're right, sorry, I didn't pick up on how tongue-in-cheek the answer was supposed to be.
Jefromi
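
As an aside on the 12-bit guess and the bit-field trick in the comments above, here is a small sketch of how such a field would wrap 2560 s (about 42.7 minutes) into -1536 s. Assigning an out-of-range value to a signed bit-field is implementation-defined before C++20, so treat this as illustrative only.

#include <iostream>

// A 12-bit signed field holds -2048..2047, so 2560 wraps around.
struct int12 { signed int val : 12; };

int main() {
    int12 t{};
    t.val = 2560;                  // 2560 - 4096 = -1536 on typical two's-complement implementations
    std::cout << t.val << " s\n";  // prints -1536 (implementation-defined)
}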
+10  A: 

If the time was stored as microseconds in a 32-bit signed int, 2758880296 us (microseconds) would produce this result, since 2758880296 - 2^32 = -1536087000. In minutes and seconds, that's 45:58.880296. (Treat those last few decimal places with a grain of salt, since presumably what you printed was rounded to the nearest millisecond.)

But of course, that's just an intelligent guess based on the information you provided.
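
For illustration, a minimal sketch of that arithmetic in C++ (the 32-bit signed microsecond assumption is the one made above; the code itself is not from the original post):

#include <cstdint>
#include <iostream>

int main() {
    // Printed (wrapped) value, assuming a 32-bit signed microsecond counter.
    const std::int64_t printed_us = -1536087000;

    // Undo one wrap of a 32-bit counter: add 2^32 microseconds.
    const std::int64_t actual_us = printed_us + (std::int64_t{1} << 32);

    const std::int64_t total_seconds = actual_us / 1000000;
    std::cout << actual_us << " us = "
              << total_seconds / 60 << " min "
              << total_seconds % 60 << "."
              << actual_us % 1000000 << " s\n";
    // Prints roughly: 2758880296 us = 45 min 58.880296 s
}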

Jefromi
Thanks, stupid of me not to think of microseconds. And on posting the code: it's no use, since it's the execution time after the code has run; it has nothing to do with the code itself. Well, of course the code influences the time, but what I mean is that I didn't code the timer; it's some timer of the Code::Blocks console that times the execution of the thread.
FinalArt2005
Oh sorry, you already said you understood that in the comments under the question itself; I guess I skimmed too fast as well ;)
FinalArt2005
+2  A: 

I'd guess 46 minutes.

Assume a 32 bit signed integer representing microseconds.

Then -1,536,087,000 us would be the same as 2,758,880,296 us, which is 45:58.880.

It's possible that there's another representation that gives an equally plausible result in your range, though.
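
To check that claim, here is a rough sketch (the candidate tick sizes are my own assumptions, not from the original answer) that reconstructs the pre-wrap value for a few plausible 32-bit signed counters and flags those landing in the stated 2400-3000 s window:

#include <iostream>

int main() {
    const double printed_seconds = -1536.087;
    // Candidate tick rates for a 32-bit signed counter: ns, us, ms, 1/100 s.
    const double ticks_per_second[] = {1e9, 1e6, 1e3, 100.0};

    for (double tps : ticks_per_second) {
        // Undo a single wrap of a 32-bit counter measured at this tick rate.
        // (A nanosecond counter would really have wrapped many times, so
        // the single-wrap assumption already rules it out.)
        const double actual = printed_seconds + 4294967296.0 / tps;
        std::cout << tps << " ticks/s -> " << actual << " s"
                  << (actual >= 2400 && actual <= 3000 ? "  <-- plausible" : "")
                  << '\n';
    }
}

Only the microsecond interpretation lands in the 40-50 minute range, which is why the 32-bit signed microsecond guess looks like the right one.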

Steve Jessop