Hi everyone,

I am writing a program which simulates an activity, and I am wondering how to speed up time for the simulation, so that, say, 1 hour in the real world equals 1 month in the program.

Thank you.

The program is actually similar to a restaurant simulation where you don't really know when customers come. Let's say a random number (2-10) of customers arrives every hour.

+1  A: 

You just do it. You decide how many events take place in an hour of simulation time (e.g., if an event takes place once a second, then after 3600 simulated events you've simulated an hour of time). There's no need for your simulation to run in real time; you can run it as fast as you can calculate the relevant numbers.
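
For instance, a minimal sketch of that idea, assuming one loop iteration per simulated second and the 2-10 customers per hour from the question (all the specific numbers are placeholders):

#include <stdio.h>
#include <stdlib.h>

int main (void)
{
     long sim_seconds;
     int  customers = 0;

     // One loop iteration = one simulated second; a simulated month
     // (30 days) passes as fast as the loop can run.
     for (sim_seconds = 0; sim_seconds < 30L * 24 * 3600; sim_seconds++)
     {
           if (sim_seconds % 3600 == 0)       // once per simulated hour
                 customers += 2 + rand() % 9; // 2-10 new customers, as in the question
     }

     printf("%d customers in one simulated month\n", customers);
     return 0;
}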

Max Lybbert
+2  A: 

It depends on how the program gets the current time.

For example, if it calls the Linux time() system call, just replace that with your own function (say, mytime) which returns speedier times. Perhaps mytime calls time and multiplies the result by whatever factor makes sense; 1 hour = 1 month is a factor of 720 (24 hours × 30 days). The origin should be taken as the moment the program begins:

#include <time.h>

time_t t0;
time_t mytime (time_t *);

int main (void)
{
     t0 = time(NULL);    // record the origin at program initialization

     // ...

     for (;;)
     {
           time_t sim_time = mytime (NULL);
           // ... drive the simulation from sim_time ...
     }
}

time_t mytime (time_t *t)
{
     (void) t;                          // parameter only mirrors time()'s signature
     return 720 * (time (NULL) - t0);   // time since program started,
                                        // magnified by 720, so one hour is one month
}
wallyk
A: 

If the simulation is data dependent (like a stock market program), just speed up the rate at which the data is pumped, as in the sketch below. If it is something that depends on time() calls, you will have to do something like wallyk's answer (assuming you have the source code).
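
For the data-driven case, something like this sketch: replay recorded records back-to-back instead of pacing them in real time (the file name and record format here are made up):

#include <stdio.h>

int main (void)
{
     FILE  *f = fopen("ticks.txt", "r");   // hypothetical input file
     long   timestamp;
     double price;

     if (f == NULL)
           return 1;

     // Pump the data as fast as it can be read: no sleeping between
     // records, so the simulation runs far faster than real time.
     while (fscanf(f, "%ld %lf", &timestamp, &price) == 2)
     {
           // feed (timestamp, price) into the simulation here
     }

     fclose(f);
     return 0;
}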

ivymike
A: 

If time in your simulation is discrete, one option is to structure your program so that something happens "every tick". Once you do that, time in your program is arbitrarily fast.

Is there really a reason for having a month of simulation time correspond exactly to an hour of time in the real world? If so, you can always process the number of ticks that correspond to a month, and then pause for the appropriate amount of time to let an hour of "real time" finish.

Of course, a key variable here is the granularity of your simulation, i.e. how many ticks correspond to a second of simulated time.
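
A minimal sketch of such a tick loop, assuming one tick per simulated second and POSIX sleep() for the pause (the granularity is a placeholder you would tune):

#include <time.h>
#include <unistd.h>

#define TICKS_PER_SIM_SECOND  1                  // granularity: placeholder
#define SIM_SECONDS_PER_MONTH (30L * 24 * 3600)

int main (void)
{
     time_t start = time(NULL);
     long   tick;

     for (tick = 0; tick < TICKS_PER_SIM_SECOND * SIM_SECONDS_PER_MONTH; tick++)
     {
           // advance the world by one tick: arrivals, orders, departures...
     }

     // Pause for whatever remains of the real-world hour, so one
     // simulated month lines up with one real hour.
     time_t elapsed = time(NULL) - start;
     if (elapsed < 3600)
           sleep(3600 - elapsed);

     return 0;
}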

agam
A: 

It sounds like you are implementing a Discrete Event Simulation. You don't even need to have a free-running timer (no matter what scaling you may use) in such a situation. It's all driven by the events. You have a priority queue containing events, ordered by the event time. You have a processing loop which takes the event at the head of the queue, and advances the simulation time to the event time. You process the event, which may involve scheduling more events. (For example, the customerArrived event may cause a customerOrdersDinner event to be generated 2 minutes later.) You can easily simulate customers arriving using random().

The other answers I've read so far still assume you need a continuous timer, which is usually not the most efficient way of simulating an event-driven system. You don't need to scale real time to simulation time, or have ticks at all. Let the events drive time!
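
A minimal sketch of this, reusing the customerArrived / customerOrdersDinner events from above; the sorted array stands in for a real priority queue, and the arrival spacing is made up:

#include <stdio.h>
#include <stdlib.h>

typedef enum { customerArrived, customerOrdersDinner } event_type;

typedef struct {
     double     time;    // simulation time, in minutes
     event_type type;
} event;

#define MAX_EVENTS 128
static event queue[MAX_EVENTS];
static int   queue_len = 0;

// Insert keeping the array sorted by *decreasing* time, so the
// earliest event sits at the end and can be popped cheaply.
// (A heap would scale better for large queues.)
static void schedule (double when, event_type type)
{
     int i = queue_len++;
     while (i > 0 && queue[i - 1].time < when) {
           queue[i] = queue[i - 1];
           i--;
     }
     queue[i].time = when;
     queue[i].type = type;
}

int main (void)
{
     double t = 0.0;
     int i;

     // Seed the queue with random customer arrivals, 1-10 minutes apart.
     for (i = 0; i < 5; i++) {
           t += 1 + rand() % 10;
           schedule(t, customerArrived);
     }

     // Processing loop: take the event at the head of the queue and
     // advance the simulation clock straight to its time.
     while (queue_len > 0) {
           event e = queue[--queue_len];
           switch (e.type) {
           case customerArrived:
                 printf("t=%5.1f  customer arrives\n", e.time);
                 schedule(e.time + 2, customerOrdersDinner);  // orders 2 minutes later
                 break;
           case customerOrdersDinner:
                 printf("t=%5.1f  customer orders dinner\n", e.time);
                 break;
           }
     }
     return 0;
}

Note that the clock never runs continuously: it jumps from one event time to the next, so a quiet month simulates just as fast as a busy one.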

gavinb