views: 102
answers: 3
I'm constructing a data visualisation system that visualises over 100,000 data points (visits to a website) across a time period. The time period (say 1 week) is then converted into simulation time (1 week = 2 minutes in simulation), and a task is performed on each and every piece of data at the specific time it happens in simulation time (the time each visit occurred during the week in real time). With me? =p

In other programming languages (eg. Java) I would simply set a timer for each datapoint. After each timer is complete it triggers a callback that allows me to display that datapoint in my app. I'm new to C++ and unfortunately it seems that timers with callbacks aren't built-in. Another method I would have done in ActionScript, for example, would be using custom events that are triggered after a specific timeframe. But then again I don't think C++ has support for custom events either.

In a nutshell: say I have 1000 pieces of data that span a 60-second period. Each piece of data has its own time in relation to that 60-second period. For example, one needs to trigger something at 1 second, another at 5 seconds, etc.

Am I going about this the right way, or is there a much easier way to do this?

Ps. I'm using Mac OS X, not Windows

A: 

If you are using native C++, you should look at the Timers section of the Windows API on the MSDN website. They should tell you exactly what you need to know.

DeadMG
Thank you. I'm using OS X to develop on. Would this still be relevant?
robhawkes
Oh. No. However, you could look into POSIX timers; they should provide the same functionality, give or take. But I've never used them and have no links for you.
DeadMG
+4  A: 

I would not use timers for that. It sounds like you have too many events, and they may lie too close to each other; performance and accuracy would suffer with timers.

A simulation is normally done like this: you simply run a loop (iterations). On every iteration you add either a measured amount (for real time) or a constant amount (non-real-time) to your simulation time. Then you manually check all your events and execute those that are due. In your case it helps to have them sorted by execution time, so you don't have to loop through all of them every iteration.

Time measuring can be done with the gettimeofday() C function for low accuracy, or there are better functions for higher accuracy, e.g. QueryPerformanceCounter() on Windows - I don't know the equivalent for Mac.

kaptnole
Instead of a ticked-timer approach, which seems generic for simulations, in this case you can also calculate the delay up to the first event and sleep until then using nanosleep.
stefaanv
Or sleep in intervals matching your display times, like games do. That helps avoid consuming all the processor time.
kaptnole
I chose Tuan's as the "answer" to my question, but I completely agree with you that timers aren't the best choice. I'm discovering this now that I need to use frame-based recording: the timer system doesn't slow down in line with a drop in framerate. I'm going to convert it into a frame-based timing system like you suggest; it seems the only way.
robhawkes
@stefaanv: Why is a generic solution not one I should use? @kaptnole: Do you have any links to techniques for putting that into practice? It sounds interesting but I'm not entirely sure how I'd achieve it.
robhawkes
@rob: There is nothing wrong with a generic solution, I just suggested another approach, where the simulation is only active when there is an event, not every tick.
stefaanv
+1  A: 

Just make a "timer" mechanism yourself; that's the best, fastest and most flexible way.

-> make an array of events (each linked to the object the event happens to) (std::vector in C++/STL)
-> sort the array on time (std::sort in C++/STL)
-> then just loop over the array and trigger each object's action/method when its time falls inside the current range.

Roughly that gives in C++:

// action upon data + data itself
class Object{
public:
  Object(Data d) : data(d) {}

  void Action(){ display(data); }

  Data data;
};

// event time + the object the event acts upon
class Event{
public:
   Event(double t, Object o) : time(t), object(o) {}

   // useful for std::sort
   bool operator<(const Event& e) const { return time < e.time; }

  double   time;
  Object   object;
};

//init
std::vector<Event> myEvents;

myEvents.push_back(Event(1.0, Object(data0)));
//...
myEvents.push_back(Event(54.0, Object(data10000)));

// could be removed if push_back() is guaranteed to be in the correct order
std::sort(myEvents.begin(), myEvents.end());

// the way you handle time... period is for some fuzziness/animation ?
const double period = 0.5;
const double endTime = 60;
std::vector<Event>::iterator itLastFirstEvent = myEvents.begin();
for (double currtime = 0.0; currtime < endTime; currtime += 0.1)
{
  for (std::vector<Event>::iterator itEvent = itLastFirstEvent; itEvent != myEvents.end(); ++itEvent)
  {
    if (itEvent->time < currtime - period)
      itLastFirstEvent = itEvent; // so that the next outer loop starts later
    else if (itEvent->time < currtime + period)
      itEvent->object.Action(); // action speaks louder than words
    else
      break; // as it's sorted, there won't be any more ticks this loop
  }
 }

PS: About custom events, you might want to read up on delegates in C++ and on function/method pointers.

Tuan Kuranes
Thanks Tuan! The code makes sense and is very simple, in a good way. It's a perfect extension to the code I'm currently writing so I'll see how it goes.
robhawkes