I am writing some code for handling data. The user can choose among several groups of processing functions, which are then applied to the dataset. I would like to implement each group in a separate place, but since they all take the same parameters and do similar things, I would like them to share a common interface.
Being a good little C++ programmer, my first thought was to simply use polymorphism: create an abstract class with the desired interface, then derive each set of processing objects from it. My hopes were quickly dashed, however, when I thought of another wrinkle. These datasets are enormous, so the functions in question end up being called literally billions of times. While dynamic lookup is fairly cheap, as I understand it, it is still a good deal slower than a direct function call.
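For reference, the polymorphic version would look something like this (just a sketch; mpz_class is GMP's C++ integer class from gmpxx.h, and the class and function names here are made-up placeholders):

#include <gmpxx.h>

// Abstract interface: one virtual function per processing step.
class DataProcessor
{
public:
    virtual ~DataProcessor() {}
    virtual void func1(mpz_class &input) = 0;
    virtual void func2(mpz_class &input) = 0;
};

// One concrete group of processing functions.
class ProcessorGroupA : public DataProcessor
{
public:
    void func1(mpz_class &input) { /* group A's processing */ }
    void func2(mpz_class &input) { /* group A's processing */ }
};

Every call through a DataProcessor pointer goes through the vtable, and that per-call overhead is exactly what worries me.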
My current idea to combat this is to use function pointers, in a manner something like this:
void dataProcessFunc1(mpz_class &input){...}
void dataProcessFunc2(mpz_class &input){...}
...
class DataProcessInterface
{
...
// The pointer types must match the functions above, including the reference:
void (*func1)(mpz_class &);
void (*func2)(mpz_class &);
...
};
There would be some sort of constructor for pointing each pointer at the right function.
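Something like this, continuing the snippet above (a rough sketch; the constructor parameters and the usage are made up):

class DataProcessInterface
{
public:
    // Wire up the pointers for one group of processing functions.
    DataProcessInterface(void (*f1)(mpz_class &), void (*f2)(mpz_class &))
        : func1(f1), func2(f2) {}

    void (*func1)(mpz_class &);
    void (*func2)(mpz_class &);
};

// Select a group once, then call through the pointers in the hot loop:
DataProcessInterface proc(dataProcessFunc1, dataProcessFunc2);
mpz_class value;
proc.func1(value);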
So I guess my question is this: Is this a good method? Is there another way? Or should I just learn to stop worrying and love the dynamic lookup?