Found a pretty good hack for this problem. Instead of repeatedly re-filtering the data, I assign a weight value to each individual value of the 2D graph. This weight tells me how much filtering should be applied at the corresponding graph location, and it also sets the width of the Gaussian filter (the range that value affects). Here's the code that needs to be executed every time the graph values change.
#include <vector>
#include <cmath>
#include <algorithm>

using namespace std;

vector<float> graph_values(100);
vector<float> graph_weights(100);
vector<float> graph_filtered_values(100);

void refilter_graph()
{
    // temporary accumulator; value-initialized to zero
    vector<float> accumulated_weights(graph_values.size());

    for(int x1 = 0; x1 < (int)graph_values.size(); x1++)
    {
        graph_filtered_values[x1] = 0;

        // clamp the window so we never index outside the arrays
        int lo = max(x1 - 30, 0);
        int hi = min(x1 + 30, (int)graph_values.size() - 1);
        for(int x2 = lo; x2 <= hi; x2++)
        {
            // gaussian weight; the width comes from the weight stored at x2
            float w = expf(-0.5f * (float)(x2 - x1) * (x2 - x1)
                           / (graph_weights[x2] * graph_weights[x2]));
            if(x2 == x1 && !isfinite(w)) // 0/0 at the center when the stored width is zero
                w = 1;
            if(w < 0.0001f)
                w = 0;
            graph_filtered_values[x1] += graph_values[x2] * w;
            accumulated_weights[x1] += w;
        }
    }

    // normalize by the total weight that actually contributed
    for(int x1 = 0; x1 < (int)graph_values.size(); x1++)
        graph_filtered_values[x1] /= accumulated_weights[x1];
}
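As a usage sketch (the helper name set_graph_value and the idea of picking the width at edit time are my assumptions, not something stated above), the per-location weights can be driven like this whenever a sample changes:
// Hypothetical helper, not part of the snippet above: change one sample,
// record how strongly that location should be smoothed, then re-run the filter.
// The width value is a placeholder; the policy for picking it is up to the
// application (a small width keeps the sample sharp, a large one smooths it).
void set_graph_value(int index, float value, float gaussian_width)
{
    graph_values[index] = value;
    graph_weights[index] = gaussian_width;
    refilter_graph();
}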
This algorithm uses triple the memory: graph_values, graph_weights and graph_filtered_values. That can be optimized by dropping the first two arrays in the end product, once the graph values no longer change.
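A minimal sketch of that cleanup, assuming nothing needs the source values or weights once the graph is final: swapping each vector with an empty one actually releases the memory, which clear() alone is not guaranteed to do.
// Once graph values are final, keep only graph_filtered_values and release the rest.
vector<float>().swap(graph_values);
vector<float>().swap(graph_weights);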