I just got some code handed over to me. It is written in C# and it inserts realtime data into a database every second. The data is accumulated over time, which makes the numbers big.
The data is updated many times within each second; at the end of the second the result is taken and inserted.
We used to address the DataSet rows directly within the second through their properties, so many operations like 'datavaluerow.meanvalue += mean;' could take place. After running the profiler we figured out that this was degrading performance because of the internal casting (boxing) involved, so we created a 2D array of decimals on which the updates are carried out; the values are assigned to the DataRows only at the end of the second. I ran the profiler again and found that this step still takes a lot of time (although, added up, less than the time that was previously spent accessing the DataRows directly).
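For reference, the per-tick updates now write into the decimal buffer instead of the DataRow. A minimal sketch of that path (meanValue is a hypothetical enum member here, standing in for the old datavaluerow.meanvalue property):

    // called many times within each second; updating the plain decimal
    // buffer avoids the casting the DataRow properties were doing
    private void UpdateTick(int i, decimal mean)
    {
        _table1Values[i][(int)table1Enum.meanValue] += mean;
    }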
The code that is executed at the end of the second is as follows:
    public void UpdateDataRows(int tick)
    {
        // _table1Values is of type decimal[][]
        for (int i = 0; i < _table1Values.Length; i++)
        {
            // stamp the row with the current tick before flushing it
            _table1Values[i][(int)table1Enum.barDateTime] = tick;

            // copy the whole decimal buffer into the DataRow at once;
            // Cast<object>() needs System.Linq and boxes every decimal
            table1Row[i].ItemArray = _table1Values[i].Cast<object>().ToArray();
        }
        // the same process is done for 10 other tables
    }
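For context, this method runs once per second. A sketch of the wiring, assuming a System.Timers.Timer drives it (the timer setup and the insert step are illustrative, not the actual code):

    using System.Timers;

    int tick = 0;
    var timer = new Timer(1000); // fires once per second
    timer.Elapsed += (sender, args) =>
    {
        UpdateDataRows(tick++);  // flush the decimal buffers into the DataRows
        // ...the rows are then inserted into the database
    };
    timer.Start();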
Is there a way to further improve this approach?