Hi,

I have an array that I use to look up values. I use the first two columns to select n rows, for example all rows that have 2 in the first column and 7 in the second. What is the fastest (I mean micro-optimized) way to get these rows? I currently use a for loop:

    int l = SpreadLookupTable.GetLength(0);

    for (int iCombo = 0; iCombo < l; iCombo++) {
        bool o = SpreadLookupTable[iCombo, 0] == perWeek
              && SpreadLookupTable[iCombo, 1] == workDays;

        if (o) {
            // do stuff
        }
    }

Edit: The table only has some 60 rows. What if I made three nested arrays, so I could use the first two columns directly as indexes, like t[2][7], and then only iterate over the rows I really need? Would that be faster?
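For what it's worth, here is a rough sketch of that idea. The class name, the 8x8 sizing, and the use of a 2D array of lists instead of literal nested arrays are all my own assumptions, based on the first two columns being small non-negative values:

```csharp
using System.Collections.Generic;

// Sketch of the "t[2][7]" idea from the edit above: group row indices by
// the first two columns once, then iterate only the rows that can match.
// Names and the 8x8 bound are illustrative, not from the original post.
class SpreadIndex
{
    // rowsByKey[perWeek, workDays] holds the indices of matching rows.
    private readonly List<int>[,] rowsByKey = new List<int>[8, 8];

    public SpreadIndex(int[,] table)
    {
        int rows = table.GetLength(0);
        for (int i = 0; i < rows; i++)
        {
            int a = table[i, 0], b = table[i, 1];
            if (rowsByKey[a, b] == null)
                rowsByKey[a, b] = new List<int>();
            rowsByKey[a, b].Add(i);
        }
    }

    // Returns the row indices for this (perWeek, workDays) combination,
    // or an empty list if there are none.
    public List<int> Lookup(int perWeek, int workDays)
        => rowsByKey[perWeek, workDays] ?? new List<int>();
}
```

Built once for a static 60-row table, each lookup is then a single array access plus a short iteration over only the matching rows.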

the table:

private static int[,] SpreadLookupTable = {
    {2, 7, 1, 0, 0, 1, 0, 0, 0},
    {2, 7, 1, 0, 0, 0, 1, 0, 0},
    {2, 7, 0, 1, 0, 0, 1, 0, 0},
    ...
    {2, 3, 1, 1, 0, 0, 0, 0, 0},
    {2, 3, 1, 0, 1, 0, 0, 0, 0},
    {2, 3, 0, 1, 1, 0, 0, 0, 0}
};
+1  A: 

You could keep a list of the iCombo indexes that match the criteria. After your first loop through ALL the data, every subsequent pass only has to loop over those indexes and can skip the comparisons entirely.
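A minimal sketch of that caching idea, assuming the same (perWeek, workDays) pair is used on every pass (the class and member names are invented here):

```csharp
using System.Collections.Generic;

// Remember which rows matched on the first pass, then reuse that list
// and skip the comparisons on all later passes.
class RowCache
{
    private List<int> matchingRows;  // null until the first pass

    public List<int> GetMatches(int[,] table, int perWeek, int workDays)
    {
        if (matchingRows == null)  // first call: scan ALL rows once
        {
            matchingRows = new List<int>();
            for (int i = 0; i < table.GetLength(0); i++)
                if (table[i, 0] == perWeek && table[i, 1] == workDays)
                    matchingRows.Add(i);
        }
        return matchingRows;  // later calls: no comparisons at all
    }
}
```

Note the cache is only valid while perWeek and workDays stay the same; if they can change between passes, reset matchingRows to null (or key the cache on the pair, as the other answers suggest).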

Neil N
That is similar to what @TechNeilogy is saying?
Jeroen
+1  A: 

If the table is static and the combination of search values is known, you could combine both search values into a hash and map that hash to the table rows using a dictionary of lists.

If the search values are not known, you could build a multi-level dictionary (or dictionaries) and use the same technique.

(I cross-posted with Neil N and your edit, but this is basically a version of the same general idea: pre-process the indices into some kind of lookup structure. Whether a dictionary or list is more efficient depends on the characteristics of your data.)
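A sketch of the combined-hash version, built once for a static table. The bit-packing of the two values into one key and all the names here are my own choices for illustration:

```csharp
using System.Collections.Generic;

// Pack the two search columns into one integer key and map it to the
// list of matching row indices. Build() runs once; lookups are O(1).
static class SpreadMap
{
    public static Dictionary<int, List<int>> Build(int[,] table)
    {
        var map = new Dictionary<int, List<int>>();
        for (int i = 0; i < table.GetLength(0); i++)
        {
            int key = Key(table[i, 0], table[i, 1]);
            List<int> rows;
            if (!map.TryGetValue(key, out rows))
                map[key] = rows = new List<int>();
            rows.Add(i);
        }
        return map;
    }

    // Combine both search values into a single hash key.
    public static int Key(int perWeek, int workDays)
        => (perWeek << 16) | workDays;
}
```

Usage: map[SpreadMap.Key(perWeek, workDays)] gives the row indices to iterate, with TryGetValue guarding the case where no row matches.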

TechNeilogy
It's what I am saying in the 'edit' part of my question?
Jeroen
Sry, I cross-posted with your edit. I edited *my* answer to refer to your edit, lol.
TechNeilogy
Thanks. I need to squeeze every millisecond out of this, and these suggestions help a lot.
Jeroen
+1  A: 

In a project of a colleague of mine, he took a multi-column grid view and exported the cells to a set of arrays. One new array indexed the rows matching "A", and another new array indexed the rows matching "B". A third array was then built from the positions where Array1 and Array2 matched. Using this, you only need to touch those rows in your

    if (o) {
        // do stuff
    }

loop, removing the need to test every row against the criteria in

    bool o = SpreadLookupTable[iCombo, 0] == perWeek
          && SpreadLookupTable[iCombo, 1] == workDays;

In our tests it reduced the work time of the app by about 13%. Not much, but we had several thousand rows to work with, so with only 60 rows you may see an even better improvement thanks to the smaller size of your indexes.
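A sketch of that per-column indexing and intersection, with names invented here ("byFirst"/"bySecond" standing in for the Array1/Array2 of the description):

```csharp
using System.Collections.Generic;
using System.Linq;

static class RowIntersect
{
    // Index rows by each search column separately, then intersect the two
    // index lists to get the rows matching both criteria.
    public static int[] MatchingRows(int[,] table, int perWeek, int workDays)
    {
        var byFirst = new List<int>();   // rows where column 0 == perWeek ("Array1")
        var bySecond = new List<int>();  // rows where column 1 == workDays ("Array2")
        for (int i = 0; i < table.GetLength(0); i++)
        {
            if (table[i, 0] == perWeek) byFirst.Add(i);
            if (table[i, 1] == workDays) bySecond.Add(i);
        }
        // The "following array": positions present in both index lists.
        return byFirst.Intersect(bySecond).ToArray();
    }
}
```

Like the other answers, the payoff comes from building these indexes once and reusing the resulting array across the hundreds or thousands of lookups.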

Blue
Hope this helps!
Blue
Good answer. I think it is similar to the answers above and to my edit. My situation is similar: the table is accessed hundreds or thousands of times, so every millisecond counts.
Jeroen