Good morning.

I have a jagged array declared like this:

int[][][] tmpA = new int[INT_WORKING_SIZE * 2][][];

I'm trying to sort this array with this code:

Array.Sort(tmpA, 0, INT_WORKING_SIZE*2, new MyArrayComparer());

and my comparer class:

public int Compare(object x, object y)
{
    if (x == null || y == null)
        return 0;

    int[][] arrayA = (int[][])x;
    int[][] arrayB = (int[][])y;

    int resultA = arrayA[1].Sum();
    int resultB = arrayB[1].Sum();

    return resultA.CompareTo(resultB);
}

Each row of the jagged array holds 2 arrays of 12 ints.

I want to sort the rows by the sum of the 12 ints in the second array, with the smallest sum first.
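For clarity, each row that I fill in is set up roughly like this (simplified sketch, not my exact code):

for (int i = 0; i < INT_WORKING_SIZE * 2; i++)
{
    tmpA[i] = new int[2][];
    tmpA[i][0] = new int[12]; // first array of 12 ints
    tmpA[i][1] = new int[12]; // second array of 12 ints - this is the one I sum for sorting
}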

However, my major problem is that x and y are often null, and the sorted array ends up all zeros.

Any tips?

+1  A: 

If I understand you correctly, the problem is that you're returning 0 when either of the arrays is null, when you should return 1 or -1 depending on which one is not null, and 0 only when both are null.

public int Compare(object x,object y)
{
    // changed code
    if (x == null && y == null)
        return 0;
    if (x == null)
        return 1;
    if (y == null)
        return -1;
    // end of changed code
    int[][] arrayA = (int[][])x;
    int[][] arrayB = (int[][])y;

     int resultA = arrayA[1].Sum();
     int resultB = arrayB[1].Sum();

    return resultA.CompareTo(resultB);          
}
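
As an alternative (untested sketch, and it assumes .NET 4.5+ for Comparer<T>.Create), you could skip the separate comparer class and pass an inline comparison to the same Array.Sort overload:

// Same sort, but with an inline comparison instead of MyArrayComparer.
// Requires: using System; using System.Collections.Generic; using System.Linq;
Array.Sort(tmpA, 0, INT_WORKING_SIZE * 2, Comparer<int[][]>.Create((a, b) =>
{
    if (a == null && b == null) return 0;
    if (a == null) return 1;   // nulls sort to the end
    if (b == null) return -1;
    return a[1].Sum().CompareTo(b[1].Sum());
}));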
manixrock
Yep, this did the job.
Ray