How can I make this function more efficient? It's currently running at 6 to 45 seconds. I've run the dotTrace profiler on this specific method, and its total time is anywhere between 6,000 ms and 45,000 ms. The majority of the time is spent on the "MoveNext" and "GetEnumerator" calls.

An example of the timings:

71.55% CreateTableFromReportDataColumns - 18,533 ms - 190 calls
 -- 55.71% MoveNext - 14,422 ms - 10,775 calls

What can I do to speed this method up? It gets called a lot, and the seconds add up:

    private static DataTable CreateTableFromReportDataColumns(Report report)
    {
        DataTable table = new DataTable();
        HashSet<String> colsToAdd = new HashSet<String> { "DataStream" };
        foreach (ReportData reportData in report.ReportDatas)
        {
            IEnumerable<string> cols = reportData.ReportDataColumns.Where(c => !String.IsNullOrEmpty(c.Name)).Select(x => x.Name).Distinct();

            foreach (var s in cols)
            {
                if (!String.IsNullOrEmpty(s))
                    colsToAdd.Add(s);
            }
        }

        foreach (string col in colsToAdd)
        {
            table.Columns.Add(col);
        }

        return table;
    }

If you need the SQL table definitions, here they are:

ReportData

ReportID            int

ReportDataColumn

ReportDataColumnId  int
ReportDataId        int 
Name                varchar(255)    
Value               text    
+3  A: 

I believe you should be able to simplify your function into something like this:

var columnsToAdd = report.ReportDatas
                    .SelectMany(r => r.ReportDataColumns)
                    .Select(rdc => rdc.Name)
                    .Distinct()
                    .Where(name => !string.IsNullOrEmpty(name));

And from there add the names to your table.

Anthony Pegram
what about removing that null test from `where` and testing it after distinct?
Rubens Farias
That may be a good suggestion if there are lot of duplicates. I'll move it.
Anthony Pegram
Nice and short, I'd like to hear (from the OP) what the profiler thinks of it.
Henk Holterman
Mörk suggested the same thing down there; take a look.
Rubens Farias
@Rubens: On the other hand, doing the test earlier will reduce the amount of data the later operations will have to process. This is also something that should be benchmarked.
Matti Virkkunen
I do agree, @Matti
Rubens Farias
I'm getting 30,701ms for 14 calls with this.
Michael G
@Michael, maybe you should check how your data is loaded; it doesn't make any sense for it to take 30 s to compile a distinct column list.
Rubens Farias
@Michael: You mean it's slower? And 14 calls to what?
Henk Holterman
That's 30 seconds for 14 calls to that function. 14 DataTables returned.
Michael G
WRT "and from there add the names to your table", this seems to be missing the "DataStream" column behavior from the original code. If the OP knows that "DataStream" won't show up in any of the 'real' columns, then we can just add that specific column to the final datatable, but otherwise we'd need to add it in to the results so that it was in columnsToAdd. BTW, after that is done with, if you add another Select(colName => new DataColumn(colName)) then you can dataTable.Columns.AddRange in 1 line :)
James Manning
Sorry, I stand corrected; this is running much faster. I had it profiling an older version of the method. 4,465 ms for 50 calls to this function.
Michael G
+3  A: 

Your code (only) runs foreach loops, so the conclusion that the method spends most of its time in MoveNext() et al. is not so surprising.

You are doing double work on both the IsNullOrEmpty check and the Distinct (which is repeated by the HashSet).

My version would be:

private static DataTable CreateTableFromReportDataColumns(Report report)
{
    DataTable table = new DataTable();
    HashSet<String> colsToAdd = new HashSet<String> { "DataStream" };
    foreach (ReportData reportData in report.ReportDatas)
    {

        foreach (var column in reportData.ReportDataColumns)
        {
            if (!String.IsNullOrEmpty(column.Name))
                colsToAdd.Add(column.Name);
        }
    }

    foreach (string col in colsToAdd)
    {
        table.Columns.Add(col);
    }

    return table;
}

But I don't expect a huge improvement.

Henk Holterman
This is in the lead so far with 18,428ms for 187 calls
Michael G
A: 
  • the string.IsNullOrEmpty check is repeated
  • you can get rid of the foreach loops by using SelectMany (I see Anthony just posted the same :)
  • to keep the same semantics for the "DataStream" column (after switching to Anthony's version), you could do new HashSet<string>(columnsToAdd) { "DataStream" }, but it might be easier/faster to just add the "DataStream" string (via Concat or Union or whatever) and then Distinct() the result, avoiding the HashSet creation (might as well profile both)
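The Concat-then-Distinct idea above could look roughly like this (a rough sketch, not tested against the OP's data; it assumes the same Report/ReportData/ReportDataColumn shapes as in the question and `using System.Data; using System.Linq;`):

    // Fold the fixed "DataStream" column into the stream, then de-dupe once,
    // so no explicit HashSet is needed.
    var columnNames = report.ReportDatas
        .SelectMany(rd => rd.ReportDataColumns)
        .Select(rdc => rdc.Name)
        .Where(name => !string.IsNullOrEmpty(name))
        .Concat(new[] { "DataStream" })
        .Distinct();

    DataTable table = new DataTable();
    // AddRange builds all the columns in one call, as mentioned above.
    table.Columns.AddRange(columnNames.Select(n => new DataColumn(n)).ToArray());

Note that with this ordering "DataStream" ends up last unless it already appears among the real column names; if column order matters, Concat it in front instead.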

It might be overkill for this (depending on the number of entries in ReportDatas, the number of columns in each ReportDataColumns, the number of cores on the host machine, etc.), but you could also potentially parallelize.

If you decided to process the ReportDatas entries in parallel, for instance, you could have each one either create its own collection of columns or have them all write into a ConcurrentBag that you Distinct when it's all done or whatever.
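The ConcurrentBag variant could be sketched like this (hypothetical and unprofiled; DataTable itself is not thread-safe, so all column additions happen on one thread after the parallel phase, and it assumes `using System.Collections.Concurrent; using System.Data; using System.Linq; using System.Threading.Tasks;`):

    // Each ReportData is processed in parallel; names go into a shared
    // ConcurrentBag and duplicates are removed once at the end.
    var bag = new ConcurrentBag<string>();
    Parallel.ForEach(report.ReportDatas, reportData =>
    {
        foreach (var column in reportData.ReportDataColumns)
        {
            if (!string.IsNullOrEmpty(column.Name))
                bag.Add(column.Name);
        }
    });

    DataTable table = new DataTable();
    table.Columns.Add("DataStream");
    // Skip "DataStream" here because DataTable throws on duplicate column names.
    foreach (string name in bag.Distinct().Where(n => n != "DataStream"))
        table.Columns.Add(name);

Whether this wins depends entirely on the data sizes; for small ReportDatas the Parallel.ForEach overhead will likely dominate, so profile before committing to it.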

James Manning
James, that "DataStream" problem is best solved with a simple if/then on the final DataColumns.
Henk Holterman
A: 

This might be a slight improvement on Henk's code. It takes advantage of the fact that the HashSet will tell you whether the Add operation succeeded or the element already existed.

private static DataTable CreateTableFromReportDataColumns(Report report)
{
    HashSet<string> uniqueNames = new HashSet<string> { null, "", "DataStream" };

    DataTable table = new DataTable();
    table.Columns.Add("DataStream");

    foreach (ReportData reportData in report.ReportDatas)
    {
        foreach (var dataColumn in reportData.ReportDataColumns)
        {
            if (uniqueNames.Add(dataColumn.Name))
            {
                table.Columns.Add(dataColumn.Name);
            }
        }
    }

    return table;
}

Edit: I went ahead and added null and "" to the hash set at the beginning, so we no longer need the check for null or empty.

Scott P
+1  A: 

You should have mentioned LINQ to SQL when you asked the question; then you would have gotten some responses suggesting you look into your database to see whether it's a long-running query or repeated round-trip querying.

private static DataTable CreateTableFromReportDataColumns(Report report) 
{ 
    DataTable table = new DataTable(); 
    table.Columns.Add("DataStream");
    IEnumerable<string> moreColumns = report.ReportDatas
      .SelectMany(z => z.ReportDataColumns)
      .Select(x => x.Name)
      .Where(s => s != null && s != "")
      .Distinct();

    foreach (string col in moreColumns) 
    { 
        table.Columns.Add(col); 
    } 

    return table; 
} 

Also, capture the query issued by using the SQL profiler. Then analyze the IO and TIME of the query by running it with these statements before it:

SET STATISTICS TIME ON
SET STATISTICS IO ON
  --your query here

Lastly, you may need an index or two to bring the IO down. Column order is important here.

CREATE INDEX IX1_ReportData ON ReportData(ReportID, Id)
CREATE INDEX IX1_ReportDataColumn ON ReportDataColumn(ReportDataId, Name)
David B