I have a list of entities containing ~137,000 records that I loop through. For each entity I then need a LINQ lookup into a list of Tuples of additional parameters, which contains ~150,000 items.
Why does it keep taking longer the more iterations it does? Here is the stopwatch output:
Found: 136770 items that match the criteria.
10,000 items processed. ElapsedTime: 5473 ms (0.09 minutes)
20,000 items processed. ElapsedTime: 15307 ms (0.26 minutes)
30,000 items processed. ElapsedTime: 30065 ms (0.50 minutes)
50,000 items processed. ElapsedTime: 74507 ms (1.24 minutes)
75,000 items processed. ElapsedTime: 157836 ms (2.63 minutes)
100,000 items processed. ElapsedTime: 272495 ms (4.54 minutes)
Total. ElapsedTime: 499663 ms (8.33 minutes)
Is there some way to optimize this?
List<Entity> alMatched;   // ~137,000 matched entities, populated earlier
List<Tuple<int, double, int, int>> lsItems = new List<Tuple<int, double, int, int>>();
IEnumerable<Tuple<int, double, int, int>> enumThingy = lsItems;

for (int z = 0; z < alMatched.Count; z++)
{
    Entity a = alMatched[z];
    // Linear scan of the ~150,000 tuples on every iteration
    var newRepl = enumThingy.Where(d => d.Item1 == a.ID).FirstOrDefault();
    if (newRepl != null)
    {
        // ... use newRepl ...
    }

    switch (z)
    {
        case 10000:
        case 20000:
        case 30000:
        case 50000:
        case 75000:
        case 100000:
            Debug.Print(z.ToString("N0") + " items processed " + ElapsedTime(sw.ElapsedMilliseconds));
            break;
    }
}
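For context, the slowdown seems consistent with the Where(...).First() doing a linear scan of the ~150,000 tuples on every pass, and scanning further each time if the matching tuples for later entities sit later in lsItems. One idea I'm considering (a sketch, assuming the Item1 values can serve as unique keys) is to build a Dictionary once and probe it per entity instead of scanning the list:

```csharp
// Sketch (assumption): key the tuples by Item1 once, up front,
// so each entity costs one hash lookup instead of a list scan.
var byId = new Dictionary<int, Tuple<int, double, int, int>>();
foreach (var t in lsItems)
    byId[t.Item1] = t;   // if IDs repeat, the last tuple wins

foreach (Entity a in alMatched)
{
    Tuple<int, double, int, int> newRepl;
    if (byId.TryGetValue(a.ID, out newRepl))
    {
        // ... same processing as before ...
    }
}
```

Would that be the right approach here, or is there something better?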
Regards
_Eric