I have a class A { public float Score; ... }
and an IEnumerable<A> items,
and I would like to find the A
that has the minimal Score.
Using items.Min(x => x.Score)
gives the minimal score itself, not the instance that has it.
How can I get the instance by iterating only once through my data?
Edit: So far there are three main solutions:
Writing an extension method (proposed by Svish). Pros: Easy to use and evaluates Score only once per item. Cons: Needs an extension method. (I chose this solution for my application.)
Using Aggregate (proposed by Daniel Renshaw). Pros: Uses a built-in LINQ method. Cons: Slightly obfuscated to the untrained eye and evaluates Score more than once per item.
Implementing IComparable (proposed by cyberzed). Pros: Can use Linq.Min directly. Cons: Tied to a single comparison; you cannot freely choose the comparer when computing the minimum.
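The extension-method approach could be sketched like this (the name MinBy and the generic shape are my own choices for illustration, not necessarily Svish's exact code). It iterates the sequence once and calls the selector exactly once per item:

```csharp
using System;
using System.Collections.Generic;

public static class EnumerableExtensions
{
    // Sketch of a MinBy extension: single pass, selector evaluated once per item.
    public static T MinBy<T, TKey>(this IEnumerable<T> source, Func<T, TKey> selector)
        where TKey : IComparable<TKey>
    {
        using (var e = source.GetEnumerator())
        {
            if (!e.MoveNext())
                throw new InvalidOperationException("Sequence contains no elements");

            T best = e.Current;
            TKey bestKey = selector(best);
            while (e.MoveNext())
            {
                TKey key = selector(e.Current);
                if (key.CompareTo(bestKey) < 0)
                {
                    best = e.Current;
                    bestKey = key;
                }
            }
            return best;
        }
    }
}
```

Usage is then simply items.MinBy(x => x.Score). (For what it's worth, .NET 6 later added a built-in Enumerable.MinBy with essentially this shape.)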
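The Aggregate approach is a one-liner over built-in LINQ; a self-contained sketch (the class A is taken from the question, the sample values are made up):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class A { public float Score; }

class Program
{
    static void Main()
    {
        var items = new List<A>
        {
            new A { Score = 3.0f },
            new A { Score = 1.5f },
            new A { Score = 2.7f },
        };

        // Keep whichever of the running minimum and the current item has the
        // smaller Score. Note that Score is read more than once per comparison.
        A best = items.Aggregate((min, x) => x.Score < min.Score ? x : min);

        Console.WriteLine(best.Score); // prints 1.5
    }
}
```

This still makes only one pass over the data; the "calls evaluator more than once" drawback refers to Score being read on both operands at each step, which matters if the key is expensive to compute.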
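The IComparable route might look like the following sketch: once A implements IComparable<A>, the parameterless Enumerable.Min uses the default comparer and returns the instance itself rather than a projected value.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Making A comparable by Score lets Min() return the instance directly.
class A : IComparable<A>
{
    public float Score;

    public int CompareTo(A other) => Score.CompareTo(other.Score);
}

class Program
{
    static void Main()
    {
        var items = new List<A>
        {
            new A { Score = 3.0f },
            new A { Score = 1.5f },
            new A { Score = 2.7f },
        };

        A best = items.Min(); // uses CompareTo, returns the instance

        Console.WriteLine(best.Score); // prints 1.5
    }
}
```

The drawback noted above is visible here: the ordering is baked into A, so every call to Min() (or Max(), or Sort()) uses Score; you cannot swap in a different key without a different mechanism.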