My LINQ query is not producing the expected output below. Basically, I want the sum of table3.cost for each matching table2.code/table2.class pair, grouped by table1.alias and ordered by table1.priority. I added two DataRelations to the DataSet:
ds.Relations.Add("Table1Table2", ds.Tables[1].Columns("ID"), ds.Tables[2].Columns("ParentID");
ds.Relations.Add("Table2Table3",
new DataColumn[] { ds.Tables[2].Columns["Code"], ds.Tables[2].Columns["Class"] },
new DataColumn[] { ds.Tables[3].Columns["Code"], ds.Tables[3].Columns["Class"] });
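For reference, this is the per-alias total I'm ultimately trying to get out of those two relations, written with GetChildRows. It's only an untested sketch of my intent; I'm assuming ID, ParentID and priority are int columns and cost is a decimal column.

// Untested sketch of the aggregation I want, using the two relations above.
// Assumes ID/ParentID/priority are ints and cost is a decimal.
var totals = table1.AsEnumerable()
    .OrderBy(alias => alias.Field<int>("priority"))
    .Select(alias => new
    {
        Name = alias.Field<string>("alias"),
        // follow Table1Table2, then Table2Table3, and sum the matching costs
        Cost = alias.GetChildRows("Table1Table2")
                    .SelectMany(child => child.GetChildRows("Table2Table3"))
                    .Sum(data => data.Field<decimal>("cost"))
    });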
var query = from aliases in table1.AsEnumerable()
            join children in table2.AsEnumerable()
                on aliases.Field<int>("ID") equals children.Field<int>("ParentID")
            orderby aliases.Field<int>("priority")
            select new
            {
                Name = aliases.Field<string>("alias"),
                // inner query: sum table3.cost per code/class group
                Cost = from data in table3.AsEnumerable()
                       group data by new { code = data.Field<string>("code"), classcode = data.Field<string>("class") } into datatemp
                       select new
                       {
                           cost = datatemp.Sum(x => x.Field<decimal>("cost"))
                       }
            };
Any ideas what I'm doing wrong? Thanks in advance!
table1: list of aliases and priorities
-----------------
alias     priority
alias1    1
alias2    2
alias3    4
alias4    3

table2: children records joined to table1 by ParentID (1-to-many)
-----------------
code      class     ParentID
code1     class1    1
code2     class2    1
code3     class3    2
code4     class4    4

table3: data, joined to table2 by code and class (1-to-many)
-----------------
code      class     cost
code1     class1    1.00
code1     class1    10.00
code1     class1    26.00
code2     class2    5.00
code2     class2    0.00
code3     class3    1000.00
Expected output (ordered by table1.priority):
alias1    42.00
alias2    1000.00
alias4    0.00
alias3    0.00

For example, alias1's 42.00 is the sum of its code1/class1 rows (1.00 + 10.00 + 26.00) plus its code2/class2 rows (5.00 + 0.00); alias4 and alias3 have no matching costs, so they should show 0.00.
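In case it helps to reproduce this, the sample data is loaded roughly like below before the code above runs. The column types and the table1 ID values are my assumptions, since they aren't shown in the dumps above.

// Rough sketch of how the sample DataSet is populated.
// Column types and table1 ID values are assumed; the real DataSet may differ.
var ds = new DataSet();

var table1 = ds.Tables.Add("Table1");
table1.Columns.Add("ID", typeof(int));
table1.Columns.Add("alias", typeof(string));
table1.Columns.Add("priority", typeof(int));
table1.Rows.Add(1, "alias1", 1);
table1.Rows.Add(2, "alias2", 2);
table1.Rows.Add(3, "alias3", 4);
table1.Rows.Add(4, "alias4", 3);

var table2 = ds.Tables.Add("Table2");
table2.Columns.Add("Code", typeof(string));
table2.Columns.Add("Class", typeof(string));
table2.Columns.Add("ParentID", typeof(int));
table2.Rows.Add("code1", "class1", 1);
table2.Rows.Add("code2", "class2", 1);
table2.Rows.Add("code3", "class3", 2);
table2.Rows.Add("code4", "class4", 4);

var table3 = ds.Tables.Add("Table3");
table3.Columns.Add("Code", typeof(string));
table3.Columns.Add("Class", typeof(string));
table3.Columns.Add("cost", typeof(decimal));
table3.Rows.Add("code1", "class1", 1.00m);
table3.Rows.Add("code1", "class1", 10.00m);
table3.Rows.Add("code1", "class1", 26.00m);
table3.Rows.Add("code2", "class2", 5.00m);
table3.Rows.Add("code2", "class2", 0.00m);
table3.Rows.Add("code3", "class3", 1000.00m);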