I just asked this question, which led me to a new question :)
Up until this point, I have used the following pattern for selecting with LINQ to SQL, so that I can handle 0 "rows" returned by the query:
var person = (from p in [DataContextObject].Persons
              where p.PersonsID == 1
              select p).FirstOrDefault();
if (person == null)
{
// handle 0 "rows" returned.
}
But I can't use FirstOrDefault() when I do:
var person = from p in [DataContextObject].Persons
             where p.PersonsID == 1
             select new { p.PersonsID, p.PersonsAdress, p.PersonsZipcode };
// Under the hood, this pattern generates a query that selects only the
// specified columns, which is faster than selecting all columns as the
// snippet above does. On large tables this gives a performance boost.
How do I check for 0 "rows" returned by the query, using the second pattern?
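To be concrete, this is roughly the shape I am after, assuming the null check even works on an anonymous type (I have not verified that this compiles):

var person = (from p in [DataContextObject].Persons
              where p.PersonsID == 1
              select new { p.PersonsID, p.PersonsAdress, p.PersonsZipcode })
             .FirstOrDefault();

if (person == null)
{
    // handle 0 "rows" returned.
}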
UPDATE:
I think my build fails because I am trying to assign the result of the query to a variable (this._user) declared with the type [DataContext].User.
this._user = (from u in [DataContextObject].Users
              where u.UsersID == [Int32]
              select new { u.UsersID }).FirstOrDefault();
Compilation error: Cannot implicitly convert type "AnonymousType#1" to "[DataContext].User".
Any thoughts on how I can get around this? Would I have to make my own object?
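For example, would it have to look something like this? (UserSummary and this._userSummary are just names I made up for illustration; I don't know if projecting into a non-entity class like this is the right approach.)

// Hypothetical lightweight class, not part of the generated DataContext.
public class UserSummary
{
    public int UsersID { get; set; }
}

// Project the selected columns into the custom class instead of an anonymous type,
// so the result can be stored in a typed field.
this._userSummary = (from u in [DataContextObject].Users
                     where u.UsersID == [Int32]
                     select new UserSummary { UsersID = u.UsersID }).FirstOrDefault();

if (this._userSummary == null)
{
    // handle 0 "rows" returned.
}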