There is a lot of discussion about whether to use the MONEY or DECIMAL data type in SQL Server for storing financial data. It seems the debate is mostly about the possible loss of precision when using the MONEY data type.
If I understand it correctly, this loss can occur when we perform calculations on these values in stored procedures using T-SQL.
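To illustrate what I mean, here is a minimal T-SQL sketch (the variable names and values are chosen only for demonstration, and the exact results are my understanding rather than something I have verified on every version): because MONEY arithmetic keeps only four decimal places in intermediate results, dividing and then multiplying can give a noticeably different answer than the same calculation done with DECIMAL.

    -- Sketch only: MONEY rounds intermediate results to 4 decimal places,
    -- while DECIMAL(19,4) division keeps more intermediate precision.
    DECLARE @mon1 MONEY = 100,
            @mon2 MONEY = 339,
            @mon3 MONEY = 10000;

    DECLARE @dec1 DECIMAL(19,4) = 100,
            @dec2 DECIMAL(19,4) = 339,
            @dec3 DECIMAL(19,4) = 10000;

    SELECT @mon1 / @mon2 * @mon3 AS money_result,   -- roughly 2949.0000 (intermediate result truncated to 4 decimals)
           @dec1 / @dec2 * @dec3 AS decimal_result; -- roughly 2949.8525 (intermediate result kept at higher precision)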
Am I right to assume that, if I use LINQ to SQL and all query results from MONEY columns are held in C# decimal variables (so the calculations are done using decimals), it doesn't matter which data type (MONEY or DECIMAL) I use in the database to represent monetary values?