views:

5151

answers:

6

I'm curious as to whether or not there is a real difference between the money datatype and something like decimal(19,4) (which is what money uses internally, I believe).

I'm aware that money is specific to SQL Server. What I want to know is if there is a compelling reason to choose one over the other; most SQL Server samples (e.g. the AdventureWorks database) use money and not decimal for things like price information.

Should I just continue to use the money datatype, or is there a benefit to using decimal instead? Money is fewer characters to type but that's not a valid reason :)

+7  A: 

Never, ever use money; it is not precise and it is pure garbage. Always use decimal/numeric.

Run this to see what I mean:

DECLARE
    @mon1 MONEY,
    @mon2 MONEY,
    @mon3 MONEY,
    @mon4 MONEY,
    @num1 DECIMAL(19,4),
    @num2 DECIMAL(19,4),
    @num3 DECIMAL(19,4),
    @num4 DECIMAL(19,4)

SELECT
    @mon1 = 100, @mon2 = 339, @mon3 = 10000,
    @num1 = 100, @num2 = 339, @num3 = 10000

SET @mon4 = @mon1/@mon2*@mon3
SET @num4 = @num1/@num2*@num3

SELECT @mon4 AS moneyresult,
    @num4 AS numericresult

Output: 2949.0000 2949.8525
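The gap comes from the intermediate result: MONEY carries only four decimal places, so the division is cut to 0.2949 before the multiplication ever happens. As an illustration only (a Python sketch with the `decimal` module, not SQL Server itself; the assumption that the intermediate is cut down to four places with ROUND_DOWN is what reproduces the observed output):

```python
# A sketch only (assumed behaviour, not SQL Server itself): MONEY keeps just
# 4 decimal places in the intermediate division, so 100/339 becomes 0.2949
# (ROUND_DOWN reproduces the observed output) before multiplying by 10000.
from decimal import Decimal, ROUND_DOWN

CENT4 = Decimal("0.0001")  # MONEY's fixed scale of four decimal places

# MONEY-style: the division result is cut to 4 decimal places first
step = (Decimal(100) / Decimal(339)).quantize(CENT4, rounding=ROUND_DOWN)
money_result = step * Decimal(10000)

# DECIMAL(19,4)-style: the intermediate keeps full working precision,
# and only the final result is rounded to 4 decimal places
num_result = (Decimal(100) / Decimal(339) * Decimal(10000)).quantize(CENT4)

print(money_result)  # 2949.0000
print(num_result)    # 2949.8525
```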

To the people who said that you don't divide money by money:

Here is one of my queries to calculate correlations; changing it to money gives wrong results.

select t1.index_id,t2.index_id,(avg(t1.monret*t2.monret) 
    -(avg(t1.monret) * avg(t2.monret)))
            /((sqrt(avg(square(t1.monret)) - square(avg(t1.monret)))) 
            *(sqrt(avg(square(t2.monret)) - square(avg(t2.monret))))),
current_timestamp,@MaxDate
            from Table1 t1  join Table1 t2  on t1.Date = traDate
            group by t1.index_id,t2.index_id
SQLMenace
"Never" is a strong word. "Money" is useful for casting results to that type for display to the user in a culture-sensitive way, but you're right that it's very bad to use for the calculations itself.
Joel Coehoorn
Your example is not meaningful, since nobody will ever multiply two objects of type money. If you want to prove your point, you need to compare multiplying a money by a decimal to multiplying a decimal by a decimal.
Brian
I don't think there would be an instance of multiplying a money by money; money by int or money by float perhaps.
Wayne M
.. but it's still puzzling why money * money would not have the precision of money.
Learning
@Learning: it does have the precision of money. However, you still end up with rounding errors that can accumulate over time. The decimal type doesn't use binary arithmetic: it guarantees you get the same base 10 results you would get from doing it on paper.
Joel Coehoorn
It's really worth reading @configurator's answer right below.
Cawas
+7  A: 

SQLMenace said money is inexact. But you don't multiply/divide money by money! How much is 3 dollars times 50 cents? 150 dollarcents? You multiply/divide money by scalars, which should be decimal.

DECLARE
@mon1 MONEY,
@mon4 MONEY,
@num1 DECIMAL(19,4),
@num2 DECIMAL(19,4),
@num3 DECIMAL(19,4),
@num4 DECIMAL(19,4)

SELECT
@mon1 = 100,
@num1 = 100, @num2 = 339, @num3 = 10000

SET @mon4 = @mon1/@num2*@num3
SET @num4 = @num1/@num2*@num3

SELECT @mon4 AS moneyresult,
@num4 AS numericresult

This produces the correct result:

moneyresult           numericresult
--------------------- ---------------------------------------
2949.8525             2949.8525

money is good as long as you don't need more than 4 decimal digits, and you make sure your scalars - which do not represent money - are decimals.
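That split between storage and calculation can be sketched outside SQL as well. As an illustration only (Python's `decimal` module, not SQL Server): keep the stored amount at MONEY's four-decimal scale, do the scalar arithmetic at full decimal precision, and round only once at the end:

```python
# A sketch of the storage-vs-calculation split (assumption: amounts are stored
# at MONEY's four-decimal scale, but arithmetic uses full Decimal precision).
from decimal import Decimal

CENT4 = Decimal("0.0001")

stored = Decimal("100.0000")              # amount stored at MONEY's 4-dp scale
ratio = stored / Decimal(339)             # scalar math at full Decimal precision
result = (ratio * Decimal(10000)).quantize(CENT4)  # round once, at the very end

print(result)  # 2949.8525
```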

configurator
How many 1-cent coins can a dollar bill get you? An answer to this requires money / money.
Learning
I added a correlation query to my answer that I run for monthly performance, what would happen if I used money data type?
SQLMenace
@Learning in that case the result is not money, but a float. (Basic dimensional analysis.)
Richard
+1: Good explanation.
John Sansom
@Learning: Do you ask the database how many cents in a dollar a lot? Anyway, that would return the right result. His problem was that money / money was only precise to four digits (it was 0.2949), then when multiplied by 10000 it became 2949.0000.
configurator
@SQLMenace: I don't know; I guess in that case you would have to use a datatype with more than 4 decimal digits. That still doesn't mean you should never use money.
configurator
@SQLMenace: In my (financial) program I also don't use money; that is because I need to make calculations with 12 (or 18, I'm not sure) decimal digits (we have multipliers like 0.027681943654)
configurator
@configurator: I understood the fault in my logic (as you explained, the precision was 4 and the multiplication by 10K wiped it). Thanks.
Learning
FWIW: I do not ask the db about how many cents to a dollar a lot. That was an example, as I think there should be no corner cases in a db implementation.
Learning
A: 

I just saw this entry... http://thavash.spaces.live.com/blog/cns!CF6232111374DFD2!223.entry

...which basically says that money has a precision issue:

declare @m money
declare @d decimal(9,2)

set @m = 19.34
set @d = 19.34

select (@m/1000)*1000
select (@d/1000)*1000

With the money type, you will get 19.30 instead of 19.34. I am not sure if there is an application scenario that divides money into 1000 parts for calculation, but this example does expose some limitations.
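The same effect can be illustrated outside SQL. A sketch in Python's `decimal` module (an assumption about the behaviour, not SQL Server itself: the MONEY intermediate is cut to four decimal places, so 19.34 / 1000 = 0.01934 becomes 0.0193 before multiplying back):

```python
# A sketch (assumed behaviour, not SQL Server): the MONEY intermediate is cut
# to four decimal places, so 19.34 / 1000 = 0.01934 becomes 0.0193.
from decimal import Decimal

CENT4 = Decimal("0.0001")

money_style = (Decimal("19.34") / 1000).quantize(CENT4) * 1000
decimal_style = Decimal("19.34") / 1000 * 1000  # intermediate keeps precision

print(money_style.quantize(Decimal("0.01")))    # 19.30
print(decimal_style.quantize(Decimal("0.01")))  # 19.34
```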

Harsha
A: 

@configurator you may be right when you said

you don't multiply/divide money by money!

but as per SQL Server's data type precedence rules, when an operator combines two expressions of different data types, the expression whose data type has the lower precedence is converted to the data type with the higher precedence.

So in your example, when you divided money by decimal, the money value was implicitly converted to decimal (since decimal has higher precedence than money), and thus you got the same result for the decimal/decimal and money/decimal calculations.

N30
Yes, that is why the result is correct. My point was that for storage, money can be the right type, while for the actual calculation you'll probably want to use a decimal.
configurator
A: 

You have to be careful when you add/multiply/divide/subtract variables of different types, as SQL Server automatically casts the result to one of the two types. It's good practice to operate on variables of the same type, or to always explicitly cast the result (or even each variable within the operation/expression).

Damiano Fusco
A: 

This works: ROUND(yourmoneyvalue, 2), and if you need to do it in application code, Math.Round(yourmoneyvalue, 2).

Andy