Why does the select statement return two different values?
declare @tempDec decimal
set @tempDec = 1.0 / (1.0 + 1.0)
select @tempDec, 1.0 / (1.0 + 1.0)
I found out from a coworker just as I posted this.
You need to specify the precision and scale instead of relying on the defaults. A bare decimal defaults to decimal(18, 0), so the 0.5 assigned to @tempDec gets rounded to 1, while the inline expression keeps its fractional digits and returns 0.500000.
This works in this scenario: declare @tempDec decimal(3,2)
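For example, a minimal sketch of the fix (the commented results assume SQL Server's usual rounding on assignment):
declare @tempDec decimal(3, 2)
set @tempDec = 1.0 / (1.0 + 1.0)
select @tempDec, 1.0 / (1.0 + 1.0)
-- returns 0.50 and 0.500000: the same value, just at different scales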
From MSDN:
decimal[(p[, s])] and numeric[(p[, s])] Fixed precision and scale numbers. When maximum precision is used, valid values are from -10^38 + 1 through 10^38 - 1. The SQL-92 synonyms for decimal are dec and dec(p, s). numeric is functionally equivalent to decimal.
p (precision) The maximum total number of decimal digits that can be stored, both to the left and to the right of the decimal point. The precision must be a value from 1 through the maximum precision of 38. The default precision is 18.
s (scale) The maximum number of decimal digits that can be stored to the right of the decimal point. Scale must be a value from 0 through p. Scale can be specified only if precision is specified. The default scale is 0; therefore, 0 <= s <= p. Maximum storage sizes vary, based on the precision.
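You can check the defaults yourself; this is a small sketch using SQL_VARIANT_PROPERTY, which reports the precision and scale a value actually carries:
declare @tempDec decimal
set @tempDec = 0
select sql_variant_property(@tempDec, 'Precision') as [precision],
       sql_variant_property(@tempDec, 'Scale') as [scale]
-- returns 18 and 0, matching the documented defaults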
That's fine for literals like 1.0, but if you're pulling the data from table columns, you need to cast/convert the first evaluated operand in your expression so the division is done in decimal rather than integer arithmetic:
convert(decimal, [col1]) / ([col2] + [col3])
-or-
convert(decimal(15, 2), [col1]) / ([col2] + [col3])
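For instance, here is a sketch against a made-up temp table (the table and column names are purely for illustration):
-- hypothetical table, just to demonstrate the two behaviors
create table #t ([col1] int, [col2] int, [col3] int)
insert into #t values (1, 1, 1)

select [col1] / ([col2] + [col3])                          -- integer division: 0
     , convert(decimal(15, 2), [col1]) / ([col2] + [col3]) -- decimal division: 0.5
from #t

drop table #t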