Declare @BadDecimal varchar(5)
Set @BadDecimal = '4.5'
Declare @GoodDecimal Decimal
Set @GoodDecimal = @BadDecimal
Select @GoodDecimal

--Outputs 5

Why?

+1  A: 

Try Set @BadDecimal = '4,5'
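For reference, a minimal check of that suggestion (assuming SQL Server's default varchar-to-numeric conversion rules) errors out rather than rounding:

Declare @BadDecimal varchar(5)
Set @BadDecimal = '4,5'
Declare @GoodDecimal Decimal
Set @GoodDecimal = @BadDecimal

--Fails with: Error converting data type varchar to numeric.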

A: 

Casting or converting to an explicit DECIMAL(precision, scale) works:

SELECT CONVERT(DECIMAL(5, 2), '4.5')
SELECT CAST('4.5' AS DECIMAL(5,2))

SQL Server Help states that each combination of precision and scale is handled as a separate data type.
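That is also what bites the original script: an unqualified DECIMAL defaults to DECIMAL(18,0), so the fraction is rounded away. A quick side-by-side sketch (same input as the question; the explicit precision/scale values are just for illustration):

SELECT CONVERT(DECIMAL(18,0), '4.5') --5: default scale 0 rounds the fraction away
SELECT CONVERT(DECIMAL(5, 2), '4.5') --4.50: an explicit scale keeps it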

devio
This doesn't work if you use the variable @BadDecimal in place of the string literal
digiguru
+5  A: 

Try

Declare @GoodDecimal Decimal(2,1)

edit: changed to (2,1) after request.
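Plugging that declaration into the original script (a sketch with nothing else changed) gives the expected result:

Declare @BadDecimal varchar(5)
Set @BadDecimal = '4.5'
Declare @GoodDecimal Decimal(2,1)
Set @GoodDecimal = @BadDecimal
Select @GoodDecimal

--Outputs 4.5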

Coentje
Very concise - cheers
digiguru
Thank you, I am always happy to be of service
Coentje
You should probably edit your post. Decimal(2,2) isn't right. The minimum precision and scale needed to store 4.5 are those of Decimal(2,1).
G Mastros
You are right, (2,1) is enough
Coentje
+3  A: 

Coentje beat me to it arrggg...

You need to declare your decimals using the format Decimal(p, s), where:

p = The maximum total number of decimal digits that can be stored, both to the left and to the right of the decimal point. The precision must be a value from 1 through the maximum precision of 38. The default precision is 18.

s = The maximum number of decimal digits that can be stored to the right of the decimal point. Scale must be a value from 0 through p. Scale can be specified only if precision is specified. The default scale is 0; therefore, 0 <= s <= p. Maximum storage sizes vary, based on the precision.
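A short sketch of how p and s interact (the specific values here are just an illustration):

Declare @d Decimal(5,2) --p=5, s=2: up to 3 digits left of the point, 2 to the right
Set @d = 999.99         --the largest value Decimal(5,2) can hold
Select @d               --999.99
--Set @d = 1000.00 would raise an arithmetic overflow error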

Gareth
Good to be first ;-) Normally I am the one that is just a little too slow.
Coentje