Declare @BadDecimal varchar(5)
Set @BadDecimal = '4.5'
Declare @GoodDecimal Decimal
Set @GoodDecimal = @BadDecimal
Select @GoodDecimal
--Outputs 5
Why?
Casting or converting to an explicit DECIMAL(precision, scale) works:
SELECT CONVERT(DECIMAL(5, 2), '4.5')
SELECT CAST('4.5' AS DECIMAL(5,2))
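--Both output 4.50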
SQL Server's documentation states that each combination of precision and scale is handled as a separate data type.
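One way to see that for yourself is SQL_VARIANT_PROPERTY, which reports the base type, precision, and scale of a value (the column aliases here are just illustrative):

SELECT
    SQL_VARIANT_PROPERTY(CAST('4.5' AS DECIMAL(5, 2)), 'BaseType')  AS BaseType,  --decimal
    SQL_VARIANT_PROPERTY(CAST('4.5' AS DECIMAL(5, 2)), 'Precision') AS Prec,      --5
    SQL_VARIANT_PROPERTY(CAST('4.5' AS DECIMAL(5, 2)), 'Scale')     AS Scl        --2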
Try
Declare @GoodDecimal Decimal(2,1)
edit: changed to (2,1) after a request.
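For example, the original script with just that one change:

Declare @BadDecimal varchar(5)
Set @BadDecimal = '4.5'
Declare @GoodDecimal Decimal(2,1)
Set @GoodDecimal = @BadDecimal
Select @GoodDecimal
--Outputs 4.5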
Coentjie beat me to it arrggg...
You need to declare your decimals using the format Decimal(p, s):
p = The maximum total number of decimal digits that can be stored, both to the left and to the right of the decimal point. The precision must be a value from 1 through the maximum precision of 38. The default precision is 18.
s = The maximum number of decimal digits that can be stored to the right of the decimal point. Scale must be a value from 0 through p. Scale can be specified only if precision is specified. The default scale is 0; therefore, 0 <= s <= p. Maximum storage sizes vary, based on the precision.
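Since the default precision is 18 and the default scale is 0, a bare DECIMAL is effectively DECIMAL(18, 0), which is why the question's snippet rounds '4.5' up to 5. A quick side-by-side to see the difference:

Declare @Implicit Decimal        --same as Decimal(18, 0)
Declare @Explicit Decimal(18, 1)
Set @Implicit = '4.5'
Set @Explicit = '4.5'
Select @Implicit, @Explicit
--Outputs 5 and 4.5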