Is there any difference between DECIMAL and NUMERIC data types in MS SQL?
When should I use DECIMAL and when NUMERIC?
They are the same. NUMERIC is functionally equivalent to DECIMAL.
MSDN: decimal and numeric
To my knowledge there is no difference between the NUMERIC and DECIMAL data types. They are synonyms, and either one can be used. Both are exact numeric data types with fixed precision and scale.
Edit:
Speaking to a few colleagues, it may have something to do with DECIMAL being the ANSI SQL standard and NUMERIC being the one Microsoft prefers, as it is more commonly found in programming languages. ...Maybe ;)
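As a rough illustration that the two behave identically in SQL Server, here is a minimal sketch (the table and column names are hypothetical):

CREATE TABLE dbo.PriceSample
(
    PriceDecimal DECIMAL(10, 2) NOT NULL,  -- exact numeric: 10 total digits, 2 after the decimal point
    PriceNumeric NUMERIC(10, 2) NOT NULL   -- declared with NUMERIC, stored and treated the same way
);

INSERT INTO dbo.PriceSample (PriceDecimal, PriceNumeric)
VALUES (12345678.90, 12345678.90);

-- Both columns hold the same value and mix freely in arithmetic
SELECT PriceDecimal, PriceNumeric, PriceDecimal - PriceNumeric AS Difference
FROM dbo.PriceSample;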
This is what the SQL:2003 standard (§6.1 Data Types) says about the two:
<exact numeric type> ::=
NUMERIC [ <left paren> <precision> [ <comma> <scale> ] <right paren> ]
| DECIMAL [ <left paren> <precision> [ <comma> <scale> ] <right paren> ]
| DEC [ <left paren> <precision> [ <comma> <scale> ] <right paren> ]
| SMALLINT
| INTEGER
| INT
| BIGINT
...
21) NUMERIC specifies the data type exact numeric, with the decimal precision and scale specified by the <precision> and <scale>.

22) DECIMAL specifies the data type exact numeric, with the decimal scale specified by the <scale> and the implementation-defined decimal precision equal to or greater than the value of the specified <precision>.
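So per the standard, DECIMAL is allowed to carry a precision greater than the one you declare, while NUMERIC must use exactly the declared precision. If you want to check how SQL Server actually handles this, a quick sketch with SQL_VARIANT_PROPERTY (the literal values are just examples) suggests both types keep exactly the declared precision:

-- Both columns should return 5: SQL Server reports the declared precision
-- for DECIMAL as well as NUMERIC, so in practice the two are interchangeable
SELECT
    SQL_VARIANT_PROPERTY(CAST(1.5 AS DECIMAL(5, 2)), 'Precision') AS DecimalPrecision,
    SQL_VARIANT_PROPERTY(CAST(1.5 AS NUMERIC(5, 2)), 'Precision') AS NumericPrecision;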