What's the difference between the SQL datatypes NUMERIC and DECIMAL? If databases treat these differently, I'd like to know how for at least: SQL Server, Oracle, DB2, MySQL, and PostgreSQL.

Furthermore, are there any differences in how database drivers interpret these types?

+2  A: 

They are synonyms, no difference at all.

At least on SQL Server, and in the ANSI SQL standard. This SO answer shows some differences in the ANSI wording, but I suspect that in implementation they are the same.
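As a quick check on SQL Server, here is a minimal T-SQL sketch (the table name dbo.NumTest and its columns are made up for illustration) that declares one column of each type and compares what the catalog reports; both columns should come back with the same precision, scale, and storage size, with only the type name differing:

    -- Hypothetical test table: one DECIMAL column, one NUMERIC column
    CREATE TABLE dbo.NumTest (d DECIMAL(9,2), n NUMERIC(9,2));

    -- Compare what the catalog records for each column
    SELECT c.name, t.name AS type_name, c.precision, c.scale, c.max_length
    FROM sys.columns AS c
    JOIN sys.types AS t ON t.user_type_id = c.user_type_id
    WHERE c.object_id = OBJECT_ID('dbo.NumTest');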

gbn
Same for Oracle: http://techonthenet.com/oracle/datatypes.php
OMG Ponies
+2  A: 

They are the same for almost all purposes.

At one time different vendors used different names (Numeric/Decimal) for almost the same thing. SQL-92 made them the same, with one minor difference that can be vendor-specific:

Numeric must be exactly as precise as it is defined - so if you define 4 decimal places, the DB must always store 4 decimal places.

Decimal must be at least as precise as it is defined. This means that the database can actually store more digits than specified (because the behind-the-scenes storage has space for extra digits). So the database might store 1.00005 instead of 1.0000, affecting future calculations.

In SQL Server, Numeric is defined as being identical to Decimal in every way - both will always store only the specified number of decimal places.
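To illustrate that last point, a small T-SQL sketch (the values are made up; the inline DECLARE initialization assumes SQL Server 2008 or later). Both types should round the input to the declared four decimal places rather than keeping the extra digit:

    -- Both variables are declared with scale 4 and assigned a 5-decimal value
    DECLARE @d DECIMAL(10,4) = 1.00005;
    DECLARE @n NUMERIC(10,4) = 1.00005;

    -- Both should return 1.0001; neither keeps the extra digit
    SELECT @d AS decimal_value, @n AS numeric_value;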

David
is the decimal/numeric thing not the other way around? http://stackoverflow.com/questions/759401/is-there-any-difference-between-decimal-and-numeric-in-ms-sql/759606#759606
gbn
Yes, that is correct.
David