To see the differences, we can look at the documentation:
Documentation for money:
Data type    Range                                                      Storage
money        -922,337,203,685,477.5808 to 922,337,203,685,477.5807     8 bytes
smallmoney   -214,748.3648 to 214,748.3647                             4 bytes
The money and smallmoney data types are accurate to a ten-thousandth of the monetary units that they represent.
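That last line matters: money always keeps exactly four decimal places, so anything beyond the fourth digit is rounded away on conversion. A minimal sketch in T-SQL (the values are arbitrary; DATALENGTH returns the byte length of an expression):

SELECT CAST(1.23456 AS money);             -- 1.2346 (rounded to four decimals)
SELECT DATALENGTH(CAST(0 AS money));       -- 8 (bytes)
SELECT DATALENGTH(CAST(0 AS smallmoney));  -- 4 (bytes)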
Compare to decimal:
When maximum precision is used, valid values are from -10^38 + 1 through 10^38 - 1.
Precision    Storage
1 - 9        5 bytes
10 - 19      9 bytes
20 - 28      13 bytes
29 - 38      17 bytes
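The storage column can be checked directly with DATALENGTH; a quick sketch, with one value per precision band:

SELECT DATALENGTH(CAST(1 AS DECIMAL(9,0)));   -- 5 bytes
SELECT DATALENGTH(CAST(1 AS DECIMAL(19,4)));  -- 9 bytes
SELECT DATALENGTH(CAST(1 AS DECIMAL(28,0)));  -- 13 bytes
SELECT DATALENGTH(CAST(1 AS DECIMAL(38,0)));  -- 17 bytes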
So they're not exactly equivalent, just similar. A DECIMAL(19,4) has a slightly greater range than MONEY (it can store values from -10^15 + 0.0001 to 10^15 - 0.0001), but it also needs one more byte of storage (9 bytes instead of 8).
In other words, this works:
CREATE TABLE Table1 (test DECIMAL(19,4) NOT NULL);
INSERT INTO Table1 (test) VALUES (999999999999999.9999);
SELECT * FROM Table1;
-- 999999999999999.9999
But this doesn't (assuming the first Table1 has been dropped):
CREATE TABLE Table1 (test MONEY NOT NULL);
INSERT INTO Table1 (test) VALUES (999999999999999.9999);
SELECT * FROM Table1;
-- Arithmetic overflow error converting numeric to data type money.
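For completeness, the documented MONEY maximum does fit into that same table (a quick check):

INSERT INTO Table1 (test) VALUES (922337203685477.5807);
SELECT * FROM Table1;
-- 922337203685477.5807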
There's also a semantic difference. If you want to store monetary values, it makes sense to use the money type, since its name makes the intent explicit.
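The language itself reflects that intent: T-SQL treats a literal prefixed with a currency symbol as a money constant. A small sketch (the value is arbitrary):

SELECT SQL_VARIANT_PROPERTY($1.50, 'BaseType');  -- money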