The NUMERIC data type is very similar to DECIMAL. The difference defined by the SQL standard is one of strictness: a DECIMAL(p, s) column may be given a higher precision than the user specifies, at the implementation's discretion, whereas a NUMERIC(p, s) column must use exactly the declared precision. If the user does not specify a precision, the implementation's default values apply to both types.

In other words, a DECIMAL column may accept values that overflow the declared precision, up to whatever extra precision the system provides by default, while a NUMERIC column may not. In practice, most popular databases (PostgreSQL, MySQL, SQL Server) treat the two types as identical and enforce the declared precision for both.
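A short SQL sketch may make the distinction concrete. The table and column names here are made up for illustration, and the overflow behavior shown in the comments is what the standard permits, not what every engine actually does:

```sql
-- Per the SQL standard, NUMERIC(5,2) must hold exactly 5 digits of
-- precision, while DECIMAL(5,2) must hold at least 5 and the
-- implementation may silently provide more.
CREATE TABLE prices (
    list_price NUMERIC(5, 2),  -- exactly 5 total digits, 2 after the point
    sale_price DECIMAL(5, 2)   -- at least 5 total digits; possibly more
);

-- Fits both columns: 3 digits before the point, 2 after.
INSERT INTO prices VALUES (123.45, 123.45);

-- 6 digits of precision: a NUMERIC column must reject this, while a
-- DECIMAL column is permitted (but not required) to accept it,
-- depending on the implementation's defaults.
-- INSERT INTO prices (sale_price) VALUES (1234.56);
```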
