Let's say we declare and initialize two DECIMAL variables in SQL Server like this:
DECLARE @num1 DECIMAL = 1.98
DECLARE @num2 DECIMAL = 2.2
then we use the following piece of code to compare the two values:
IF (@num1 != @num2)
    SELECT 0
ELSE IF (@num1 = @num2)
    SELECT 1
Contrary to what one might expect, the above code returns 1, implying the two values are equal! Click here for a live demo.
This happens because we did not include precision and scale in the declarations of the @num1 and @num2 variables, so the default values are used.
According to the documentation:
Scale can be specified only if precision is specified. The default scale is 0; therefore, 0 <= s <= p. Maximum storage sizes vary, based on the precision.
So the scale of both variables defaults to 0, and both values are therefore rounded to 2.
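A quick way to see the rounding is to select the variables right after assignment; with the default scale of 0, both come back as 2 (a minimal sketch of the behaviour described above):

DECLARE @num1 DECIMAL = 1.98
DECLARE @num2 DECIMAL = 2.2

-- Both values have already been rounded to scale 0, so this returns 2 and 2
SELECT @num1 AS num1, @num2 AS num2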
We can easily avoid the problem by declaring the variables with an explicit precision and scale:
DECLARE @num1 DECIMAL(10,2) = 1.98
DECLARE @num2 DECIMAL(10,2) = 2.2
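With the precision and scale spelled out, the comparison from the beginning of the post behaves as expected; a minimal sketch putting it all together:

DECLARE @num1 DECIMAL(10,2) = 1.98
DECLARE @num2 DECIMAL(10,2) = 2.2

-- The values keep their fractional parts, so they are no longer equal
-- and the first branch returns 0
IF (@num1 != @num2)
    SELECT 0
ELSE IF (@num1 = @num2)
    SELECT 1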