Seriously? This topic! Yes. JavaScript actually has two different representations of ZERO:
- Positive zero, represented by +0 / 0
- Negative zero, represented by -0
This is because JavaScript implements the IEEE Standard for Floating-Point Arithmetic (IEEE 754),
which has signed zeroes.
And note that both zeroes compare as equal to one another:
+0 === -0 // true
+0 == -0 // true
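This equality carries over into the standard library as well. A couple of extra illustrations of my own (not from the original post), easy to verify in any console:

[0].includes(-0)  // true  (includes uses SameValueZero, which treats +0 and -0 as equal)
[0].indexOf(-0)   // 0     (indexOf uses ===, so -0 is found at index 0)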
The only difference between the two shows up when dealing with Infinity:
1 / +0 // Infinity
1 / -0 // -Infinity
-1 / +0 // -Infinity
-1 / -0 // Infinity
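For context, -0 is not just something you have to type by hand; a few everyday expressions produce it. These examples are my own additions and can be checked in any JavaScript console:

-1 * 0            // -0
Math.round(-0.4)  // -0
Math.trunc(-0.3)  // -0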
Numbers always need to be encoded to be stored digitally. But why do some encodings have two zeroes?
Let us look at encoding an integer as a 4-bit binary number using the sign-and-magnitude method.
Here,
- One bit denotes the sign (0 if positive, 1 if negative).
- The remaining bits denote the magnitude (absolute value).
Therefore, -2 and +2 are encoded as,
1 | 0 | 1 | 0 | => -2
0 | 0 | 1 | 0 | => +2
Which means we also end up with two zeroes!
1000 // -0
0000 // +0
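JavaScript numbers are 64-bit IEEE 754 doubles rather than 4-bit sign-and-magnitude integers, but the idea is the same: a single bit carries the sign, and the remaining bits carry the magnitude (plus an exponent). As a rough sketch, assuming only the standard DataView API (the helper name signBit is mine), you can peek at that bit directly:

// Writes the number as an 8-byte big-endian IEEE 754 double
// and reads back the most significant bit, which is the sign bit.
function signBit(n) {
  const view = new DataView(new ArrayBuffer(8));
  view.setFloat64(0, n); // big-endian by default
  return view.getUint8(0) >> 7;
}

signBit(0);  // 0 (+0 has sign bit 0)
signBit(-0); // 1 (-0 has sign bit 1)
signBit(-2); // 1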
Hunt 🤔
We saw that -0 and 0 are equal.
Imagine you came across a use case where comparing -0 and 0 should return false.
How would you do that? (Comment below)
😎Thanks for Reading | Happy Coding⚡