Why do computers suck at math?

Archit Sharma - Jul 25 '22 - Dev Community
I discovered something strange while experimenting with JavaScript a few days ago, so I did some research and found my answer.
I'm writing this article in the hope that some of you will find it useful or learn something new.

I used to believe that computers were better at math than humans, until I tried adding 0.1 + 0.2 and got the result 0.30000000000000004 in my browser console.

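Here is what that looks like in the browser console:

```javascript
0.1 + 0.2
// 0.30000000000000004
```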
Then I tried the comparison 0.1 + 0.2 === 0.3 and the output was false.

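```javascript
0.1 + 0.2 === 0.3
// false
```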
I initially believed it was a bug in JavaScript, so I tried the same thing in Java and Python and got the same result in both.

After doing a lot of research, I concluded that this is not a bug.
It is simply math at work: floating-point arithmetic.
Let's go a little deeper to see what's going on behind the scenes.

Computers have a limited amount of memory, so they need to make a trade-off between range and precision.
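You can see both sides of that trade-off in the console using JavaScript's built-in Number constants:

```javascript
// Enormous range...
console.log(Number.MAX_VALUE); // 1.7976931348623157e+308
console.log(Number.MIN_VALUE); // 5e-324
// ...but limited precision: the gap between 1 and the next representable number
console.log(Number.EPSILON);   // 2.220446049250313e-16
```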

Numbers in JavaScript are stored in 64 bits (the IEEE 754 double-precision format), which means integers are accurate up to 15 digits and you get at most about 17 significant digits of precision. It is called floating point because there is no fixed number of digits before or after the decimal point, which lets it represent a wide range of numbers, both big and small.
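For example, integers stop being exactly representable once they pass Number.MAX_SAFE_INTEGER (2^53 - 1):

```javascript
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991
// Past that point not every integer exists as a distinct value,
// so two different integers can compare as equal
console.log(9007199254740992 === 9007199254740993); // true
```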

The problem is that computers use a base-2 (binary) system while we humans use a base-10 (decimal) system. Many decimal fractions, such as 0.1, are repeating fractions in binary, so they cannot be stored exactly; once the 64 bits are used up, the value has to be rounded, and that rounding is where the error comes from.

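Fractions that are "clean" in binary behave perfectly, while 0.1 and 0.2 are repeating fractions in binary and get rounded the moment they are stored. Asking for extra digits makes this visible (the exact trailing digits shown below may differ slightly depending on how many you request):

```javascript
// 1/2 and 1/4 are exact in binary, so nothing is lost
console.log(0.5 + 0.25 === 0.75); // true

// 0.1 and 0.2 are not, so the stored values are already slightly off
console.log((0.1).toFixed(20)); // "0.10000000000000000555"
console.log((0.2).toFixed(20)); // "0.20000000000000001110"
```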

This is the reason we get 0.1 + 0.2 = 0.30000000000000004: both operands are already rounded before the addition happens, and their sum lands on a stored value slightly above 0.3.
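Printing the sum and the literal 0.3 with extra digits shows that they land on two different stored values, which is also why the === comparison returns false (again, the trailing digits are approximate):

```javascript
console.log((0.1 + 0.2).toFixed(20)); // "0.30000000000000004441"
console.log((0.3).toFixed(20));       // "0.29999999999999998890"
```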

Thank you for reading this article; I haven't gone through all of the math behind it, just enough for you to understand what's going on.
