<script>
  var x = 5;    // number without decimal
  var y = 5.42; // number with decimal
</script>
<script>
  // Non-decimal (integer) accuracy test
  var x = 999999999999999;  // 15 digits
  var y = 9999999999999999; // 16 digits
  var z = 8999999999999999; // 16 digits
  console.log(x); // 999999999999999
  console.log(y); // 10000000000000000
  console.log(z); // 8999999999999999

  // Decimal number accuracy test
  var x = 1.999999999999999;   // 16 digits
  var y = 1.9999999999999999;  // 17 digits
  var z = 11.999999999999999;  // 17 digits
  console.log(x); // 1.999999999999999
  console.log(y); // 2
  console.log(z); // 11.999999999999998

  var x = 16.99999999999999;   // 16 digits
  var y = 161.9999999999999;   // 16 digits
  var z = 161.99999999999999;  // 17 digits
  console.log(x); // 16.99999999999999
  console.log(y); // 161.9999999999999
  console.log(z); // 162
</script>
To get accurate results, keep non-decimal numbers to at most 15 digits and decimal numbers to at most 16 significant digits, because floating-point arithmetic is not always 100% accurate.
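Rather than counting digits by hand, you can check the safe range programmatically. As a quick sketch, `Number.MAX_SAFE_INTEGER` and `Number.isSafeInteger` (added in ES2015) tell you whether an integer can be represented exactly:

```javascript
// The largest integer JavaScript can represent exactly is 2^53 - 1.
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991

// 15-digit integers are always safe; 16-digit ones may not be.
console.log(Number.isSafeInteger(999999999999999));  // true  (15 digits)
console.log(Number.isSafeInteger(9999999999999999)); // false (16 digits, rounds to 10000000000000000)
```

If a value fails this check, any arithmetic on it may silently lose precision.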
<script>
  console.log(typeof Infinity); // number
  console.log(typeof NaN);      // number
</script>
<script>
  console.log(5 / 0);  // Infinity
  console.log(-5 / 0); // -Infinity
</script>
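If you need to guard against division producing `Infinity`, a simple check with the built-in `Number.isFinite` works. A minimal sketch:

```javascript
// Number.isFinite returns false for Infinity, -Infinity, and NaN.
console.log(Number.isFinite(5 / 0)); // false (Infinity)
console.log(Number.isFinite(42));    // true

// Infinity is an ordinary value you can compare against.
console.log(5 / 0 === Infinity);     // true
```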
NaN - Not a Number
<script>
  console.log("Hello" / 4); // NaN
  console.log("Hello" - 4); // NaN
</script>
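One caveat when testing for NaN: it is the only value in JavaScript that is not equal to itself, so an `=== NaN` comparison never works. A short sketch of the reliable checks:

```javascript
var result = "Hello" / 4; // NaN

console.log(result === NaN);        // false — NaN never equals anything, even itself
console.log(Number.isNaN(result));  // true  — strict check, no type coercion
console.log(isNaN("Hello"));        // true  — global isNaN coerces "Hello" to NaN first
console.log(Number.isNaN("Hello")); // false — "Hello" is a string, not the NaN value
```

Prefer `Number.isNaN` when you want to test the value itself rather than what it might coerce to.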
But when you use the + operator with a string operand, JavaScript performs concatenation instead of arithmetic:
<script>
  console.log("Hello" + 4); // Hello4
</script>
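If you actually want addition, convert the string to a number first. A minimal sketch using the built-in `Number` and `parseInt` functions:

```javascript
console.log("5" + 4);               // "54" — + concatenates when either operand is a string
console.log(Number("5") + 4);       // 9   — explicit conversion, then numeric addition
console.log(parseInt("5", 10) + 4); // 9   — parses leading digits in base 10
```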
Numbers As Objects
You can also define numbers as objects with the help of the new keyword. Using numbers as objects is not recommended, since it slows down execution speed.
<script>
  var x = 4;             // number (primitive)
  var y = new Number(4); // object
  console.log(typeof x); // number
  console.log(typeof y); // object
</script>
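Beyond being slower, Number objects also behave surprisingly in comparisons, which is another reason to avoid them. A short sketch of the pitfall:

```javascript
var x = 4;
var y = new Number(4);

console.log(x == y);      // true  — == converts the object to its primitive value
console.log(x === y);     // false — === also compares types: number vs. object
console.log(y.valueOf()); // 4    — valueOf() extracts the primitive number
```

Two separate `new Number(4)` objects are not even loosely equal to each other, since object comparison checks identity, not value.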
What do you think?