Numbers and Math

Master JavaScript's number type — arithmetic operators, rounding, and what happens when math goes wrong.

Step 1 of 5

JavaScript has one number type

Many programming languages distinguish between integers (whole numbers like 42) and floating-point numbers (decimals like 3.14). JavaScript keeps it simple: there's just one `number` type that handles both. Whether you write `10` or `10.5`, JavaScript treats them as the same kind of data.
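You can check this yourself with `typeof` — a quick sketch of the idea (`7 / 2` is included to show that division doesn't truncate the way integer division does in some other languages):

```javascript
// Integer-looking and decimal-looking literals share one type.
console.log(typeof 10);   // "number"
console.log(typeof 10.5); // "number"

// Division never truncates to a whole number.
console.log(7 / 2);       // 3.5
```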

This simplicity is great for beginners — you don't have to think about which number type to use. But it does come with one quirk: floating-point math can produce tiny rounding errors. For example, `0.1 + 0.2` doesn't equal `0.3` exactly — it equals `0.30000000000000004`. This isn't a JavaScript bug; it's how all computers store decimals in binary. For most purposes, it doesn't matter, but it's worth knowing about.
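You can see the quirk directly, along with two common ways of coping with it. The `nearlyEqual` helper below is just an illustrative name, not a built-in; `Number.EPSILON` and `toFixed` are standard:

```javascript
console.log(0.1 + 0.2);         // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3); // false!

// Option 1: compare with a tiny tolerance instead of ===.
const nearlyEqual = (a, b) => Math.abs(a - b) < Number.EPSILON;
console.log(nearlyEqual(0.1 + 0.2, 0.3)); // true

// Option 2: round for display (note: toFixed returns a string).
console.log((0.1 + 0.2).toFixed(2)); // "0.30"
```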

Think of it this way: JavaScript's number type works like a single pocket calculator — it handles basic arithmetic, rounding, and even signals when something doesn't compute. Unlike a cash register that keeps dollars and cents in separate slots, that calculator treats every value as a point on one continuous number line.
Web Standard
JavaScript numbers follow the IEEE 754 double-precision floating-point standard. This gives you safe integer precision up to 2^53 - 1 (about 9 quadrillion). For everyday web development — calculating prices, scores, dimensions — this is more than enough.
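The safe-integer boundary is exposed as `Number.MAX_SAFE_INTEGER`, and you can watch precision run out just past it — a short sketch:

```javascript
console.log(Number.MAX_SAFE_INTEGER);           // 9007199254740991 (2^53 - 1)
console.log(Number.isSafeInteger(2 ** 53 - 1)); // true
console.log(Number.isSafeInteger(2 ** 53));     // false

// Beyond the safe range, adjacent integers become indistinguishable:
console.log(2 ** 53 + 1 === 2 ** 53);           // true
```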