Numbers and Math
Master JavaScript's number type — arithmetic operators, rounding, and what happens when math goes wrong.
JavaScript has one number type
Many programming languages distinguish between integers (whole numbers like 42) and floating-point numbers (decimals like 3.14). JavaScript keeps it simple: there's just one `number` type that handles both. Whether you write `10` or `10.5`, JavaScript treats them as the same kind of data.
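You can confirm this yourself with `typeof` — a quick sketch of what the single type looks like in practice:

```javascript
// Both whole numbers and decimals share the one "number" type
console.log(typeof 10);    // "number"
console.log(typeof 10.5);  // "number"

// An integer and its decimal spelling are the exact same value
console.log(10 === 10.0);  // true
```

There's no separate integer type to convert to or from — `10` and `10.0` are literally the same value.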
This simplicity is great for beginners — you don't have to think about which number type to use. But it does come with one quirk: floating-point math can produce tiny rounding errors. For example, `0.1 + 0.2` doesn't equal `0.3` exactly — it equals `0.30000000000000004`. This isn't a JavaScript bug; it's a consequence of how computers store decimals in binary floating point, and the same thing happens in most other languages. For most purposes it doesn't matter, but it's worth knowing about — especially before you compare decimal results with `===`.
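Here's the quirk in action, along with two common ways to work around it — comparing with a small tolerance, and rounding for display:

```javascript
// The classic floating-point surprise
console.log(0.1 + 0.2);          // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);  // false!

// Workaround 1: compare with a tiny tolerance instead of ===
console.log(Math.abs(0.1 + 0.2 - 0.3) < Number.EPSILON); // true

// Workaround 2: round when displaying the result
console.log((0.1 + 0.2).toFixed(2)); // "0.30" (a string)
```

Note that `toFixed` returns a string, not a number — it's meant for formatting output, not for further math.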