> JSON numbers, just like all human readable formats, are decimal...
All JSON numbers are implemented as integers or floating point, and as a result have to be converted to a decimal (a decimal type is generally something that meets this specification: http://speleotrove.com/decimal/) when you import them.
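To make that concrete, here's a minimal Python sketch (the value 1.1 is just an illustrative example): if the parser has already collapsed the digits into a binary double, converting to decimal afterwards can't recover them, so the conversion has to happen at import time.

```python
from decimal import Decimal

# Converting to decimal *after* the parser has produced a binary double
# is too late: the double's representation error is already baked in.
print(Decimal(1.1))    # Decimal('1.100000000000000088817841970012523233890533447265625')

# Converting from the original digit string preserves the value exactly.
print(Decimal("1.1"))  # Decimal('1.1')
```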
Decimal types differ from floating point types in three ways: they are exact, they follow explicit rounding rules, and they have a defined precision. Decimal math is slower, can have greater precision, and is better suited to domains where exact results are needed. Floating point is faster but less precise, so it's good for some scientific uses... or anywhere perfect precision isn't important but speed is... say 3D graphics.
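A quick Python illustration of the difference (Python's decimal module implements roughly the specification linked above): decimal arithmetic runs under an explicit context with a precision and a rounding mode, while binary floats just do whatever IEEE 754 hands them.

```python
from decimal import Decimal, getcontext, ROUND_DOWN

# Floats silently carry binary representation error:
print(0.1 + 0.2)                         # 0.30000000000000004

# Decimals are exact for decimal inputs, and rounding/precision
# are explicit, per-context settings:
getcontext().prec = 4                    # 4 significant digits
getcontext().rounding = ROUND_DOWN
print(Decimal("0.1") + Decimal("0.2"))   # Decimal('0.3')
print(Decimal(1) / Decimal(3))           # Decimal('0.3333')
```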
I've billed lots of hours over the years fixing code where a developer used floats where they should have used decimals. For example, if you are dealing with money, you probably want decimal. It's one of those problems like trying to parse email addresses with a regex or rolling your own crypto... it will kinda work until someone finds out it really doesn't (think accounting asking: our numbers are off by random amounts, WTF?).
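For the money case, here's a contrived invoice in Python showing the failure mode (the amounts and tax rate are made up for illustration):

```python
from decimal import Decimal, ROUND_HALF_UP

# Hypothetical invoice: 3 items at $0.70, plus 7.5% tax.
subtotal = Decimal("0.70") * 3                  # Decimal('2.10'), exact
tax = (subtotal * Decimal("0.075")).quantize(
    Decimal("0.01"), rounding=ROUND_HALF_UP)    # round to whole cents
print(subtotal + tax)                           # Decimal('2.26')

# The float version is already off before tax is even applied:
print(0.70 * 3)                                 # 2.0999999999999996
```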
A binary double can exactly round-trip any decimal value of up to 15 significant digits, so as a serialisation format it's a bit of a non-issue... you just need to convert to decimal and round appropriately before doing any arithmetic where it matters.
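That round-trip property is easy to check in Python; the value below is just an arbitrary 15-digit example.

```python
from decimal import Decimal

s = "123456789.012345"       # 15 significant digits
d = float(s)                 # parse as a binary double
print(format(d, ".15g"))     # '123456789.012345' -- the digits survive
print(Decimal(format(d, ".15g")) == Decimal(s))   # True
```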
And you're confusing JSON the format with typical implementations. Open a JSON file and you see decimal digits. There is no limit to the number of digits in the grammar. Parsing these digits and converting them to binary doubles, for example, is actually slower than parsing them as decimals, because you have to do the latter anyway to accomplish the former. Almost all JSON libraries convert to binary (e.g. doubles) because of the ubiquitous hardware and software support... but some libraries, like RapidJSON, expose raw numeric strings from the parser if you want to plug in a decimal library.
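Python's standard json module is one example of a parser that exposes this hook: parse_float routes the raw digit string into a decimal type instead of a double (the document and price value here are made up for illustration).

```python
import json
from decimal import Decimal

raw = '{"price": 19.9999999999999999}'

# Default: the digit string is collapsed into a binary double.
print(json.loads(raw)["price"])                       # 20.0

# With parse_float, the raw digits go straight to Decimal instead.
print(json.loads(raw, parse_float=Decimal)["price"])
# Decimal('19.9999999999999999') -- every digit preserved
```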
> And you're confusing JSON the format with typical implementations. Open a JSON file and you see decimal digits. There is no limit to the number of digits in the grammar. Parsing these digits and converting them to binary doubles, for example, is actually slower than parsing them as decimals, because you have to do the latter anyway to accomplish the former.
The JSON spec for numbers: integer or float (implemented as a double precision float). JSON libraries read numbers as double precision floats because that is the correct type for JSON numbers, not for any other reason.
Sure, 99% of decoders convert them to and from binary doubles, but that's purely an implementation choice.