The latter. Something like big.js (not an endorsement; my understanding is too weak for that).
Just about every practically used programming environment, from every walk of life, has such a thing in the standard library, and I suppose they are there for good reasons. JavaScript, however, doesn't have one, so you'll have to weigh the extra dependency into your decision.
Go - math/big
Java - BigDecimal
Python - decimal
Ruby - BigDecimal
PostgreSQL - numeric
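To make concrete what these types buy you, here's a sketch using Python's stdlib decimal from the list above (the prices and tax rate are made-up example values):

```python
from decimal import Decimal, ROUND_HALF_UP

# Binary floats can't represent 0.1 exactly, so errors creep in:
print(0.1 + 0.2)        # 0.30000000000000004
print(0.1 + 0.2 == 0.3) # False

# Decimal stores digits in base 10, so money arithmetic stays exact.
# NOTE: construct from strings, not floats, or you inherit the float error.
price = Decimal("19.99")
tax = (price * Decimal("0.0825")).quantize(
    Decimal("0.01"), rounding=ROUND_HALF_UP
)
print(tax)         # 1.65
print(price + tax) # 21.64
```

The `quantize` call is the key move: you state the granularity (cents) and the rounding policy explicitly, instead of letting a binary representation decide for you.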
Interesting mention: decimal in C#, a 128-bit type deemed sufficient for monetary calculations.
Another interesting mention: Oracle, which as I understand it handles just about all of its numbers this way by default. That might tell you something about its early customer base.
CS history lesson -- don't use rationals for money. You have no business messing with the denominator, so the extra freedom of using arbitrary rationals is just more rope to hang yourself with.