Depends what you mean by decimal. Decimal is a system of notation; does something count as a decimal number if it cannot be written out in decimal notation (in finite time)?
If not, then the two sets have the same cardinality; if yes, then there are more decimal numbers between 0 and 1 than there are integers.
I don't usually use that term, but I take it to mean "a number you write using a decimal point (which, if you're not using e.g. fraction notation, you must)", because that seems to be what people always intend by it.
Everybody experienced writing irrational numbers using decimal notation in school, so those definitely count.
You truly thought I didn't realize that "3.14" is an abbreviated representation of π? Or that I somehow missed years and years of using the "repeating" bar above various decimal representations, or all those "..."s, such that it was plausible I meant the obviously-wrong thing rather than the correct one? This stuff is hammered in throughout US K-12 schooling.
[EDIT] Look, I don't mean to be a dick; performative misreading and plainly-unnecessary "correction" are just two of my least-favorite types of HN post. I probably should have just downvoted the original performative misreading (not yours, the one up-thread) rather than Assumed Good Faith that the original poster genuinely didn't understand what every non-math-nerd means when they say or write "decimal number" (it's the ones you write with a decimal point. It's so very simple; that's why non-math-nerds use that term and not "real number", the definition of which they've long since forgotten. "Well, but you can't actually represent irrationals entirely in decimal notation": great, wonderful, and it has zero bearing on what people mean by it).
You really are being mean about someone trying to help you. Not sure why.
“Decimal numbers” is not a term routinely used by mathematicians (quite distinct from primary and secondary teachers of arithmetic, who are, unfortunately, rarely mathematicians), precisely because of the confusion you, perhaps unwittingly, elicited. If you mean by this phrase all numbers given by (possibly infinite) decimal expansions, then you’re talking about the reals. Some people thought you meant this!
Other people, also quite reasonably, interpret “the Decimal numbers” to mean all numbers that can actually be expressed with (finite) decimal notation, in which case you are talking about (a subset of) the rationals.
It is extremely important, when discussing different sets, to be clear about the difference between these two.
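The finite-notation reading can be made precise: a number has a terminating decimal expansion exactly when, written as a reduced fraction, its denominator has only 2s and 5s as prime factors. A quick illustration (the function name is mine, not from the thread):

```python
# A finite decimal is exactly a rational whose reduced denominator
# factors into 2s and 5s -- a proper subset of the rationals, which
# are themselves a countable sliver of the reals.
from fractions import Fraction

def has_finite_decimal(q: Fraction) -> bool:
    """True iff q terminates when written in decimal notation."""
    d = q.denominator  # Fraction keeps this in lowest terms
    for p in (2, 5):
        while d % p == 0:
            d //= p
    return d == 1

assert has_finite_decimal(Fraction(3, 4))      # 0.75 terminates
assert not has_finite_decimal(Fraction(1, 3))  # 0.333... never ends
```

So 1/3 is rational but not a "finite decimal", and π is neither, which is why the two readings of the phrase pick out very different sets.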
Yes, but you'll see "decimal" more in the wild, and that's what people mean by it. "You write it with a decimal point", and they do usually mean to include the irrationals. So, yes, real numbers, but the reasoning behind their usage is "you write it with a decimal point". I'd bet more people understand "decimal number" used in that sense, than understand "real number".
> Yes never, not in school, not in analysis, and certainly not in numerical analysis.
Weird, I just assumed that was normal in most education systems. I don't know how you'd get a sense of the rough scale of various common irrationals without having some idea what they look like when represented in decimal notation. In US schools, such representations show up no later than when we start seriously working with circles, and they never really stop coming after that. Estimation exercises lean heavily on having some idea of the decimal representation.
> You've proved my point. It's either π or 3.14. Except that the latter is a rational number :)
Never claimed π is 3.14, so no, I didn't prove your point at all. I wrote that it's very well known that it starts that way. When a normal person says "decimal number" they mean to include π, because any usefully-precise decimal representation of it is going to involve a decimal point. At least in the US, they saw it represented as "3.14..." or "3.14159..." or whatever, many, many times in school. To a non-mathematician, it's obviously a "decimal number". They mean "the real numbers" (or perhaps, depending on context, just the reals that aren't whole integers), except that name is harder to remember than the incorrect (but more common and intuitive) "decimal numbers".