- No, even if you include all the decimals which can be individually described in any notation whatsoever.
You can disregard all the arguments in the other comments about whether "decimals" includes fractions like 1/7 using decimal repeat notation, or irrationals described by a formula like sqrt(2), or transcendentals from mathematical definitions like pi and e.
Those are interesting and deep rabbit holes, but they don't change the answer to your question: it is still "no" with all of those included, even with every possible definition that can be written in any symbolic language. This is because the set of all definitions that can be written down can be enumerated systematically in a list, and so mapped 1:1 to the integers.
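To make that enumeration concrete, here is a small Python sketch (the three-letter alphabet is just an illustration; any real symbolic language also has a finite alphabet, so the same argument applies). It lists every finite string, shortest first, which pairs each possible "description" with exactly one non-negative integer:

```python
from itertools import count, product, islice

# Hypothetical tiny "symbolic language": every description is a
# finite string over this alphabet.
ALPHABET = "abc"

def descriptions():
    """Yield every finite string over ALPHABET, shortest first.

    Each string appears at exactly one position in the stream, so
    the set of all finite descriptions maps 1:1 to the integers
    0, 1, 2, ... (i.e. it is countable)."""
    for length in count(1):
        for chars in product(ALPHABET, repeat=length):
            yield "".join(chars)

# The first few entries of the integer -> description mapping:
print(list(enumerate(islice(descriptions(), 6))))
# → [(0, 'a'), (1, 'b'), (2, 'c'), (3, 'aa'), (4, 'ab'), (5, 'ac')]
```

Since the reals in 0 to 1 are uncountable (Cantor's diagonal argument) and this list is countable, almost all reals must be left out of it.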
- Yes, if you include all the other numbers in the range 0 to 1 which are not ones you can individually describe. Most real numbers in the range 0 to 1 are actually this type of "individually undescribable". But I can't point out an individual one, of course.
The "real numbers" contain these. They are present due to a consequence of logic that keeps regular math simpler and more consistent than it would be otherwise.
(Aside: the question of whether 0.9(repeating) equals 1 is an example of choosing the simpler and more consistent logic. Since 1/3 = 0.3(repeating) and 1/3 times 3 is 1, 0.9(repeating) must be defined as equal to 1, or fractions wouldn't be consistent with decimals...)
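As a quick check of that consistency requirement, Python's standard exact-rational type (used here purely as an illustration) confirms that 1/3 times 3 is exactly 1, with no rounding involved:

```python
from fractions import Fraction

one_third = Fraction(1, 3)    # the exact value written 0.333...(repeating)
print(one_third * 3)          # exact arithmetic, prints: 1
print(one_third * 3 == 1)     # prints: True
```

So if 1/3 is identified with 0.3(repeating), then 0.9(repeating) has to equal 1 for the two notations to agree.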
But the rationals (fractions), the algebraic numbers (solutions to polynomials with integer coefficients, such as square roots), the computables (numbers you can define by any algorithm), even some types of uncomputables (such as all of Chaitin's constants for all enumeration rules), and all individually definable transcendental numbers like pi/4 and e/3: none of these sets contains the individually undescribable numbers.
It follows that all the "individually undescribables" in the real numbers can only, conceptually, be imagined as infinitely long decimals with no repeats and no pattern to the digits definable by a finite-length rule in any language. You obviously can't write one of them down; you can only conceptualise what one already written down might look like. (For example, a spiral of digits of ever decreasing size would fit one in a finite area.) And we can only reason about them as a set, by logical construction.
If you were to pick a random real number uniformly (i.e. fairly) from the range 0 to 1 by picking a sequence of random decimal digits, it would be one of these infinitely long decimals with probability 1. Because simple random values from a continuous range are like this, perhaps this explains why they are actually a natural and not unreasonable concept.
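That digit-picking procedure can be sketched in Python (the helper name `random_digit_prefix` is my own, for illustration). The key point is that any program can only ever reveal a finite prefix of the digit sequence, while the number itself continues forever:

```python
import random

def random_digit_prefix(n, seed=None):
    """Return the first n decimal digits of one uniform random real
    drawn from the range [0, 1).

    Conceptually the digit sequence never ends; we can only ever
    see a finite prefix of it."""
    rng = random.Random(seed)
    return [rng.randrange(10) for _ in range(n)]

# A finite glimpse of a number that, with probability 1, has no
# repeat, no pattern, and no finite defining rule.
prefix = random_digit_prefix(20, seed=42)
print("0." + "".join(map(str, prefix)) + "...")
```

Every finite prefix is consistent with uncountably many different reals, which is one way to see why no finite description can pin down the typical outcome.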
So the answer depends on whether your meaning of "decimal numbers" means the "real numbers" in the continuous range 0 to 1, or just certain ways of writing numbers. From the other comments, evidently some people include all sorts of things in their idea of "decimals", including 1/3 = 0.333...(repeating) and the exact value of pi/4 for example, not just finite strings of digits. Other people think of "decimals" as being only strings of digits you can write down, so they would not include the exact value of pi/4. These two meanings of "decimal numbers" give different answers to your question.