I don't see this article doing anything to help define intelligence in a useful way.
1) Defining "intelligence" as the ability to "understand" isn't actually defining it at all, unless you have a rigorous definition of what it means to understand. It's basically just punting the definition from one loosely defined concept to another.
2) The word "intelligence", in common usage, is only loosely defined and heavily overloaded; you'll get 10 different definitions if you ask 10 different people. It's too late to change this, since the meaning of words comes from how they are used. If you want to know the various ways the word is used, look in a dictionary - those are literally the meanings of the word. If you want something more precise, then you are not looking for the meaning of the word, but rather trying to redefine it.
3) When we talk about "intelligence" with regard to AI, or AGI, it seems that what people really want is to define a new word, something like "hard-intelligence" - something rigorously defined that would let us definitively say whether, or to what degree, an "intelligent" system (animal or machine) has this property.
Of course, to be useful, this new word "hard-intelligence" needs to be aligned with what people generally mean by "intelligence", and presumably in the future one of the dictionary senses of "intelligence" will be hard-intelligence.
I think the most useful definition of this new word "hard-intelligence" is going to be a functional one - a capability (not a mechanism) of a system that can be objectively tested for, even in a black-box system. However, since the definition should also align with that of "intelligence", which historically refers to an animal/human capability, it seems useful to also consider where this animal capability comes from, so that our definition can encompass it in the most fundamental way possible.
So, with that all said, here's how I would define "hard-intelligence", and why I would define it this way. This post is already getting too long, so I'll keep it brief.
The motivating animal-based consideration for my definition is evolution: what capability do the animals that evolved intelligence (to varying degrees) have that other animals do not, and what survival benefit does it bring that compensates for the huge cost of the large brains found in animals with advanced intelligence?
I consider the essence of evolved animal intelligence to be prediction, meaning that the animal is not restricted to reacting to the present but can also plan for the predicted future, which obviously has massive survival benefit - being able to predict where the food and water will be, how a predator is going to behave, and so on.
The mechanics of how functional prediction has evolved in different animals vary, from something like a fly, whose hard-coded instincts help it avoid predicted swats (that looming visual input predicts I'm about to be swatted by the cow's tail, so I'd better move), all the way up to species like ourselves, where the predictive signals, outcomes, and adaptive behaviors are learned rather than hard-coded. It is widely accepted that our cortex (and its equivalent in birds) is basically a prediction machine, which has evolved under the selection pressure of developing this super-power of being able to see into the future.
So, my definition of "hard-intelligence" is the degree of ability to use, and learn from, past experience to successfully predict the future.
That's it.
There are of course some predictive patterns, and outcomes, that are simple to learn and recognize, and others that are harder, so this is a matter of degree and domain, but at the end of the day it's an objective measure that can be tested for: given the same experiential history to learn from, can different systems correctly predict the continuations of new inputs that follow a similar pattern?
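To make that test concrete, here's a rough Python sketch of the kind of black-box comparison I have in mind. Everything in it is hypothetical illustration (a toy number-sequence domain, made-up names), not a proposed benchmark: each predictor gets the same experiential history, and is scored only on how well it predicts held-out continuations of similar patterns.

    from typing import Callable, Sequence

    # A predictor gets (shared_history, new_prefix) and returns its guess for
    # the next element of the prefix. It may only learn from what it is given.
    Predictor = Callable[[Sequence[int], Sequence[int]], int]

    def score(predictor: Predictor,
              history: Sequence[int],
              test_cases: Sequence[tuple[Sequence[int], int]]) -> float:
        # Fraction of held-out continuations the predictor gets right.
        correct = sum(predictor(history, prefix) == target
                      for prefix, target in test_cases)
        return correct / len(test_cases)

    # Toy domain: arithmetic progressions with a step of +2.
    history = [1, 3, 5, 7, 9, 11]
    test_cases = [([4, 6, 8], 10), ([20, 22, 24], 26), ([7, 9, 11], 13)]

    def naive_repeat(history, prefix):
        # Baseline that only "reacts to the present": repeats the last input.
        return prefix[-1]

    def learned_step(history, prefix):
        # Learns the step size from the shared history and extrapolates.
        step = history[1] - history[0]
        return prefix[-1] + step

    print(score(naive_repeat, history, test_cases))   # 0.0
    print(score(learned_step, history, test_cases))   # 1.0

A real test would obviously need far richer domains than this, but the shape is the same: shared history in, prediction accuracy on novel continuations out, with no need to look inside the system.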
This definition obviously captures the evolutionary super-power of predicting the future, which is at least one of the things intelligent animals can do, but my assertion - on which the utility of this definition of "hard-intelligence" rests - is that prediction is in fact the underlying mechanism of everything we consider to be "intelligent" behavior. For example, reasoning and planning are nothing more than predicting the outcomes of a sequence of hypothetical what-ifs.
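To illustrate that last claim (again, just a toy sketch with made-up names, not how a real planner would be built): the "planner" below has no planning-specific machinery at all - it just chains a predictive model over hypothetical action sequences and keeps the one whose predicted outcome is best.

    from itertools import product

    def predict(state: int, action: int) -> int:
        # Stand-in predictive model: in this toy world, an action just adds to the state.
        return state + action

    def plan(state: int, actions: list[int], goal: int, depth: int = 3) -> tuple[int, ...]:
        # Try every hypothetical what-if sequence, predict where each one leads,
        # and return the sequence whose predicted end state is closest to the goal.
        best_seq, best_dist = (), float("inf")
        for seq in product(actions, repeat=depth):
            s = state
            for a in seq:                 # chain predictions forward in time
                s = predict(s, a)
            if abs(goal - s) < best_dist:
                best_seq, best_dist = seq, abs(goal - s)
        return best_seq

    print(plan(state=0, actions=[-1, 1, 2], goal=5))  # (1, 2, 2)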
tl;dr - "intelligence" is too fuzzy a concept to be useful. We need a new, rigorously defined word to discuss and measure machine (and animal) intelligence. I have suggested a definition.