Depends on how it's grown. If it's a black box that keeps improving by means its developers don't understand, then maybe so. If we manage to decode what motivation means for this hypothetical AGI, and remain in control of it, then maybe not.
There's nothing that says an ego is an essential element of a mind, or an id, or any of the other parts of a human psyche. That's just how our brains evolved, living in societies over millions of years.
Wealth isn't the same thing to all people; wealth as humans define it isn't necessarily going to be what a superintelligence values.
The speed difference between transistors and synapses is the difference between marathon runners and continental drift; why would an ASI care about dollars or statues or shares or apartments any more than we care about changes to individual peaks in the mid-Atlantic ridge, or about how much sand covers any given peak in the Sahara?
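As a rough sanity check on the ratios behind that analogy, here is a sketch using assumed order-of-magnitude figures (GHz-scale transistor switching, roughly kHz-scale synaptic signalling, a ~3 m/s runner, ~2.5 cm/year drift; none of these are measurements):

```python
# Rough sanity check of the "runners vs continental drift" analogy.
# All numbers below are order-of-magnitude assumptions.

SECONDS_PER_YEAR = 3.15e7

transistor_hz = 1e9            # assumed: ~GHz switching
synapse_hz = 1e3               # assumed: ~kHz signalling at best

runner_m_per_s = 3.0                       # assumed marathon pace
drift_m_per_s = 0.025 / SECONDS_PER_YEAR   # assumed ~2.5 cm/year

hardware_ratio = transistor_hz / synapse_hz
analogy_ratio = runner_m_per_s / drift_m_per_s

print(f"transistor/synapse speed ratio: ~{hardware_ratio:.0e}")
print(f"runner/drift speed ratio:       ~{analogy_ratio:.0e}")
```

The two ratios come out a few orders of magnitude apart, so the analogy is loose, but both gaps are vast enough that the rhetorical point survives.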
Wealth doesn't have to mean the same thing to everyone for someone to care about it. That's evident already: some people care about wealth and others don't.
What does the speed difference of transistors have to do with anything? Transistors pale in comparison to synapses in interconnection density, and in any case neither speed nor density has anything to do with wealth either...
Everything you and I consider valuable is a fixed background from the point of view of a mind whose sole difference from ours is that speedup.
I only see them valuing that if they're also extremely neophobic, in a way that, scaled to a human, would look like thinking that "fire" and "talking" are dangerously modern.
> Transistors pale in comparison to the interconnection density of synapses
Not so. Transistors are also smaller than synapses, by about the degree to which marathon runners are smaller than hills.
Even allowing extra space for interconnections, and cheating in favour of biology by assuming an M1 chip is a full millimetre thick rather than just the few nanometres of its transistor layers, it still has a better volumetric density than we do.
(Sucks for power and cost relative to us when used to mimic brains, but that's why it hasn't already taken over).
> Everything you and I consider valuable is a fixed background from the point of view of a mind whose sole difference from ours is that speedup.
This is completely made up and I already pointed that out.
> Not so. Transistors are also smaller than synapses, by about the degree to which marathon runners are smaller than hills.
So, brains are connected in 3D; transistors aren't. Transistors don't have interconnection density like brains do, and the gap is orders of magnitude greater than what you point out here.
> Even allowing extra space for interconnections, and cheating in favour of biology by assuming an M1 chip is a full millimetre thick rather than just the few nanometres of its transistor layers, it still has a better volumetric density than we do.
Brains have more interconnection density than chips do, by orders of magnitude. This is all completely beside the point, as it has nothing to do with why people value things or why an AI would or wouldn't.
> This is all completely beside the point, as it has nothing to do with why people value things or why an AI would or wouldn't.
You already answered that yourself: it's all made up.
Given it's all made up, nothing will cause them to value what we value unless we actively cause that valuation to happen, which is the rallying cry for people like Yudkowsky who fear AGI takeover.
And even then, anything you forget to include in the artificial values you give the AI is lost forever, because an AI is necessarily a powerful optimiser for whatever it was made to optimise, and optimisation always damages whatever isn't being explicitly preserved, even when the agents are humans.
> Transistors don't have interconnection density like brains do.
Only limit is the heat. They're already packed way tighter than synapses. An entire Intel 8080 processor made with a SOTA litho process would be smaller than just the footprint of the soma of the smallest neuron.
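That size comparison can be roughed out numerically. The sketch below assumes ~6,000 transistors in an 8080, a modern logic density of ~1.38e8 transistors/mm², and a small soma ~8 µm in diameter; all three figures are assumptions, so treat it as an order-of-magnitude illustration rather than a measurement:

```python
# Order-of-magnitude check: a whole 8080 at modern density vs a
# small neuron's soma footprint. All figures are assumptions.
import math

transistors_8080 = 6000           # assumed 8080 transistor count
density_per_mm2 = 1.38e8          # assumed 5nm-class logic density

chip_area_mm2 = transistors_8080 / density_per_mm2
chip_area_um2 = chip_area_mm2 * 1e6   # 1 mm^2 = 1e6 um^2

soma_diameter_um = 8.0            # assumed small (e.g. granule-cell) soma
soma_area_um2 = math.pi * (soma_diameter_um / 2) ** 2

print(f"8080 at assumed density: ~{chip_area_um2:.0f} um^2")
print(f"small soma footprint:    ~{soma_area_um2:.0f} um^2")
```

With these particular numbers the chip comes in just under the soma footprint, so the comparison is close rather than lopsided and depends on which neuron and which process node you pick.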