jibal's comments

People who are paying for it to happen ... which is the standard mode of operation for this administration. This is just one of many such disasters they are generating.

The only reason to think that is not knowing when Google switched to using LLMs. The radical change is well documented.

You missed the point ... going to distant galaxies is physically impossible.

> Where you change your entire system's resonant frequency, to match what exists in the distant galaxy.

This collection of words does not describe a physical reality.


Consciousness is a physical phenomenon; rainbows, their ends, and pots of gold at them are not.

> Consciousness is a physical phenomenon

This can mean one of 50 different physicalist frameworks. And only 55% of philosophers of mind accept or lean towards physicalism:

https://survey2020.philpeople.org/survey/results/4874?aos=16

> rainbows, their ends, and pots of gold at them are not

It's an analogy. Someone sees a rainbow and assumes there might be a pot of gold at the end of it, so they think if there were more rainbows, there would be a greater likelihood of a pot of gold (or more pots of gold).

Someone sees computing and assumes consciousness is at the end of it, so they think if there were more computing, there would be a greater likelihood of consciousness.

But just like the pot of gold, that might be a false assumption. After all, even under physicalism, there is a variety of ideas, some of which would say more computing will not yield consciousness.

Personally, I think even if computing as we know it can't yield consciousness, that would just result in changing "computing as we know it" and end up with attempts to make computers with wetware, literal neurons (which I think is already being attempted).


I'm well aware that many people are wrong about consciousness and have been misled by Searle, Chalmers, Nagel, et al. Numbers like 55% are argumentum ad populum and are completely irrelevant. The sample space matters ... I've been to the "[Towards a] Science of Consciousness" conferences and they are full of cranks and loony tunes, and even among respectable, intelligent philosophers of mind there is little knowledge or understanding of neuroscience, often proudly so. These philosophers should read Arthur Danto's introduction to C.L. Hardin's "Color for Philosophers". I've partied with David Chalmers--fun guy, very bright, but he has done huge damage to the field. Roger Penrose likewise--a Nobel Prize-winning physicist, but his knowledge of the brain comes from that imbecile Stuart Hameroff.

The fact remains that consciousness is a physical function of physical brains--collections of molecules--and can definitely be the result of computation. This isn't an "assumption"; it's the result of decades of study and analysis. E.g., people who think that Searle's Chinese Room argument is valid have not read Larry Hauser's PhD thesis ("Searle's Chinese Box: The Chinese Room Argument and Artificial Intelligence") along with a raft of other criticism utterly debunking it (including arguments from Chalmers).

> It's an analogy.

And I pointed out why it's an invalid one -- that was the whole point of my comment.

> But just like the pot of gold, that might be a false assumption.

But it's not at all "just like the pot of gold". Rainbows are perceptual phenomena, their perceived location changes when the observer moves, they don't have "ends", and there certainly aren't any pots of gold associated with them--we know for a fact that these are "false assumptions"--assumptions that no one makes except perhaps young children. This is radically different from consciousness and computation, even if it were the case that somehow one could not get consciousness from computation. Equating or analogizing them this way is grossly intellectually dishonest.

> Someone sees computing and assumes consciousness is at the end of it, so they think if there were more computing, there would be a greater likelihood of consciousness.

Utter nonsense.


> The fact remains that consciousness is a physical function of physical brains--collections of molecules--and can definitely be the result of computation

Ok, so are LLMs conscious? And if not, what’s the difference between them and a human brain - what distinguishes a non-conscious machine from a conscious entity? And if consciousness is a consequence of computation, what causes the qualitative change from blind, machine-like execution of instructions? How would such a shift in the fundamental nature of mechanical computation even be possible?

No neuroscientist currently knows the answer to this, and neither do you. That’s a direct manifestation of the hard problem of consciousness.

> I've partied with David Chalmers

Sadly, not at an intellectual level.


There's no path from LLMs to AGI.

> spinning up chip fabs that much easier

AI already accounts for 92% of U.S. GDP growth. This is a path to disaster.


What "gets better"? Rapid global warming will lead to societal collapse this century.

The 95th percentile IQ is 125, which is about average in my circle. (Several of my friends are verified triple nines.)
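
For reference on those figures, a quick sketch of the arithmetic, assuming the standard IQ scoring convention (normal distribution with mean 100 and SD 15) and reading "triple nine" as the 99.9th percentile:

    from statistics import NormalDist

    # IQ scores are conventionally normed to a normal distribution
    # with mean 100 and standard deviation 15.
    iq = NormalDist(mu=100, sigma=15)

    p95 = iq.inv_cdf(0.95)    # 95th percentile: ~124.7, i.e. roughly 125
    p999 = iq.inv_cdf(0.999)  # 99.9th percentile ("triple nine"): ~146.4

    print(f"95th percentile: {p95:.1f}")
    print(f"99.9th percentile: {p999:.1f}")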

Yeah, that comment is heavy on survivor bias. The universal theme is that things go the way they go.

This is why TFA used Segway as an example.

Lots of things might be nice when the expenditure accounts for 92% of GDP growth.

Nothing is free, especially not AI, which accounted for 92% of U.S. GDP growth in the first half of 2025.
