> It's a pet peeve of mine that we get these kinds of articles without a baseline established of how people do on the same measure

I don’t have a personal human news summarizer?

The comparison is between a human reading the primary source and the same human reading LLM hallucinations mixed with the LLM's retelling of the primary source.

> The cynic in me wants another question answered too: How often do reporters misrepresent the news?

The fact that you mark as cynical a question that has been answered pretty reliably for most countries sort of tanks the point.


> I don’t have a personal human news summarizer?

Not a personal one. You do however have reporters sitting between you and the source material a lot of the time, and sometimes multiple levels of reporters playing games of telephone with it.

> The comparison is between a human reading the primary source and the same human reading LLM hallucinations mixed with the LLM's retelling of the primary source.

In modern news reporting, a fairly substantial proportion of what we digest is not primary sources. It's not at all clear whether an LLM summarising primary sources would be better or worse than reading a reporter passing on primary sources. And in fact, in many cases the news is not even a secondary source: e.g. a wire service report on primary sources getting rewritten by a reporter is not uncommon.

> The fact that you mark as cynical a question that has been answered pretty reliably for most countries sort of tanks the point.

It's a cynical point, within the context of this article, to say that reporting on the accuracy of AI in isolation is meaningless because it's not clear that human reporting is any better for us. I find it kinda funny that you dismiss this here, after having downplayed earlier in your reply the games of telephone that news reporting often is, thereby making it quite clear I am in fact being a lot more cynical about this than you are.


> You do however have reporters sitting between you and the source material a lot of the time

In cases where a reporter is just summarising e.g. a court case, sure. Stock market news has been automated since the 2000s.

More broadly, AI assistants summarising news content may sometimes reference a court case directly. But they often don't. And even when they can, that covers only a small fraction of the news; for much of the rest, the AI will need to rely on reporters detailing the primary sources they're interfacing with.

Reporter error is somewhat orthogonal to AI assistants' accuracy.


> Reporter error is somewhat orthogonal to AI assistants' accuracy.

It is not at all. Journalists are wrong all the time, but you still treat news like a record and not a sample. In fact I'd put money that AI mischaracterizes events at a LOWER rate than journalists do: narratives shift over time, and journalists are more likely to succumb to this shift.


> Journalists are wrong all the time, but you still treat news like a record and not a sample

Straw man. Everyone educated constantly argues over sourcing.

> I'd put money that AI mischaracterizes events at a LOWER rate than journalists do

Maybe it does. But an AI sourcing journalists is demonstrably worse. Source: TFA.

> narratives shift over time, and journalists are more likely to succumb to this shift

Lol, we’ve already forgotten about MechaHitler.

At the end of the day, a lot of people consume news to be entertained. They're better served by AI. The risk is that folks of consequence start doing that too, at which point I suppose the system self-resolves by making them, in the long run, of no consequence compared to those who own and control the AI.


> I don’t have a personal human news summarizer?

Is this not the editorial board and the journalists? I'm not sure what the gripe is here.
