
LLMs are much better at answering math when told to take the character of a drunk mathematician



It assumes this character by default. I asked several AI engines (via poe.com, which includes ChatGPT) to compute Galois groups of polynomials like x^5+x+1 and a couple of others, and in each case got not only a wrong answer but completely non-sequitur reasoning.


Just tried your query on GPT-4 preview: https://pastebin.com/6wPPCdBW

I have no expertise in this area, but it looks plausible to me - i.e. "You didn't give me enough info" vs "lol heres some fan fic about math".


This is exactly the problem. It looks plausible. Every sentence makes sense on its own, but together they don't add up. Quote:

> The polynomial given is f(x) = x^5 + x + 1. Since the polynomial has no rational roots (by the Rational Root Theorem) and it is a polynomial with integer coefficients, it is irreducible over the rationals

The polynomial has no rational roots - true. But it's not irreducible. Irreducibility doesn't follow from the absence of rational roots: a quintic can split into a quadratic and a cubic, neither of which contributes a rational (degree-1) factor. Here's the factorization:

x^5 + x + 1 = (x^2 + x + 1)*(x^3 - x^2 + 1).
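Anyone can verify this without trusting the model (or me). A quick sketch with SymPy, assuming you have it installed: the Rational Root Theorem only rules out the candidates ±1, yet the polynomial still factors over Q.

```python
from sympy import symbols, factor

x = symbols('x')
f = x**5 + x + 1

# Rational Root Theorem: the only candidate rational roots are +-1
# (divisors of the constant term over divisors of the leading coefficient).
print(f.subs(x, 1))   # not zero, so 1 is not a root
print(f.subs(x, -1))  # not zero, so -1 is not a root

# No rational roots - and yet it is reducible over the rationals:
print(factor(f))      # (x**2 + x + 1)*(x**3 - x**2 + 1)
```

Multiplying the two factors back out is a one-liner (`expand`) if you want to double-check by hand.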


Thank you for clarifying.

I put your remarks into the same prompt and, after agreeing with you, it essentially devolved into recursive garbage.



