Realistically, that's the actual headline. Only another AI can replace AI, much like LLMs / Transformers have replaced "old" AI models for certain tasks (NLP, Sentiment Analysis, Translation etc), and research is in progress on other tasks still handled by traditional models (personalization, forecasting, anomaly detection etc).

If there's a better AI, the old AI will lose its job first.

> NLP, Sentiment Analysis, Translation etc

As somebody who got to work adjacent to some of these things for a long time, I've been wondering about this. Are LLMs and transformers actually better than these "old" models, or is it more of an 80/20 thing, where for a lot less work (on the developers' part) LLMs can get 80% of the efficacy of the old models?

I ask because I worked for a company that had a related content engine back in 2008. It was a simple vector database with some bells and whistles. It didn't need a ton of compute, and GPUs certainly weren't what they are today, but it was pretty fast and worked pretty well too. Now it seems like you can get the same thing with a simple query but it takes a lot more coal to make it go. Is it better?
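
For context, the core of that kind of engine is small: represent each document as a vector (TF-IDF back then, embeddings now) and return nearest neighbours by cosine similarity. A rough sketch in Python, where the vectors themselves come from whatever featurizer you used:

    import numpy as np

    def related(query_vec, doc_vecs, k=3):
        """Return indices of the k documents most similar to the query."""
        d = np.asarray(doc_vecs, dtype=float)
        q = np.asarray(query_vec, dtype=float)
        # Normalize so the dot product equals cosine similarity
        d = d / np.linalg.norm(d, axis=1, keepdims=True)
        q = q / np.linalg.norm(q)
        scores = d @ q
        return np.argsort(-scores)[:k]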


It's 80/20, but for some tasks (e.g. translation) it's much better.

Nonetheless, the fact that you can just tweak the prompt to instruct the model to do what you want makes everything much faster.

Yes, the trade-off is that you need GPUs to run it, but that's what the cloud is for.
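
To make the prompt point concrete: with the old stack, sentiment and translation were two separately trained models, whereas with an LLM they're just two different instructions to the same model. A minimal sketch, where `chat` is a hypothetical stand-in for whatever LLM client you actually call:

    # `chat` is hypothetical: any function that takes a prompt string and
    # returns the model's text response will do.
    def sentiment(text, chat):
        return chat("Classify the sentiment of this review as positive, "
                    "negative, or neutral. Reply with one word.\n\n" + text)

    def translate(text, chat, target="French"):
        return chat(f"Translate the following text into {target}. "
                    "Reply with only the translation.\n\n" + text)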


Yep, it's an 80/20 thing. Versatility over quality.
