I'm hoping there will always be a good LLM option, for the following reasons:

1) The Pareto frontier of open LLMs will keep expanding. The breakneck pace of open research/development, combined with techniques like distillation will keep the best open LLMs pretty good, if not the best.

2) The cost of inference will keep going down as software and hardware are optimized. At the extreme, we're looking toward bit-quantized LLMs that run in RAM itself.

Together, these two factors mean a good open LLM alternative should always exist, one without ulterior motives. Now, will people have the hardware to run it? Or will users just put up with ads to use the best LLM? The latter is likely, but you do have a choice.
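For a feel of why bit-quantization shrinks inference cost so dramatically, here's a toy sketch (my own illustration, not any particular model's scheme): collapse float32 weights to 1-bit signs plus a single per-tensor scale, BitNet-style, trading accuracy for a ~32x smaller weight footprint.

```python
import numpy as np

# Toy illustration of 1-bit weight quantization (not a real model's scheme).
rng = np.random.default_rng(0)
w = rng.standard_normal((4, 4)).astype(np.float32)  # stand-in for model weights

scale = np.abs(w).mean()   # one float kept per tensor
w_1bit = np.sign(w)        # each weight collapses to -1 or +1 (1 bit)
w_approx = w_1bit * scale  # "dequantized" on the fly at inference time

# ~32x less weight storage, at some accuracy cost:
print(f"mean abs error: {np.abs(w - w_approx).mean():.3f}")
```

The matmuls then reduce to additions/subtractions plus one scalar multiply, which is what makes in-RAM (or in-memory-compute) inference plausible.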
