Hacker News

Llama 2 doesn't, and it outclasses Llama. I believe GPT was trained in the same manner.



> I believe GPT was trained in the same manner

If you are talking about GPT-4: unless you are an insider (which I doubt), you'd have no way of proving it either way, because that information is not public.

> Llama 2 doesn't, and it outclasses Llama.

My point still stands: Llama 2 is just one LLM, and we still don't know the distribution of its training set.



