If you are talking about GPT-4, unless you are an insider (which I doubt) you'd have no way of proving it either way, because that info is not public.
> Llama2 doesn’t and outclasses llama.
My point still stands: Llama2 is just one LLM, and we still don't know the distribution of its training set.