On: GPT4 Has 1T Parameters

euclaise on March 25, 2023
I don't trust this. The article cites Semafor (https://www.semafor.com/article/03/24/2023/the-secret-histor...), but Semafor states the 1T parameter count without citing any source.
QuadrupleA on March 25, 2023
Yeah, seems spotty, especially considering the recent "Chinchilla scaling" laws suggesting training-set size is generally the current bottleneck, the mileage LLaMA/Alpaca get out of 7B/13B parameters, the huge inference cost of a 1T model, etc.
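For context on that comment, here is a minimal sketch of the Chinchilla-style back-of-envelope arithmetic it alludes to. The ~20-tokens-per-parameter ratio and the C ≈ 6·N·D compute estimate are common approximations from the scaling-law literature, not figures from this thread:

```python
# Rough Chinchilla-style heuristics (approximations, not exact results):
# training compute C ~= 6 * N * D FLOPs, and compute-optimal training
# uses roughly 20 tokens per parameter.

def chinchilla_optimal_tokens(n_params: float) -> float:
    """Approximate compute-optimal training-token count for a model of n_params."""
    return 20.0 * n_params

def training_flops(n_params: float, n_tokens: float) -> float:
    """Standard C ~= 6*N*D estimate of total training compute in FLOPs."""
    return 6.0 * n_params * n_tokens

# Under this heuristic a 1T-parameter model would "want" ~20T training tokens,
# which is the kind of data bottleneck the comment above is pointing at.
print(chinchilla_optimal_tokens(1e12))  # 2e13 tokens
print(training_flops(1e12, 2e13))       # 1.2e26 FLOPs
```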
brianjking on March 25, 2023
Yeah, I'm highly suspicious too. Even the arXiv paper from the MS researchers doesn't give specifics about the number of parameters in GPT-4.
lambo4bkfast on March 26, 2023
Sam Altman said it had 1T parameters on the Lex Fridman podcast.