
OpenAI came up with the term Generative Pre-Training in the paper introducing what is now called GPT-1. I'm not arguing that they should own a trademark for it, but saying they are "hijacking" the term is disingenuous.



No, they took a pre-existing term, "Generative Pretraining" [0], and applied it to Transformers (a Google innovation [1]) to get Generative Pretrained Transformers [2]. Even if you look at the full name, that paper doesn't use "GPT" or "Generative Pretrained Transformers" at all as far as I can tell; this commenter [3] claims that the name was first used in the BERT paper.

[0] See this 2012 example: http://cs224d.stanford.edu/papers/maas_paper.pdf

[1] https://proceedings.neurips.cc/paper/2017/file/3f5ee243547de...

[2] https://cdn.openai.com/research-covers/language-unsupervised...

[3] https://news.ycombinator.com/item?id=39381802


It's a pretty dumb acronym though since it's doubly redundant.

The "T" is the only descriptive bit. The transformer is inherently a generative architecture - a sequence predictor/generator, so "generative" adds nothing to the description. All current neural net models are trained before use, so "pretrained" adds nothing either.

It's like calling a car an MPC - a mobile pre-assembled car.


Descriptive terms cannot be trademarks, for obvious reasons. Corporations are not permitted to own the English language.

https://www.uspto.gov/trademarks/basics/strong-trademarks


Shilling for a multi-billion-dollar corporation makes you look like a fool, and even more so when they are clearly on the wrong side of the argument here.



