
I don't understand why they thought it was a good idea to use an obscure academic acronym for customer facing products. I guess they didn't expect their "API demo" to blow up the way it did and are now stuck with the name.



Isn't "Generative Pre-Trained Transformer" an obscure acronym that openai came up with though?

I know OpenAI didn't invent the transformer, but "Attention Is All You Need" is about translation; it doesn't mention the word "generative".


> Isn't "Generative Pre-Trained Transformer" an obscure acronym that OpenAI came up with, though?

Nope, they didn't coin the acronym. First appearance of OpenAI GPT was in the BERT paper by Devlin et al. 2018.


But that class of models has been known as generative for years even before the attention paper.


I think that's a weak reason to deny a trademark, given WIMP interfaces were well known before MS Windows.


Yep, and had they applied for a trademark back then, it would very likely have been granted.

Like Apple didn't wait for the iPhone 4 to trademark the iPhone brand…

But now the term has been used colloquially for years, so it's out of reach of trademark protection at this point.


Who cares if it's an "obscure academic acronym"? Nobody cares. "Laser" is an "obscure academic acronym" too, and that didn't prevent it from catching on. In ChatGPT's case, just ask around - people remember the name just fine.


> people remember the name just fine.

What about all the people who mistakenly call it ChatGTP? (and yes I have heard this more than once in real life). I assume this is also the origin of the HN user with the same name.


Unlike laser, it's difficult to pronounce and remember.


ChatGPT is not really a customer facing product. It's a crude interface over an LLM, directed more towards developers and tech enthusiasts who will use it to create customer facing products, rather than towards the general public. It only exploded in popularity because of its novelty, but it's far from being generally accessible or useful.


Countless non-techies at my work use ChatGPT. Several members of my family use ChatGPT, and most of them haven't so much as written an Excel formula. My partner and her friends all use ChatGPT, none of them techies. South Park literally had an episode (over a year ago?) whose entire plot was about using ChatGPT to cheat at both performing in AND marking assessments, as well as responding to needy romantic partners. You are completely, incomprehensibly incorrect.


I'm not saying that it's not popular or not used by non-technical people. I'm saying that its utility as a general consumer product is limited and unclear. Most people don't have a need for a text generator or chatbot, but they would find LLMs useful if they're integrated into other products they already use.

Think of the difference between the Rabbit R1 device and ChatGPT. One is, or is attempting to be, something that makes a concrete difference in people's lives. The other is a glorified tech demo trying to find a use case. I'm not vouching for the Rabbit R1 device, just pointing out the difference between a consumer product and ChatGPT.

Most of the gold rush and buzz around LLMs today is about delivering a consumer product, not about GPT-5 or whatever the next, smarter chatbot is.


The idea of narrowing the use cases is completely unnatural and probably comes from marketing, for projects that are otherwise bad in most regards.

> Most people don't have a need for a text generator or chatbot

I believe you never realized why you’d need it yourself.


Most people I've seen are using it as a replacement for Google search. Its genericness is the killer feature.


I can also say from in-depth professional experience that OpenAI’s “GPT” models, both via ChatGPT and via the API, can be used to assess student performances in a way that correlates very highly with a human judge. So it’s not just people fooling themselves into thinking that ChatGPT is useful.
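
For the curious, the setup is basically prompting with a rubric and parsing a score back out. Here's a rough sketch of the idea in Python; the model name, prompt wording, and 1-5 scale are illustrative assumptions, not our actual protocol:

    # Rough sketch: scoring a student answer against a rubric with an LLM.
    # Model name, prompt wording, and the 1-5 scale are assumptions for illustration.
    from openai import OpenAI

    client = OpenAI()  # expects OPENAI_API_KEY in the environment

    def score_answer(question: str, answer: str, rubric: str) -> str:
        resp = client.chat.completions.create(
            model="gpt-4",   # assumed model; substitute whichever you evaluate
            temperature=0,   # keep grading as repeatable as possible
            messages=[
                {"role": "system",
                 "content": "You are a strict grader. Reply with a single integer from 1 to 5."},
                {"role": "user",
                 "content": f"Rubric:\n{rubric}\n\nQuestion: {question}\n\nAnswer: {answer}\n\nScore:"},
            ],
        )
        return resp.choices[0].message.content.strip()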


People in my circle whom I wouldn't call technical used it before me and still use it. In my circle it correlated more with younger age (early 20s) than with technical ability.


The usage is so widespread already, it's really amazing.


ChatGPT is still the only chat assistant (free of cost, at least? I haven't tried anything paid) that lets me have a conversation where I push a button once and it keeps listening. I say something, it replies, and it goes back to listening mode without me having to press the button again.


I know that was the initial plan, which is why I referred to it as an "API demo", but they now reportedly have hundreds of millions of active users.


My grandmother uses ChatGPT


Haha. It has always been targeted towards consumers. They even removed the `temperature` setting.
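
For what it's worth, the knob is still there if you go through the API rather than the chat UI. A minimal sketch with the official Python client; the model name and prompt are placeholder assumptions:

    # Minimal sketch: setting `temperature` yourself via the API
    # (the consumer ChatGPT UI no longer exposes it).
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumed model; use whatever you have access to
        messages=[{"role": "user", "content": "Name three snowboard brands."}],
        temperature=0.2,        # lower = more deterministic, higher = more varied
    )
    print(resp.choices[0].message.content)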


Not _always_. It was just a demo they chucked online. It very quickly _became_ a consumer app after it got traction, though.


I mean, we can find countless other products in history that this happened with, so I'm not sure what exactly you're trying to measure by that.


It's just that it wasn't something they expected or designed to get mass adoption.


I've been generating images with Stable Diffusion on my own personal computer in the hopes of creating a snowboard design, and the results so far are actually incredible. And we're still in the early days.

Setting up a text-only LLM is just as trivial. And frankly better, because you can train it for yourself, swap out a model, and save it.
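
For anyone curious, the image side really is only a few lines locally with Hugging Face's diffusers library. A rough sketch, assuming an SD 1.5 checkpoint and a CUDA GPU; the model ID and prompt are illustrative:

    # Rough sketch: local text-to-image with Stable Diffusion via diffusers.
    # Checkpoint ID and prompt are illustrative assumptions.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",  # assumed SD 1.5 checkpoint
        torch_dtype=torch.float16,
    )
    pipe = pipe.to("cuda")  # or "cpu" / "mps", just slower

    image = pipe("geometric snowboard topsheet design, bold colors").images[0]
    image.save("snowboard_design.png")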



