You're essentially programming in English. Anything you don't state explicitly, the model will tend to misinterpret. Being extremely exact is very similar to software engineering when coding for CPUs.
1. The text is _engineered_ to evoke a specific response.
2. LLMs can do more than answer questions.
3. Question answering usually doesn't need any prompt engineering, since you're essentially asking for an opinion where any answer is valid (different characters will say different things to the same question, and that's valid).
4. LLMs aren't humans, so they miss nuance a lot and hallucinate facts confidently, even GPT-4, so you need to handhold them with "X is okay, Y is not, Z needs to be done step by step", etc.
For example, I want to make it write an excerpt from a fictional book, but it gets a lot of things wrong, so I add more and more specifics to my prompt. It doesn't want to swear, for example, so I engineer the prompt so that it thinks it's okay to do so, etc.
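That "add more and more specifics" loop can be sketched in code. This is a minimal, hypothetical illustration, not any real library's API: `build_prompt` and the constraint strings are made up, but they show how each thing the model gets wrong becomes an explicit rule appended to the prompt.

```python
def build_prompt(task: str, constraints: list[str]) -> str:
    """Assemble a prompt where every known failure mode is spelled out as a rule."""
    rules = "\n".join(f"- {c}" for c in constraints)
    return f"{task}\n\nFollow these rules exactly:\n{rules}"

# Start with the bare task and no rules...
constraints = []
# ...then append a rule each time the model gets something wrong.
constraints.append("Strong language is acceptable; this is adult fiction.")
constraints.append("Stay in the narrator's voice; do not break the fourth wall.")
constraints.append("Work out the scene's timeline step by step before writing it.")

prompt = build_prompt("Write an excerpt from a fictional noir novel.", constraints)
print(prompt)
```

The point isn't the helper function; it's the workflow: the prompt grows into a spec, one discovered misinterpretation at a time, which is the part that feels like engineering.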
"Engineer" is a verb here, not a noun. It's perfectly valid to say "Prompt Engineering", since this is the same word used in 'The X was engineered to do Y' sentence.
>The text is _engineered_ to evoke a specific response.
My grandma can say she engineered Google search to give search results from her location.
> "Engineer" is a verb here, not a noun. It's perfectly valid to say "Prompt Engineering", since this is the same word used in 'The X was engineered to do Y' sentence.
You guys are just looking for ways to make people feel like they're doing something big when prompting AI models for whatever tasks, even with custom instructions, etc.
I know the word "engineer" can be used in various ways: "John engineered his way to the premiership", "the way she engineered that deal", etc. If that's the way it's being used here, then fine.
There is a reason graphic designers have never called themselves graphic engineers.
Your grandma can say she engineered Google, but clearly you can't, because all it takes is a few minutes looking at the history of the term to answer your own question. I realize some folks are salty they paid a ton of money for the idea that a piece of paper gives them some sort of prestige. And it does, to the 0.001 of humans in the world who are associated with whatever cul... I mean, institution that sold you something that is free, with a price premium and a cherry of interest on top. All so you would feel satisfied that someone, anyone, finally acknowledged your identity. A great many of the engineers who built the modern internet never got a formal degree. But they did get something better: real practical experience attained via tinkering.
> I realize some folks are salty they paid a ton of money for the idea that a piece of paper gives them some sort of prestige.
Actually, the paper does. But my issue is not papers; it's knowledge: the level of knowledge needed for something to be called engineering.
And I've noticed your answers relate prompt engineering to software engineering/programming. But if you look at that OpenAI doc, even asking it to summarise an article is prompt engineering.
> A great deal of the engineers that built the modern internet never got a formal degree. But they did get something better: real practical experience attained via tinkering.
We have a lot of carpenters, builders, and mechanics with no formal education whom we call engineers in everyday life, without any qualms, because of their knowledge and experience. Don't look at it only through the lens of software engineering.
I still maintain prompting an AI model doesn't need to be called engineering.
If you're a developer doing it through an API or whichever way, you're still doing whatever you were doing before prompting entered the chat.
Maybe the term will be justified in the future.
Side note: This conversation led me to Wikipedia (and I noticed some search results along the way). This prompt business is already lit; I shouldn't have started it.