I’ve also had my writing misidentified as LLM-produced on multiple occasions in the last month. Personally, I don’t really care whether a piece of writing was generated by AI if it contains solid arguments and reasoning, but when you haven’t used generative AI to produce something, it’s a strange accusation to respond to.
Before GPT-3 existed, I often received positive feedback on my writing; now it’s quite the opposite.
I’m not sure whether these accusations of AI generation come from genuine belief (and overconfidence) or some bizarre ploy for standing/internet points. Usually these claims of detecting AI generation get bolstered by others who also claim to be more observant than the average person. You know they’re wrong when you wrote the piece yourself, but it’s not really something you can prove.