I don't think current AI has blown away the Turing test yet, though I suppose it depends on what you mean by the Turing test specifically. But I'm sympathetic to your overall sentiment.
> We need a way to tell the difference between something a human wrote by "mashing up" it's "training data" and a something a computer wrote doing the same thing.
I don't think people really even think that way; they would be confused by the question. This is more of a niche subject of AI research and sci-fi. These concepts are not understood at all by normal people, who say things like:
Why do we need to tell the difference? We know if something is being generated by a machine, unless you hide that fact to cheat on your homework or something. A machine is made by humans, using some wires or a bunch of if-then-else programming or something. It's always going to be different from a human, of course; humans are sentient, machines are not.
If you tell them about the control problem, they just tell you to pull the plug if it does start acting weird. This stuff is very challenging to communicate to people who aren't already reading LessWrong.
Reading LessWrong? I've never read LessWrong, and I've had a general awareness of the problem for a really long time. I guess AI boxing (and why it actually does not work) somehow got into physical books I read some time around 2010 or earlier. Even just watching 2001 might get you thinking about the ways "just pulling the plug" won't work, let alone stuff like Colossus: The Forbin Project.