> Possibly I don’t know how this all works, but I think if the host of a ChatGPT interface were willing to provide their own API key (and pay), they could then provide a “service” to others (and collect all input).
Well, GP was referring to blocking ChatGPT as a federal contractor. I suspect that as a federal contractor, they are also vetting other people that they share data with, not just blocking ChatGPT as a one-off thing. I mean, generic federal data isn’t as tightly regulated as, say, HIPAA PHI (having spent quite a lot of time working for a place that handles both), but there are externally-imposed rules and consequences, unlike simple internal-proprietary data.
But it really seems like a cat-and-mouse game. For example, a very determined bad actor could infiltrate some lesser approved government contractor and provide an additional interface/API that would invite exactly this kind of information leaking, and possibly nobody would notice for a long time.
And then they could face the death penalty for espionage if they leaked sensitive enough data. You would have to be really stupid to build such a service for government contractors unless you actually are a foreign spy.