
Why not talk to the employee first?


This could be valid, but with something as powerful as ChatGPT, if it is providing huge benefits to the employee's productivity, they are unlikely to dump it based on a co-worker's suggestion. Also, unless managing security is within your roles and responsibilities, this approach would likely turn messy on an interpersonal level. Lastly, the security issue has already happened, so if this is truly a security concern, the security team should know that (a) something is already out there and (b) this could be a widespread problem in the future.

FWIW I don't think the employee should be fired for this. If anything, a company could embrace these new technological advances and provide training on using ChatGPT in a more secure manner (i.e., don't paste your customers' PII into a prompt, etc.).
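
For the sake of illustration, here is a minimal sketch of the kind of guardrail such training might point to: a hypothetical redact() helper that scrubs the most obvious PII patterns out of a prompt before it leaves the building. The regex patterns here are assumptions and only catch the easy cases (emails, US-style phone numbers, SSN-like strings); names and free-form identifiers would slip through, so a real deployment would want a proper PII-detection or DLP tool plus a review step.

    import re

    # Hypothetical patterns for the easy-to-spot PII; real detection needs more than regexes.
    PII_PATTERNS = {
        "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
        "phone": re.compile(r"\b(?:\+?1[-.\s]?)?\(?\d{3}\)?[-.\s]?\d{3}[-.\s]?\d{4}\b"),
        "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    }

    def redact(text: str) -> str:
        """Replace anything matching a known PII pattern with a labeled placeholder."""
        for label, pattern in PII_PATTERNS.items():
            text = pattern.sub(f"[{label.upper()} REDACTED]", text)
        return text

    prompt = "Customer Jane Roe (jane.roe@example.com, 555-867-5309) reports a billing bug."
    print(redact(prompt))
    # -> Customer Jane Roe ([EMAIL REDACTED], [PHONE REDACTED]) reports a billing bug.

The point isn't that this catches everything (it clearly doesn't, the name sails through), it's that a small, enforced step like this is a more productive response than a blanket ban or a quiet firing.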


> if it is providing huge benefits for employee productivity

With this particular employee, using ChatGPT has not increased his productivity or the quality of his work to any noticeable degree.

> I don't think the employee should be fired for this or anything

The problem isn't using the technology. The problem is sharing confidential information with an unapproved entity. That is specifically and clearly spelled out as a firing offense, for pretty obvious reasons.

Even if some people feel that it's an overly tight policy, it is the stated policy, and the company has every right to set and enforce whatever rules it wishes about the use of its own data.




