
Ideally secrets never leave secure enclaves and humans at the organization can't even access them.
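
For illustration, a toy sketch of that model: a broker holds the key and only exposes operations on it, so callers can use the secret without ever seeing it. The class name and key below are made up; a real deployment would be an HSM, secure enclave, or cloud KMS rather than a Python object.

    import hashlib
    import hmac

    class SecretBroker:
        """Holds a secret and exposes only operations on it, never the value."""

        def __init__(self, secret: bytes):
            self._secret = secret  # stays inside the broker/enclave boundary

        def sign(self, message: bytes) -> str:
            # Callers get signatures, never the key material itself.
            return hmac.new(self._secret, message, hashlib.sha256).hexdigest()

    broker = SecretBroker(b"never-exported-key")  # hypothetical key material
    print(broker.sign(b"deploy build 1234"))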

It's totally insane to send them to a remote service controlled by another organization.



Essentially, it’s straddling two extremes:

1) employees are trusted with secrets, so we have to audit that employees are treating those secrets securely (via tracking, monitoring, etc)

2) we don’t allow employees to have access to secrets whatsoever, therefore we don’t need any auditing or monitoring


employees are trusted with secrets, so we have to audit that employees are treating those secrets securely

IMHO, needing to be monitored constantly is not being "trusted" in any sense of the word.


I can trust you enough to let you borrow my car and not crash it, but still want to know where my car is with an Airtag.

Similarly employees can be trusted enough with access to prod, while the company wants to protect itself from someone getting phished or from running the wrong "curl | bash" command, so the company doesn't get pwned.


Exporting secrets to a SIEM does not correspond to either of those extremes. It's stupidity, and it makes auditing worse.


SIEM = Security Information & Event Management

Factually, it is necessary for auditing and absolutely correlates with the extreme of needing to monitor the “usage” of “secrets”.

In a highly auditable/“secure” environment, you can’t give secrets to employees with no tracking of when the secrets are used.


That's far from factual, and you are making things up. You don't need to send the actual keys to a SIEM service to monitor the usage of those secrets. You can compute a cryptographic hash and send that instead. And you definitely don't need to dump all the env values and send them wholesale.
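
As a rough sketch of that idea (the event shape and the siem.example.com endpoint are made up for illustration): report a fingerprint of the secret to the SIEM, never the value itself.

    import hashlib
    import json
    import time
    import urllib.request

    def fingerprint(secret: str) -> str:
        # A one-way hash identifies which credential was used without revealing it.
        return hashlib.sha256(secret.encode()).hexdigest()[:16]

    def report_usage(secret: str, action: str, user: str) -> None:
        event = {
            "ts": time.time(),
            "user": user,
            "action": action,                  # e.g. "deployed service X"
            "secret_id": fingerprint(secret),  # fingerprint only, never the value
        }
        req = urllib.request.Request(
            "https://siem.example.com/events",  # hypothetical collector endpoint
            data=json.dumps(event).encode(),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)

(This only works because API keys are high-entropy; hashing a weak human password would still be brute-forceable.)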

Sending env vars of all your employees to one place doesn't improve anything. In fact, one can argue the company is now more vulnerable.

It feels like a decision made by a clueless school principal, instead of a security expert.


A secure environment doesn't involve software exfiltrating secrets to a 3rd party. It shouldn't even centralize secrets in plaintext. The thing to collect and monitor is behavior: so-and-so logged into a dashboard using credentials user+passhash and spun up a server which connected to X Y and Z over ports whatever... And those monitored barriers should be integral to an architecture, such that every behavior in need of auditing is provably recorded.
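
Roughly, the kind of record meant here is a behavior-level audit event emitted by the systems the employee authenticates to, not something scraped off their laptop. The field names below are illustrative assumptions, not any particular product's schema.

    from dataclasses import dataclass, asdict
    import json
    import time

    @dataclass
    class AuditEvent:
        actor: str          # who authenticated ("so-and-so")
        credential_id: str  # a key ID or password-hash reference, never the secret
        action: str         # "logged into a dashboard", "spun up a server", ...
        targets: list       # hosts/ports the action touched
        ts: float

    def record(event: AuditEvent, sink) -> None:
        # Append-only log: every behavior in need of auditing is provably recorded.
        sink.write(json.dumps(asdict(event)) + "\n")

    with open("audit.log", "a") as sink:
        record(AuditEvent("alice", "key-4f2a", "spun up server",
                          ["X:443", "Y:5432", "Z:22"], time.time()), sink)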

If you lean in the direction of keylogging all your employees, that's not only lazy but ineffective on account of the unnecessary noise collected, and it's counterproductive in that it creates a juicy central target that you can hardly trust anyone with. Good auditing is minimally useful to an adversary, IMO.


> In a highly auditable/“secure” environment, you can’t give secrets to employees with no tracking of when the secrets are used.

This does not seem to require regularly exporting secrets from the employees' machines, though, which is the main complaint I am reading. You would log when the secret is used to access something, presumably on a system remote to the user's machine.
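
A sketch of what that server-side logging could look like (the key store, log format, and names are assumptions): the service that receives the credential records each use, so nothing has to be exported from laptops.

    import hashlib
    import hmac
    import logging
    import time

    logging.basicConfig(filename="key_usage.log", level=logging.INFO)

    # Store only hashes of issued keys, mapped to their owners.
    KNOWN_KEYS = {hashlib.sha256(b"employee-key-123").hexdigest(): "alice"}

    def authenticate(presented_key: bytes, resource: str) -> bool:
        digest = hashlib.sha256(presented_key).hexdigest()
        for known, owner in KNOWN_KEYS.items():
            if hmac.compare_digest(digest, known):
                # The audit trail lives where the secret is used, not where it is stored.
                logging.info("ts=%s owner=%s key=%s resource=%s",
                             time.time(), owner, known[:12], resource)
                return True
        return False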


I’m well aware of what a SIEM does. You do not need to log a plaintext secret to know what the principal is doing with it. In a highly auditable environment (your words), this is a disaster.


In a highly secure environment, you don't use long-lived secrets in the first place. You use 2FA and only hand out short-lived tokens. The IdP (identity provider) refreshing the token for you provides the audit trail.
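
For example (the endpoint and client details are placeholders; the request shape is the standard OAuth2 refresh_token grant): after the interactive login where 2FA happens, the client only holds a refresh token and repeatedly trades it for short-lived access tokens, and each exchange is a log line at the IdP.

    import json
    import urllib.parse
    import urllib.request

    IDP_TOKEN_URL = "https://idp.example.com/oauth2/token"  # placeholder IdP

    def refresh_access_token(refresh_token: str, client_id: str) -> dict:
        # Standard OAuth2 refresh_token grant: trade a refresh token for a
        # short-lived access token. Every call here is an event the IdP logs,
        # which is the audit trail; no long-lived secret sits in an env var.
        body = urllib.parse.urlencode({
            "grant_type": "refresh_token",
            "refresh_token": refresh_token,
            "client_id": client_id,
        }).encode()
        req = urllib.request.Request(
            IDP_TOKEN_URL, data=body,
            headers={"Content-Type": "application/x-www-form-urlencoded"},
        )
        with urllib.request.urlopen(req) as resp:
            token = json.loads(resp.read())
        # Typical response: {"access_token": "...", "expires_in": 900, ...}
        return token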

Repeat after me: Security is not a bolt-on tool.


More like a triple-lock, steel-core reinforced door lying on its side in an open field?

Good start, might need a little more work around the edges.


> In a highly auditable/“secure” environment, you can’t give secrets to employees with no tracking of when the secrets are used.

Yeah. So you track them when they are used (which also gives you a nice timestamp). Not when they’re just sitting in the env.


You give employees the ability to use the secrets, and that usage is tracked and audited.

It works the same way as biometrics like face unlock on mobile phones.


> Ideally secrets never leave secure enclaves and humans at the organization can't even access them.

Right, but doesn't that mean there is no risk from sending employee laptop ENV variables, since they shouldn't have any secrets on their laptops?


I mean it's right there in the name. They're not really secrets any longer if you're sharing them in plaintext with another company.



