
So basically tracking "business-as-usual" attacks (probably 99% low-tier, low-effort attacks that didn't go anywhere) or serious attack examples from other companies isn't going to change the culture. But a full-blown, highly skilled attack, with a fully dedicated adversary, specifically targeted at your business and business value, with potentially devastating consequences - will do a better job of waking people up?


Yes. It seems stupid, but what you outline is a 100% match with what I’ve seen in practice. Until it happened to them, the other company was stupid and careless - not them.

At the risk of repeating myself, I'm chalking this up to basic human behaviour in all fields of life, and the lack of taking responsibility by 80% of all people.

Security is very lopsided in that you just need 1 person to be careless for the attacker to get in, while the defender needs to be 100% secure across all vectors.

I could discuss this all day. You know the importance of the topic, I know it, but the fact of the matter is that most non-tech people think of security as an annoyance. The solution? No idea yet, other than finding the right chord to strike and "fixing" this psychological problem. We've made significant strides in the last few months, but getting companies more security conscious has been a tougher nut to crack than I first anticipated.

Feel free to email me at stan@site.security if you want to exchange thoughts on the topic. I’d love to take a deeper dive into the matter with anyone that’s passionate about solving the security problem in any way shape or form :)


Sounds like you've learned from experience.

Infosec in practice (not imaginary scenarios) is also about good hygiene by the regular plebs, and investing in proper QA by hiring some people who are naturally paranoid and have enough clout to push back on lazy/bad ideas.

That, plus regular scheduled check-ups where deeper dives are done.

Like you said, and as the article points out, it seems to be as much a cultural, day-to-day thing as it is about technical searches for vulnerabilities. Or worse, installing noisy monitoring systems with a bunch of false positives and pointless investigative rabbit holes.


I think it depends on what your objective is.

Strictly helping quantify technical and personnel risk? Grey all the way.

Getting the C-suite to understand what a worst-case scenario from a completely external threat could look like? Black drops the most jaws.

Ultimately the latter is a cultural hack - the hardest hack of all, in my opinion. A way to get the herd moving in a different direction.

So it depends on what you are "hacking."




