Flush times for hackers in booming cyber security job market (reuters.com)
162 points by mancerayder on July 28, 2017 | 71 comments



What saddens me is that, while red team pen testing is a very "hot" (high employer demand, high salaries) job market, people don't generally care about the blue team. It's easy to get a pentesting gig that pays well, but in my experience employers don't ask for or value people with the competence to build and maintain secure applications/networks/solutions. Instead they pay for recurring pen tests which result in internal tickets/issues/BUGs, while the development/operation practices are kept the same.


It's because internal econometrics are resulting in perverse incentives.

Defensive security is a cost center without clear, deterministic metrics for success. Let's say you spend X on defensive security (which is an oversimplification when you're talking about a cultural change, but that cultural change involves people learning how to pay attention to security, and paying attention is a form of man-hours for which a cost can be calculated). If you don't get attacked, is it because the X you spent was high enough to deter or foil attackers, or could you have spent less and achieved the same result? If you are attacked and the attackers get past your defenses, is it because the X you spent wasn't enough, or would the attackers have succeeded anyway, because of their relative power and motivation, even if you had spent more? For defensive security, it's very, very hard to justify to bean counters that X was the correct amount of money to spend, no matter what the real outcome is, because it's hard to understand X's effect on that outcome.

Pentests which result in tickets/issues/etc. are much easier to justify. The company spent X on the pentest, and it got Y feedback in return. Simple and effective, at least in the short term.

It's part of the overall challenge that organizations face when they become metrics-driven. People choose the path of least resistance, so if you ask people to measure data, they'll measure the data that's easiest to measure. Data that's harder to measure - culture and social attitudes - becomes "not a priority" to measure.


You probably meant Economics rather than Econometrics. Economics concerns itself with incentives and payoffs, whereas Econometrics is the statistical study of economic systems. Apologies for the pedantry


Isn't the result of a pentest a direct measure of defensive security as well?


It's not only "internal". There isn't really a functioning market for "middleware" in much of software, creative areas excluded. I don't really believe that managers, users or anyone else is primarily responsible. At the end of the day most developers aren't very good at security and they aren't necessarily willing to pay for it either.


It's like everything in a big company. Every round of red team testing against my team's applications, we just sit and laugh as they find nothing, yet we serve data to billions of customers every hour, from a myriad of complex entry points. Nobody cares, and when I mention that on my self-reviews it looks like I am padding them.

Then there are the other teams, which only handle requests from the iOS app they own, and the red team finds tons of amateur attacks that work. They spend a quarter fixing them, and boast that they worked with the red team to patch hundreds of vulnerabilities. And everyone is promoted.

But that is not new. It has always happened with teams that cause outages, or teams that miss obvious revenue streams for years. Remedial action for some reason is always rewarded in troubled big corps.


>remedial action for some reason is always rewarded in troubled big corps.

>for some reason

I would go out on a limb to say it's definitional. A troubled Big Corp is troubled precisely because it focuses on the wrong thing.


That's sad to hear. Possibly you could demonstrate it: "We defend against XXX attacks; compare that to the iOS app, which defends against XX."


My experience is inconsistent with yours. In my 12 years of experience in the security industry, I have found that the blue teams (engineers who develop, maintain and secure network, data and applications) have higher demand and higher salaries than the red teams or pen-testing teams.

My experience is limited to security software development in e-banking, e-commerce, network security and data security domains in technology areas like cryptography, PKI, deep packet inspection, and network protocols. I know my experience may not be representative of the entire security industry and there is a possible selection bias too (i.e. I may have seen more demand for blue team engineers because I have belonged to blue teams myself), but I thought I should share my experience here to present the other side of the story.


As a Blue team member who works in the medical field, I must agree that demand is high. We hire a firm to do Red team work, but we do have our internal folks too, along with us Blue team folks who learn Red team stuff. We worked with a major security firm and together we accidentally created a Purple team.

We had the Red team come in, and one member shared his screen with us all while pentesting. Another Red team member explained what he was doing, and after an attack was launched we would see if our tools detected the activity. If they didn't, we went out to find out why. This was huge. It showed us where we needed to tune some things and where we needed newer and/or different tools.

This isn't the only way we get pen tested. They do their annual "regular" pentest. The Purple team thing was awesome though. We learned a ton. Since I happen to own most of our tools and am secondary on the ones I don't own, I have learned a tremendous amount and I've been in IT for 20 years.


Thanks for sharing this. You mention banking - protecting money seems like a much higher incentive for blue team than say, the security of a forum or online game server.


Unfortunately not at the FIs I have worked for. Their approach seems to be "give the illusion of security while covering any actual losses with insurance".

I think there needs to be greater punishment for companies that lose customer information. Only then will the incentives be large enough for something to be done.


It's hard to put a price on good will and customer trust. Every C level should realize that their company is an information company and take appropriate action.


I used to work for a pentest company, and every time we engaged with a customer we discovered a lot of problems and security issues, which we documented and submitted to the customer.

We would discover after a while that another company got a contract at 10x our price to fix the issues we found.


10x sounds kind of reasonable; it's harder to make things than to break things.


Sounds like a missed opportunity to at least get a referral fee for resolution.


Conflict of interest.


Assuming the company is voluntarily hiring them, and not being required by a contract or law that needs an independent 3rd party for auditing purposes, it seems like the pentesting company would do an even better job on the inspection if they had a good chance of getting repair work down the line for every issue they found.

If they make up a bunch of minor things that don't matter, you can ignore those and focus on the important ones. I suppose if you don't have any in-house expertise at all to evaluate what they say, the conflict would be more important?


It's simply bad practice to have the people who are involved in the 'checking' making money from, or being involved in, the 'fixing' in any way, shape or form.

You'll see this in almost every situation that is somehow related to auditing.


Like colleges.


I'm hiring for blue team at Akamai. We've picked up some fantastic early-career people into our training program; we're also looking for mid-career architects and senior architects.

https://akamaijobs.referrals.selectminds.com/jobs/senior-lea...

https://akamaijobs.referrals.selectminds.com/jobs/security-a...

https://akamaijobs.referrals.selectminds.com/jobs/manager-in...


Honestly, as someone who is leaving "blue-team" network security work for a "DevOps" production team (I know, I know), it's really a mixed bag. I've done blue-team for 2 companies now and honestly the job is more project management than anything else. I found that I was very rarely actually getting hands-on with technology. When implementing a new piece of security tech, we were simply directing other teams to perform most of the actual technical work (this was the case at both of my "security engineering" jobs). I didn't get any of the satisfaction of building anything, solving problems, etc.

The other big thing to note is that a lot of companies have security teams solely to meet audit requirements. If you find yourself on a team like that, you'll be spending a lot of time just gathering evidence for audits, remediating findings and writing policy. I really loved security intellectually, but in practice, the blue-team side of things wasn't my cup of tea.


This will change with time. I have thought, studied, talked, and written extensively on this general theme. What comes to mind to frame it for you is my "four principles:" [a]

1. Compromise is inevitable
2. Default-allow products always fail
3. 1 and 2 are not opinion or marketing spin, just simple truths
4. As an industry, we are still learning 1 and 2

Security is slowly shifting from an administrative IT function to an operational function. In IT, the business value comes from the products and people are a tax required to administer the products. In security operations, the business value comes from the people, products are just tools in their toolbag. [c]

Keep walking this dog and you realize basic IT activities for core infrastructure are critical for security, to the point the CIO will report to the CISO -- unless the CIO steps up. [b]

So - in short - your frustrations are accurate, but the winds are shifting. Companies will increasingly value top people for their internal staff/blue teams. It's going to take a few more years, but I believe it is inevitable.

[a] - https://www.linkedin.com/pulse/my-four-cybersecurity-princip...
[b] - https://www.linkedin.com/pulse/cio-report-ciso-j-j-guy
[c] - https://www.linkedin.com/pulse/cio-report-ciso-why-j-j-guy


I'm lucky enough to be with a company with a couple of people on both sides. Originally we were all blue team (that was VERY poorly integrated with the engineering group), but I've started shifting one guy over to full-time pentesting and working on integration with engineering.

> results in internal tickets/issues/BUGs, while the development/operation practices are kept the same.

You could not be more accurate; this also applies to groups that maybe started out as corporate infosec (virus protection, simple application scanning, etc.) and were never really tightly coupled with engineering. We have identified essentially identical authorization issues in a pre-release version of one of our products two or three times this year, and the same issue was present in the last 3rd-party pentest of that product before my time (which was pretty scathing). It's incredible.


The company I work for is aggressively hiring blue teamers. The core of our product is essentially a security product, so we need someone who has at least some professional engineering experience as you would be responsible for helping to lead teams building certain parts of our infrastructure and product. Someone who can hop onto code reviews that deal with sensitive areas would be fantastic.

We're located in Boulder but for the right candidate we'd consider remote, although that might involve relatively frequent travel.

cGhpbGlwLmRldWNobGVyQGp1bXBjbG91ZC5jb20= for contact


In theory you could take advantage of this by starting a company that hires people from the blue side at a premium and creates products with top-notch security.

Unfortunately, I am not sure that consumers currently care much about the security of their products relative to convenience, price, and eye candy.

But in theory, those who spend heavily on the red side will end up with a more expensive product and a bad reputation, so perhaps investing more in blue will win in the long run.


The trouble with that is that B2B contracts will specify pen tests, so you'll incur that cost anyway to demonstrate you don't have security holes.

"Regular pen test" is seen as demonstrating security, which is as little perverse because the results don't typically get published so you could be having the same issues year after year and look just as good as someone who gets a clean bill each time.


This. Not interested in pen testing at all, just want to build things that don't suck from a security perspective. Happy to break things along the way to gain perspective, and I like to know how things work, but I'm a software engineer who likes security. A builder, not a breaker, by nature.


I am totally into this field as a bystander. When many people would watch late night TV or listen to music or a podcast, I'll scour YouTube for DEF CON and CCC talks I haven't seen yet. I am good with Python, JavaScript, web and graphic design, technical writing, all kinds of stuff. I live in rural neighbor-island Hawaii and the only tech jobs I ever see out here are military, which I deeply respect but don't think would be a good culture fit for me. I've just transitioned to working part time and intend to dive into some open source projects with my extra time. Is there a good path of entry for someone with a deep natural curiosity about the field, self-trained in coding but with no industry connections or much in the way of related professional experience?


1) Schneier's advice from a few years ago is still accurate:

https://www.schneier.com/blog/archives/2012/07/how_to_become...

2) The Reddit NetSec FAQ has a good list of resources for beginners (and those starting to specialize):

https://www.reddit.com/r/netsec/wiki/start

3) Finally, each of these popular "Getting Started in Security" guides has a slightly different, but useful, opinion on the specifics of the path to take:

https://medium.freecodecamp.org/so-you-want-to-work-in-secur...

https://danielmiessler.com/blog/build-successful-infosec-car...

https://www.trustwave.com/Resources/SpiderLabs-Blog/Getting-...

https://tisiphone.net/2015/10/12/starting-an-infosec-career-...


Correct URL for Reddit NetSec FAQ: https://www.reddit.com/r/netsec/wiki/start


Thanks for the correction - fixed the typo!


Hit some bounties. Earn some income; it generally looks good on a CV that you can deliver real-world results. Just try and go deeper.


Fastest way would be OSCP cert.


OSCP, red teaming, possibly CISM or CISSP for upward mobility.


I would not waste time and money on certifications.


As someone not in the field but curious about getting in, could you explain why not?


There is a bias against certifications by some (but by no means all) professionals in InfoSec, since it is a heavily "hands-on" field. There is more emphasis on demonstrating actual ability through CTFs, bug bounties, published exploits, etc.

However, unlike Certified Ethical Hacker, CISSP, and other "mile wide, but inch deep" certs, the OSCP is a heavily hands-on certification that tests actual ability. No knowledgeable employer would discriminate against you for earning it.

And CISSP or CISM are valuable if you're applying for a management job. For government defense-sector jobs, they are often required.


I saw a talk by the founder of this company: https://radicallyopensecurity.com/

They're based in Amsterdam, but she said that a lot of their pentesters and engineers are remote (all over the world).

Might be worth reaching out!


Try the military. It's usually better than people imagine.


The military are the best at exploiting cybersecurity skills to their advantage.


I advise companies on tech security, and talent is very much needed.

What's surprising (at first glance) is that the security talent need is very strong in UI/UX/CX.

For example, security is needed to gradually escalate a user's own identity verification -- think of things like two-factor auth and multi-factor auth that can phase in (or ramp up) when a user's actions enter a gray area of risk.

Some examples: when a user signs in from a new location, or a user does an especially large money transfer, or a user resumes an account that's been dormant for years, etc.

The UI/UX/CX is especially necessary to 1) explain to the user that there's a security issue, 2) show the user how to fix it, and 3) improve backend systems to handle users who ask for help.
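
To make the idea concrete, here's a minimal, purely illustrative sketch (in Python) of the kind of risk-based step-up logic I mean. The signal names and thresholds are assumptions for the example, not any real product's rules:

    # Purely illustrative risk-based step-up sketch; signals and
    # thresholds are assumed for the example, not any real product's rules.
    from dataclasses import dataclass

    @dataclass
    class ActionContext:
        new_location: bool            # sign-in from a location we haven't seen before
        transfer_amount: float        # size of the money transfer, in account currency
        days_since_last_login: int    # how long the account has been dormant

    def risk_score(ctx: ActionContext) -> int:
        """Crude additive score; a real system would use calibrated risk models."""
        score = 0
        if ctx.new_location:
            score += 40
        if ctx.transfer_amount > 10_000:
            score += 40
        if ctx.days_since_last_login > 365:
            score += 30
        return score

    def required_verification(ctx: ActionContext) -> str:
        """Phase in stronger identity verification as the risk score rises."""
        score = risk_score(ctx)
        if score >= 70:
            return "multi-factor"    # e.g. password + OTP + out-of-band confirmation
        if score >= 40:
            return "two-factor"      # e.g. password + OTP
        return "password-only"

    # Dormant account, new location, large transfer -> ramp all the way up
    ctx = ActionContext(new_location=True, transfer_amount=25_000, days_since_last_login=400)
    print(required_verification(ctx))    # -> "multi-factor"

The hard part is everything around that decision: the UI has to explain why the user is suddenly being challenged, give them a clear path to complete the verification, and route them to help if they get stuck.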


I work in infosec as a security engineer, and I agree with this more than anything else anyone has posted about the state of the security industry on HN, ever.

For all that security is a human problem, I have yet to meet infosec types with strong backgrounds in human factors or product design. Nearly everyone came to this industry from netsec/IT/SOC work or low-level programming, and neither group has a decent understanding of the usability issues that plague the security posture of common users. What works for a CLI junkie with deep systems knowledge absolutely fails people who barely know how to navigate their Android phone.

If anyone's interested in attempting to solve some of these design pattern issues, please reply here or DM me on Twitter. I'd love to actually get a group of people together trying to come up with standard, secure UX paradigms that can be referenced by others.


>What's surprising (at first glance) is that the security talent need is very strong in UI/UX/CX.

Which technologies specifically?


For example some keywords: rapid web/mobile prototyping, info visualization, split test planning, throttle rollouts, accessibility areas, i18n/l10n, risk management, compliance verification, pattern recognition, time series analysis, threat modeling of web usage, relevance ranking, bloom filters, HIPAA/FERPA/SOX/ISAE etc., client-side cryptography, graylisting, social proof verification, identity theft mitigation, etc.


>Which technologies specifically?

I think that's part of the issue: security-minded folks are often very analytical and come from CS backgrounds, while the demand is for people who understand how design interacts with technology to, in this case, create secure methods of actually using a technology.

So, to answer your question, none. It's about the mindset of the designer.


How can I get into this field?

I used to love doing pen-testing when I was a teenager, and paid for my first car out of bug-bounties.

Unfortunately, I got distracted by girls and booze at university and didn't keep it up, now I work in sigh enterprise C#/WPF land.


Daniel Miessler has a good general guide: https://danielmiessler.com/blog/build-successful-infosec-car... tptacek, who posts often on HN, also has some wise words: https://krebsonsecurity.com/2012/06/how-to-break-into-securi...

There are so many sources of information and learning grounds available now - bug bounties, certifications, war games, online tutorials, blogs, conferences etc.

I would suggest choosing a particular area of interest to begin with and deep-diving on that subject. Look for mentors or perhaps someone to knowledge share / skill exchange with.

You could do pretty well with a base in C#. Through pentest engagements, I've come across quite a few C# apps in my time and even with my limited knowledge of the language, found some interesting vulnerabilities ;)

Edit: Added tptacek link


I'm in my late 20s and got into the field professionally just a couple years ago. Prior to that I had been working as a software developer.

I believe what helped was a handful of personal projects related to security: reverse engineering firmware, finding bugs in web apps. Also I emphasised the parts of my software development work that had some overlap, such as debugging Windows kernel drivers, and doing security reviews of network services we were writing and deploying.

Now I'm doing full-time vulnerability research and writing software to help do that. Much more enjoyable and pays better too.


How'd you get into enterprise C#/WPF land?


Start as a pen tester then get distracted by girls and booze.


You say that like it's a bad thing. ;-)


Just sort of fell into it. Did a university internship at $genericbigcorp and didn't bother looking for other/better jobs at graduation. (To be fair, they paid a generous joining bonus - well, £2000 "generous" to my broke 21-year-old self.)


That's the path of least resistance. Just passively accept calls and interviews via recruiters, and next thing you know you're in the enterprise. It's what happens when you don't have a plan.


You don't need a college degree to do pentesting. I don't think anyone credible in the industry cares where you got your education, as long as you know what you're doing (or in the case of juniors, are able to learn).


Didn't we all.

(Similar story :) )


I don't regret the good times for one second, but I do get very jealous of people on HN talking about all the exciting tech they're using and awesome work environments. I am actively taking steps to move to a better company, but my god, I'm finding Cracking the Coding Interview a slog.


Same here... and then add on that I'm actually over 40... and oh boy, does that code get tougher to track. Funny how in some circles deep and long experience in tech is respected... but companies often look at me and ask "What have you done for me lately?"... and look at my age, and likely assume that I'm slowing down, when just the opposite is happening: I'm speeding up in terms of the complexity of the tech I'm diving into. Weird, but I feel your pain!


Try it at 50. It's not a good field to be in at either of these ages unless you are in technical management, architecture or direction.


Source code review is in high demand. Many companies are happy to train a developer on how to do it.


Always cracks me up thinking about how much the press talks about cybersecurity being in demand, then looking at the actual IoT and embedded cybersecurity market.

Everyone should be very, very scared about infrastructure/medical security and the effective lack of anyone doing anything about it.

I'm sure web/network pentesting is doing well though.


https://www.youtube.com/watch?v=pd91c5ZwSf0

Excellent talk from last year's HOPE about the vulnerability of medical devices - he even connects to some live devices.


What are some realistic salary ranges for people in this field?

And is it largely on site work, or is it more common as a remote consultant?


NB: Assuming SF, NYC or a similar locale.

Junior: $120k base. $10-15k annual bonus.

Mid: $150k base. $20-30k annual bonus.

Senior: $180k base. $30k annual bonus.

Slightly skewed towards consulting; notch it up a bit to account for stock grants in addition to the bonus if you're thinking of something like Google, Facebook or Amazon. Sometimes the consulting firm pays separate bonuses for research time and bench projects that result in something useful or beneficial for the firm's brand.

That's based on my own experience and talking with probably well over a hundred people in the industry about their salary by now.

You can do remote, I've been remote almost my entire career thus far. Remote is more likely in consulting than it is at one of the reputable internal security teams, but it's not uncommon at smaller tech companies.


Regarding onsite/remote, I think it depends on what you choose to specialise in. Most of the web application assessments I conduct are remote but there is still demand for onsite work.


Central Texas, 5 years' experience: $95k when bonuses and 401k match are thrown in.


Low-to-mid $100ks is very realistic in my experience. A lot more can be made with the right skills.


So standard software engineer salary in the USA?


Yes, like all tech jobs that pretend to be in demand.

Security is usually worse than regular engineering, because it tends to be a succession of short, quick gigs.


I'm looking to get out of security after being in IR and security research at a large startup for 5 years. I am having no luck finding anything besides glorified helpdesk jobs and other entry level positions.


I'd love to get back to hacking and be paid to do it legally.



