I disagree that it's harder. Sure, being an expert security engineer is harder than being an average programmer, but so is being an expert at algorithms.
I also don't think CTFs are a good model for security questions during interviews. Knowing how to do something securely is not the same as knowing how to exploit a vuln.
On the other hand, I'd far rather hire someone who can explain why TLS works and what the weaknesses and tradeoffs are than someone who can reverse a binary tree on a whiteboard, and I'd far rather work somewhere where that was the standard of questioning.
That doesn't make any sense whatsoever. You do not hire someone because he knows about a certain technology. You hire people because they will provide long-term value to your company and are able to adapt to a rapidly changing technology space.
Reversing a binary tree on a whiteboard is certainly a bad question to ask. But I would argue it is, for all intents and purposes, still a far better indicator of future potential than whether someone knows how/why TLS works. Yeah, you can read that in a book. I can google it. Useless for interviews.
If you are hiring for a position that requires you to implement TLS, sure, go for it. But that is not the rule. And what are you going to do after he has implemented TLS? Will he be able to work on something completely different?
If you hire based on algorithms, there's a good chance you'll end up with a bunch of people who are good at algorithms, or at least willing to put in the effort to study the algorithms and be able to recite them back with some variance.
If you hire based on knowing how the internet works (TLS, HTTP, BGP, whatever), then you'll be working with a bunch of people who understand how the internet works.
I guess the idea is that TLS is sufficiently complicated that you can take tangents during the interview and establish whether the candidate can understand and communicate complex concepts.
When I'm looking to hire a software developer, I would personally rather hire someone who can write well-structured and maintainable code, can describe a system architecture they worked on in a way that is accessible to people not familiar with the domain space, and is able to say "I don't know that, but I know how to Google it".
Now, if I'm hiring a sysop / devop / security engineer, it's going to be differently focused to some extent, but the same principles apply - core knowledge, communication, humility, ability to research.
These are not even close to equivalent. I agree the algos interview process is broken in many ways and can be gamed to some degree by grinding Leetcode or whatever, but implementing an algorithm is generally much more difficult than recalling a fact.
(Admittedly, candidates are likely to have memorized the algorithm for reversing a binary tree since that is such a common interview question)
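For what it's worth, the memorized answer really is only a handful of lines. A minimal Python sketch (the Node class and names here are just illustrative):

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def invert(node):
    """Recursively swap the left and right children of every node."""
    if node is None:
        return None
    node.left, node.right = invert(node.right), invert(node.left)
    return node

# Tiny usage example: invert a three-node tree.
root = invert(Node(1, Node(2), Node(3)))
print(root.left.value, root.right.value)  # 3 2
```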
You're testing recall only when asking about TLS. You may be testing recall for "how to implement a known algorithm", but there's still plenty of room for testing actual problem solving too.
If you are the interviewer, even if you are asking a standard "how to invert a binary tree" type algorithm question, you hold the cards to keep pushing the bounds for problem solving by extending the question.
Well, if you're willing to push boundaries, you can certainly do the same with TLS. Ask why the TLS spec does X instead of Y, for various subtle design decisions. This is probably actually much easier to do than with a binary tree, as TLS is a lot more complex and a lot more subtle. It would certainly be an appropriate line of questioning if you were hiring a crypto engineer; not sure it would be relevant to a security engineer or software engineer.
You can ask algorithmic questions that are not easily googleable or standard problems (basically where they have to invent the solution themselves). Although it takes some effort to find the right difficulty problems (tip: the right difficulty is pretty low) so you don't end up wasting time on stuff nobody can solve or stuff everyone easily solves. You can ask people you know to take a stab at a problem to gauge its difficulty. Or you may even be able to eyeball it.
Is fast really the goal? Or is it just that most engineers don't want to spend time interviewing so that's what we've optimized for?
Some of the absolute best engineers that I've worked with take their time to wrap their heads completely around a problem before diving in. They aren't slow thinkers, but they aren't people who excel at these kinds of interviews either.
I've been doing interviews for a long time now and I find it more effective to surface strong opinions about things they've worked on -- good and bad.
I'm not hiring into a feature factory -- I don't care about fast cogs. I'm hiring people who care about what they do and giving them an environment to thrive in.
I guess it depends, but at the internship level an intern who's fast vs an intern who's slow is like 10x things done sometimes. I work in a math-heavy environment though, so I can understand your point of view.
I see yours also, but we definitely shouldn't be skewing the hiring process toward interns unless at your company you're mostly hiring interns (I'd think that's uncommon?).
The vast majority of web developers will never touch things like TLS configurations. We have stuff like nginx and WSGI for a reason!
Meanwhile there's a decent chance of them running into a problem that looks like binary tree manipulation.
Despite all the handwringing about btree reversal, not being able to improvise a solution to that is a much more useful indicator than not being able to describe even the basics of TLS.
Security engineers don't actually recite trivia facts about TLS in their job. Their job is more like knowing when you should use TLS and when you shouldn't, and knowing what it does and does not protect against.
A security engineer is about as likely to derive a TLS cipher config from first principles instead of googling it or looking at SSL Labs as a programmer is to reverse a binary tree from scratch instead of using a library or searching Stack Overflow.
Engineers (of any stripe) aren't hired to recite random knowledge, they're hired to know what knowledge is appropriate to apply to a particular situation.
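To make that concrete for the TLS case: in practice, applying the right knowledge usually means reaching for the library's vetted defaults and knowing which knobs matter, not deriving a cipher list yourself. A minimal Python sketch with the standard ssl module (the host name is just a placeholder):

```python
import socket
import ssl

# Start from the library's vetted defaults rather than hand-rolling a cipher list.
ctx = ssl.create_default_context()            # certificate verification on by default
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocol versions

# "example.org" is a placeholder; any HTTPS host works.
with socket.create_connection(("example.org", 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname="example.org") as tls:
        print(tls.version())                  # e.g. "TLSv1.3"
        print(tls.getpeercert()["subject"])   # verified peer certificate details
```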
I've been doing web development for a long time, and I've never had to do anything like btree reversal in a web development context.
It's probably not even that hard, but I doubt I'd be able to improvise a solution in an interview setting that's nothing like a real work environment. (Memories of trying to work out some kind of graph traversal while someone was basically just staring at me.)
It seems to me that people who are doing this kind of work behind the scenes of a web app aren't really doing web development.
uhh...setting SSL parameters is part of defining an HTTP block in nginx.
As someone in a public company with actual customers, we have to deal with TLS configuration all the time (as customers come with requirements and being public comes with compliance/security) and it's important to know what we're doing there...
I'm responsible for millions of dollars in infrastructure and the way that I got here was being a web developer who knows how the internet works. And in an engineering organization with hundreds of engineers, most of them tend to come to me first with questions.
I have never once in my career had to reverse a btree.
Setting up your security test lab takes way more effort than opening an IDE, LeetCode, checking informatics olympiad tasks, or maybe some book/PDF/write-up/wiki.
When developing algos you're in your own world, whereas security often has to mess with other things like software, standards and so on.
E.g., you're interested in web sec/hacking: then besides understanding the standards (what they allow and what they don't, etc.), you have various implementations to care about, like the web browsers - Chromium, Gecko, IE, Safari and so on. It's a lot of effort!
Let's say you want to find vulns in PDF parsers/renderers by checking their source code - I think it'd take a lot of effort to check and understand (let alone exploit) those implementations in two major browsers (I suppose they're different, but I've never checked that).
>I also don't think CTFs are a good model for security questions during interviews
I didn't mean that, sorry if I made it sound as if I expected people to solve CTF tasks during SE interviews
Just wanted to encourage people to do cool stuff :)
The core doesn't really move that fast. 90% of everything is still an injection vulnerability of one form or another, and if you go above the level of vulns, most organizations are really looking for people to do risk analysis, in which case, the actual vulns somewhat become less central (however there are lots of different types of security engineers, and lots of them have very different responsibilities)
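For the injection point above, a minimal sketch with Python's built-in sqlite3 (the table and data are made up purely for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0)")

user_input = "alice' OR '1'='1"

# Vulnerable: attacker-controlled text is spliced straight into the SQL.
rows = conn.execute(
    f"SELECT * FROM users WHERE name = '{user_input}'"
).fetchall()
print(len(rows))  # 1 -- the injected OR clause matched every row

# Safer: the driver treats the value as data, not SQL.
rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()
print(len(rows))  # 0 -- nobody is literally named "alice' OR '1'='1"
```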
It depends a lot on what you're doing of course. I do web application security stuff (glorified XSS detector); I've personally felt that the most useful tools by far are the browser dev console and curl, but YMMV.
> I didn't mean that, sorry if I made it sound as if I expected people to solve CTF tasks during SE interviews
>Just wanted to encourage people to do cool stuff :)
"Maybe it's time to ask security questions during SE interviews instead of algos?"
I don't want to hear the answers. My faith in humanity is already hanging by threads.
I know this is my problem. It's really hard for me to not flip the Bozo Bit. To take both the good along with the bad every person has to offer.
Story time.
I worked near "Chris", a tech lead senior architecture fella. Same product group, different code bases, thank god. I disagreed with pretty much everything Chris said, but I accepted that we're all opinionated and it's more important to be consistent than to be right (correct). So long as his team kept shipping, Chris could do whatever he wants.
Until.
Because Chris was loud, large, older, and a boor -- an exemplar mansplainer -- he was influential. He convinced most of the people within earshot (half of the floor) that password wallets were a terrible idea. I kept listening, to hear the rationale. Maybe he had a legit reason and I was the clueless noob.
Nope. "Single point of failure. If someone cracks your wallet, you're fully pwned."
Chris is why that product group could not, would not adopt any credentials (secrets) management strategy. It was all wikis and PostIt notes.
Reflecting back, it now occurs to me that hoarding the credentials may have been Chris' gatekeeping power move. Hmmm.
> Maybe it's time to ask security questions during SE interviews instead of algos?
Please no, but I agree that security questions could be a very good addition. But please let's not replace something used as a below-par proxy for the ability to program with something likely even worse.
Maybe it's time to accept that sufficiently scaled companies need to hire more specialized roles, rather than having a single job title responsible for an array of technical specializations?
Unless you work as a security engineer building security tools, security should be following basics as specified in common security standards (PCI DSS, ISO 27001 etc). It's comprehensive and not that hard to learn. Twitter are not remotely close to operating like that given their recent hacks.
> security should be following basics as specified in common security standards (PCI DSS, ISO 27001 etc)
Would be nice but no, not really. The standards you mentioned are mostly compliance requirements. To be honest, a good chunk of the industry considers them as kind of a joke from a security perspective.
> It's comprehensive and not that hard to learn
Have you even read these standards? I mean, they might be "not hard to learn" but they are far from comprehensive (or specific, depending on which one you are looking at)
> Twitter are not remotely close to operating like that given their recent hacks
Ask literally any security professional you trust: companies compliant with PCI DSS and ISO27k1 get security incidents and breaches all the time, just like everyone else, and possibly more (given that if they need compliance with these standards they are probably big enough to have a very wide/heterogeneous infrastructure/application portfolio/administration practices/etc.). If they claim they don't, that's most likely because their telemetry sucks (so it still happens, they just don't know about it).
My point isn't that these standards will protect you from any incidents. My point is that these standards are the low hanging fruit and Twitter hasn't even picked it yet so it's a bit redundant to get super technical. Twitter is suffering from governance failures.
No. Engineers introduce vulnerabilities through their normal day to day duties.
The thing to do is familiarize yourself with the OWASP Cheat Sheets directly related to the things you do day-to-day...and to check if the thing you're working on relates to one you're less familiar with.
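As one example of what those cheat sheets boil down to in day-to-day code, here's a minimal sketch of output encoding for HTML (the payload string is made up):

```python
from html import escape

comment = '<script>alert(document.cookie)</script>'

# Pitfall: interpolating user input straight into markup.
unsafe_html = f"<p>{comment}</p>"

# Cheat-sheet-style fix: encode for the HTML context before rendering.
safe_html = f"<p>{escape(comment)}</p>"

print(safe_html)  # the <script> tag arrives as &lt;script&gt; text, not executable markup
```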
You don't need to know much about security to write safe programs. Most of the work is done by others. You need to know how to avoid common pitfalls (i.e. yeah, if your service uses 3rd party code to parse customer provided data and your colleague suggests using this awesome C++ library for it, maybe some alarm bells should go off).
Advanced security knowledge is not needed for developing software. What you need is a security team that reviews the work your developers produce. That's it.
> a bad algo can almost always be rewritten, a leak cannot be reverted.
And while a bad algo can be rewritten, bad software often cannot. Bad programmers are a disaster for scalable software projects. A leak cannot be reverted, but neither can the product you didn't ship because you hired the wrong people. Or the company that goes bankrupt because you weren't able to ship a product.
Building software with security engineers instead of software engineers is like trying to win a Formula One race with Fighter Jet pilots.
You also don't need to know much about many algorithms to write workaday programs. Most of the work is done by others. /head nod to Timsort.
You need to know how to avoid common pitfalls (i.e., yeah, if you have nested for-loops with millions of elements per level of iteration, maybe some alarm bells should go off).
Advanced algorithm knowledge is not needed for developing typical business software.
Bad security is a disaster for any public-facing software project. A product you didn't ship because you hired the wrong people can't be reverted, but neither can a data leak.
Or the company that goes bankrupt because you leaked highly sensitive data.
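A minimal sketch of that nested-loop alarm bell, in Python (the data is made up):

```python
orders = [{"customer_id": i % 1000} for i in range(100_000)]
vip_ids = list(range(500))

# Alarm bell: a linear scan inside a loop -- O(n * m) comparisons.
vip_orders = [o for o in orders if o["customer_id"] in vip_ids]

# Same result with a set lookup -- roughly O(n + m).
vip_id_set = set(vip_ids)
vip_orders = [o for o in orders if o["customer_id"] in vip_id_set]
```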
I know that security is harder and requires a wide grasp of theory and solid proficiency in practice, but you know:
a bad algo can almost always be rewritten, a leak cannot be reverted.
I'll use this opportunity to advertise stuff:
Try yourself against world-class hackers/security engineers:
https://ctftime.org/ctf-wtf/
>Capture the Flag (CTF) is a special kind of information security competition. There are three common types of CTFs: Jeopardy, attack-defence and mixed.
>Jeopardy-style CTFs have a set of questions (tasks) across a range of categories, for example Web, Forensics, Crypto, Binary or something else. A team gains points for every solved task, usually more points for more complicated tasks. The next task in a chain can be opened only after some team solves the previous task. Then, when the game time is over, the sum of points shows you the CTF winner. A famous example of such a CTF is the DEF CON CTF quals.
>Well, attack-defence is another interesting kind of competition. Here every team has its own network (or only one host) with vulnerable services. Your team usually has time for patching your services and developing exploits. Then the organizers connect the participants and the wargame starts! You should protect your own services for defence points and hack opponents for attack points. Historically this is the first type of CTF; everybody knows about DEF CON CTF - something like a World Cup of all other competitions.
>CTF games often touch on many other aspects of information security: cryptography, stego, binary analysis, reverse engineering, mobile security and others. Good teams generally have strong skills and experience in all these areas.