
Right - I think when the equivalent of Copilot shows up in incident response, the security employment market changes for good. When a "cleared" Copilot (for government-supporting work) shows up, it changes totally.

If you don’t operate in the approach I describe, or are not just an all-around tech expert who likes security for some reason, the stable high-paying market is around digital forensics/incident response firms. Those folks have a lock because there’s a small group who knows assembly and OSes across multiple systems very well and knows it from a security context. That's tribal knowledge an LLM can do soon enough, since at the end of the day it’s just parsing opcodes and correlating across log sources. Scary stuff. I'm glad I’m past entry level, and I’m no fool thinking that I don’t have to worry too.
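For illustration, here's a toy sketch (entirely hypothetical, not a real disassembler) of the "just parsing opcodes" framing: a lookup table mapping a few single-byte x86 opcodes to mnemonics. Real tooling like Ghidra handles the hard parts (variable-length decoding, operands, multiple ISAs); this only shows how mechanical the base layer is.

```python
# Hypothetical sketch: map a handful of single-byte x86 opcodes to
# mnemonics. Unknown bytes fall through to a raw "db" directive.
SINGLE_BYTE_X86 = {
    0x90: "nop",
    0xC3: "ret",
    0xCC: "int3",      # common breakpoint byte
    0x55: "push ebp",
}

def sketch_disasm(code: bytes) -> list[str]:
    """Naive one-byte-at-a-time 'disassembly' (illustration only)."""
    return [SINGLE_BYTE_X86.get(b, f"db 0x{b:02x}") for b in code]

print(sketch_disasm(bytes([0x55, 0x90, 0xC3])))
# ['push ebp', 'nop', 'ret']
```

The point isn't that this is adequate, only that the mapping itself is rote; the expertise is in knowing which of thousands of such mappings matters in a given incident.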



I'm not sure I see this as a reality anytime soon.

  > Those folks have a lock because there’s a small group who knows assembly and OSes across multiple systems very well and knows it from a security context.

There are two parts to this. First, for some of the businesses in that arena, I'm sure that if they could speed up analysis to take on more client jobs with less labor, they would have done so already. Second, what output are you going to provide that wouldn't need those very same people to decipher, validate, or explain what is going on?

As an example: if you get hacked and make a cyber insurance claim, you are going to have to sufficiently explain to the insurance company what happened (so they can try to get out of paying you), and you won't be able to say "Xyz program says it found malware, just trust what it says." If people don't understand how a result was generated, they could end up implementing fixes that don't solve the problem, because they are depending on the LLM/decision tree to tell them what the problem is. All these models can be gamed, just like humans.

I'm not quite sure I agree that a better LLM is what has been keeping people from implementing pipeline logic that produces actionable, correlated security alerts. Maybe it does improve things, but my assumption is that, much as we still have software developers, any automation will just create a new field of support or inquiry that will need people to parse it.
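For what it's worth, the kind of "pipeline logic" in question predates LLMs entirely. Here's a minimal sketch (all field names and log shapes hypothetical) that correlates firewall denies with failed logins from the same source IP inside a time window — the sort of rule a SIEM runs today:

```python
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=5)

def correlate(firewall_events, auth_events, window=WINDOW):
    """Pair firewall denies with failed logins from the same source IP
    occurring within `window` of each other (toy correlation rule)."""
    alerts = []
    for fw in firewall_events:
        for auth in auth_events:
            if fw["src_ip"] != auth["src_ip"]:
                continue
            dt = abs(datetime.fromisoformat(fw["ts"]) -
                     datetime.fromisoformat(auth["ts"]))
            if dt <= window:
                alerts.append({
                    "src_ip": fw["src_ip"],
                    "summary": "firewall deny + failed login from same IP",
                })
    return alerts

# Hypothetical sample events from two log sources.
firewall = [
    {"ts": "2024-01-01T10:00:00", "src_ip": "203.0.113.7", "action": "deny"},
]
auth = [
    {"ts": "2024-01-01T10:03:00", "src_ip": "203.0.113.7", "event": "failed_login"},
    {"ts": "2024-01-01T10:03:00", "src_ip": "198.51.100.2", "event": "failed_login"},
]

print(correlate(firewall, auth))
```

The hard part was never writing this loop; it's knowing which correlations are worth alerting on and what the analyst should do with the result — which is exactly the judgment question raised above.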


I think the impact of LLMs in DFIR will come down to:

- the speed at which actionable insights can be generated (versus needing a highly paid engineer poking through Ghidra and doing cross-log correlation), and

- a reduced need for highly paid DFIR engineers as a result.



