
Well, email is not that complicated, three letter agencies in the US have a history of hiding backdoors in technology, and one of the authors had his company destroyed by one of those agencies. If you're writing software for the tinfoil hat crowd and you're using high-level languages and existing libraries to do very basic things, you raise suspicion even if it isn't immediately justifiable... the suspicion that most major routers were routinely pwned wasn't justifiable in April, either.

Speed is almost never the primary concern with email, which is a world full of rate limits, throttles, spam filters, transient failures, tarpitting, blacklisting, inbox placement delays, etc.



>three letter agencies in the US have a history of hiding backdoors in technology

So the solution is to use the simplest tech stack possible and put the vulnerabilities in there yourself? If you're using this software, you have to trust somebody. I think most people would sooner trust that Java isn't backdoored than trust that this new C project doesn't have any buffer overflows or memory leaks.


I disagree. The more layers of abstraction you have, the more places there are for a backdoor to hide. Also, unless you're using OpenJDK, Java means downloading a binary blob that could be hiding any number of surprises, intentional or accidental.

With magma I've taken the approach of trying to limit my dependencies to the kernel and libc. Anything else I use is bundled and thus gets tested extensively for leaks, overflows, etc. That doesn't mean bugs don't exist, but it does mean that if they do, the source is there for you to inspect and fix.
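To illustrate the kind of testing I mean, here's a minimal sanitizer-based sketch (not Magma's actual test harness); building with AddressSanitizer makes leaks like this show up at exit:

    /* Minimal sketch, not Magma's actual test harness: build with
     *   cc -g -fsanitize=address leak_demo.c
     * and LeakSanitizer will report the missing free() when the program exits. */
    #include <stdlib.h>
    #include <string.h>

    int main(void) {
        char *copy = malloc(32);
        if (copy == NULL)
            return 1;
        strcpy(copy, "leaked buffer");
        return 0;  /* copy is never freed: reported as a direct leak */
    }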


Different strokes for different folks. I think intentional bugs are far worse than unintended ones, and scanning a C project for buffer overflows is relatively trivial compared to scanning tons of libraries for flaws so obfuscated they made it through review.

In my mind, technical analysis will never find 100% of the intended or unintended flaws in either system, but the DIY approach allows for trust in the authors, whereas a deep stack makes attribution muddy.


>scanning a C project for buffer overflows is relatively trivial compared to scanning tons of libraries for flaws so obfuscated they made it through review

You would have to scan the C project for buffer overflows and memory leaks in addition to scanning the tons of C libraries that it uses[0] for buffer overflows and memory leaks (note that openssl is in that list, among other massive libs). This is not even taking logic errors into account. There is simply too much to consider all at once, and being in C just makes it that much harder to have a reasonable sense of security.
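To make that concrete, here's the sort of thing a manual scan has to catch; this is a contrived sketch, not code from Magma or its bundled libraries, showing an off-by-one bound check that compiles cleanly and reads plausibly:

    /* Contrived sketch, not from Magma: the bound check looks safe but
     * leaves no room for the NUL terminator, so strcpy writes one byte
     * past the end of buf when strlen(value) == sizeof(buf). */
    #include <stdio.h>
    #include <string.h>

    static void copy_header(const char *value) {
        char buf[64];
        if (strlen(value) <= sizeof(buf)) {   /* should be < sizeof(buf) */
            strcpy(buf, value);
            printf("header: %s\n", buf);
        }
    }

    int main(void) {
        char long_value[65];
        memset(long_value, 'A', sizeof(long_value) - 1);
        long_value[sizeof(long_value) - 1] = '\0';
        copy_header(long_value);  /* 65 bytes written into a 64-byte buffer */
        return 0;
    }

Spotting that in one file is easy; spotting every instance of it across openssl and the rest of the bundled tree is the hard part.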

You're also implying that it's easier to spot intentionally backdoored C than intentionally backdoored Python, Go, Java, etc, but I have no reason to believe that that's true. Furthermore, the number of eyes that have been on those projects is far higher than the number of eyes that will ever grace Magma.

[0] https://github.com/lavabit/magma/tree/develop/lib/archives



