It's good to see that one of the positive effects of Heartbleed is that it motivated people to inspect OpenSSL's code, leading to more bugs being found and fixed.
This is supposed to be how open-source works; it's unfortunate that it had to take a huge vulnerability to cause this motivation.
It still doesn't get at the two bigger meta-points: TLS is a horribly overfeatured, design-by-committee standard, AND C makes it incredibly easy to screw up, so it requires zealous attention to a defensive coding style to introduce fewer flaws.
It's possible to defensively code in C, particularly with modern tooling (some of which lets you write provably secure code) and good process.
The fact of the matter is the OpenSSL people simply did not care about writing good code, and the open source community as a whole was happy to assign their most critical security features to a library widely known to be confusing and terrible and not even remotely properly analyzed or tested.
Like most people, I see Heartbleed as a process failure; unlike most people, I think the process failure goes far beyond TLS or OpenSSL or C.
> The fact of the matter is the OpenSSL people simply did not care about writing good code, and the open source community as a whole was happy to assign their most critical security features to a library widely known to be confusing and terrible and not even remotely properly analyzed or tested.
Sure, but open source developers have no obligation whatsoever to anyone. Considering they're not being paid in any way for their code, they can write whatever code they want and under whatever process they like. The problem IMHO is that OpenSSL was used for highly sensitive commercial uses (like Gmail, Amazon and others); I think responsibility should fall on companies who used the library without checking it first (Disclaimer: I'm not in any way associated with OpenSSL or any crypto library; this is just how I see things).
> I think responsibility should fall on companies who used the library without checking it first
Agreed. The OpenSSL license, like many others, has this very clear (if somewhat hard-to-read) disclaimer in it:
"IN NO EVENT SHALL THE OpenSSL PROJECT OR ITS CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE."
This is a standard disclaimer in software. It's just there so you can't sue. It doesn't mean OpenSSL was not intended to be a security library for everyone to use, because it obviously was.
I don't like the 'amateur defense' of open source. No, people are not legally liable for code; no, no open-source dev has any obligation to write good code; but OpenSSL is still garbage. As long as you're writing code and using other people's code, you might as well care about craftsmanship. If you don't care about craftsmanship, maybe this is the wrong approach to a critical security library.
Couldn't agree more. The only real difference between an engineer and a hobbyist is that one gets paid (sometimes hobbyists get paid too (Gittip, etc.)). Both need to recognize the engineering ethics of what they're involved in. I might need to turn in my hacker badge if I was ever issued one, but I don't support the completely anarchist viewpoint of "screw everyone, it doesn't matter because it's legal", because there is a greater ethical obligation to serve the greater good. It gets real real fast when a pseudo-democracy buys a 0day in a popular open source library to target dissenters and shut them up.
The problem is that almost every professional programmer goes by the slogan "You shouldn't implement cryptography yourself; rely on existing, well-tested libraries instead." This hinders competition in the crypto space. You definitely cannot say that the programmers of OpenSSL are not professionals: the majority was relying on their library, so it is definitely the best library out there. There is just no better alternative. If you want alternatives, you have to encourage amateur crypto programmers to implement competing products. But the whole programming community is doing the opposite by bashing people who start implementing their own crypto. Remember the debate about Telegram, where a highly talented team started their own library: the response from programmers was that they shouldn't do this and should instead use crappy old libraries that don't even fulfill their requirements.
I don't think the advice not to implement your own crypto stuff ever meant that no one should do it under any circumstances. It means that application programmers should not do it as part of their application. And that, I think, is good advice.
I certainly wouldn't want to see corporate devs at my bank implement some custom "secure" http communications protocol.
What we need is two or three well funded TLS/SSL implementations competing with each other, kind of like we have a handful of competing web browser implementations or C++ compiler implementations or operating systems, all done by experts in their field.
> Remember the debate about Telegram, where a highly talented team started their own library: the response from programmers was that they shouldn't do this and should instead use crappy old libraries that don't even fulfill their requirements.
The problem with Telegram wasn't about code, it was about algorithms and design. They designed their own crypto, which is a really bad thing. If they had written their own implementations of well-known, tested and proven crypto primitives/algorithms, everything would have been fine (well, mostly; you can still implement a good algorithm badly, but that's not what happened).
What? Come on. Get a grip on reality. The code is some of the most efficient and widely used. I think it's also the only code that implements the entire spec.
It's like someone who doesn't even know what you'd need to build a skyscraper, let alone actually tried to build one, commenting "that skyscraper is garbage".
> It's like someone who doesn't even know what you'd need to build a skyscraper, let alone actually tried to build one, commenting "that skyscraper is garbage".
It's nothing like that at all. A piece of code's popularity is no evidence of its quality. Otherwise we'd never have advanced beyond BASIC.
Implementing an entire spec is nothing to be proud of. Specs are often over-engineered. This is almost certainly the case for TLS. It's 103 pages long: http://tools.ietf.org/html/rfc5246
We should be questioning the foundations we've been relying on, now that a single vulnerability was able to compromise most of the internet, possibly since 2011.
> The code is some of the most efficient and widely used.
Efficient, yes. Throughput in general is not a problem.
Widely used, well... yes to that too. Although in this regard I would attribute a good chunk of it to inertia. When (common) use of cryptography started to take off around the time of the browser wars, there weren't too many options around. Web servers needed a way to serve protected content, and there was OpenSSL.
Not just available, but already supported by language bindings and relatively easy to set up thanks to mod_ssl.
First mover advantage + network effects + the fact that getting crypto right is really hard. If you wanted to unseat OpenSSL, you would need to overcome the fact that despite all the bad rap and the butt-ugly, leaky API, it was still getting the most exposure and field testing. Unlike some newcomer library, OpenSSL has the advantage that it is firmly embedded in the software ecosystem, and it has benefited from more than a decade of bugfixes. (I still remember when the timing attack for extracting RSA keys was discovered. The bug was pretty much everywhere, and not just in OpenSSL.)
The widespread use of the library boils down to a simple and somewhat depressing fact: OpenSSL is used everywhere because OpenSSL is used everywhere.
Replacing it with something better will be a long uphill battle. Any newcomer would need an aggressive test suite which covered EVERY known implementation bug. And a sane API. And a project with the political savvy to successfully lobby for its use through all avenues.
If the skyscraper collapses on me I feel entitled to say that it was garbage even if I'm not capable of building one.
But I am a half-decent C programmer and I saw the code that included the Heartbleed bug. This bug wasn't caused by a subtle issue. It wasn't even a logical error that a competent person could easily make in a confused moment. This was pure negligence, and negligence is rarely isolated to one part of a codebase.
PHP5 is widely used. The security history of that is less than awesome.
I think OpenSSL is better than I could do or most people could do, but again, it's still at the mercy of C AND a wide range of TLS and other features that seem to multiply like rabbits.
I think we really need a much, much simpler TLS standard going forward.
Dead wrong. When lives and money are on the line, there is obligation somewhere along the line back to the source. Maybe it stops elsewhere, maybe it doesn't. If folks are depending on open source for life-safety or risk the safety of innocent or targeted individuals, there is an obligation at some point to ensure systems are as secure as humanly possible. More importantly, regardless of circumstances, there is a fundamental duty of engineering ethics. [0] They're not just cliche words or some worthless university course, but the implications of how design decisions and construction of something affects the real world. That crappy commit to [project here] might be the difference between someone living and someone dying.
The page you linked to starts with: "Engineering ethics is the field of applied ethics and system of moral principles that apply to the practice of engineering. The field examines and sets the obligations by engineers to society, to their clients, and to the profession." I read this as applying only to professional engineering, where engineers do work for money or other forms of payment. I don't think freely-given (or almost freely, such as GPL code) hobbyist code should be held to these standards, for one reason: it's being given for free, with no expectations in return.
If the developer expects nothing from you in exchange for the code, you shouldn't expect anything more than getting the code as-is.
If the hobbyist knows that the code may be used in a critical environment, then the hobbyist should withdraw the code or make it very clear that its suitability for the task has not been tested. It's rather like asking someone for directions and having them mislead you on the basis that you're not paying for the directions.
Ethics always apply whether you are being paid or not.
> If the hobbyist knows that the code may be used in a critical environment, then the hobbyist should withdraw the code or make it very clear that its suitability for the task has not been tested.
It's already made very clear in the license that the code hasn't been tested, and comes with no guarantees. How much clearer than that can it be?
Also, you can't really withdraw code from the Internet. Even if you take down the original repository, there may be dozens of forks.
"professional engineer" is more than just being paid. It is a regulated term that connotes an ethical code, qualification, and acceptance of responsibility. It can be loosely compared to the bar association in the USA, or many guilds.
There are very few "professional software engineers" in the USA. Only a few schools, such as UIUC, have such a program.
Software engineer and software architect are meaningless terms. They are self applied yet imply some sort of parity with engineering or architecture. Insert your favorite "if x were built in the same way as software" joke here.
"The problem IMHO is that OpenSSL was used for highly sensitive commercial uses (like Gmail, Amazon and others)"
I think it's worth noting that uses can be highly sensitive without being commercial (or governmental). Not that this takes away from your point, which I agree with in broad strokes - responsibility lies first with those deploying; they are the only ones that have access to the full picture.
This is true. I would trust it more if a company with a lot of cash backed it (google, apple..) of course, leaving it open source. We don't want anything sneaky going on.
They might already make a lot of contributions that I don't know about.
I guess you can, but for some reason very few people do. Look at, e.g., the number of times browsers have had security problems. Are those teams made of idiots, or is it genuinely hard?
Yes, and 'most people' are distracting us from discussing the future: how can a refactoring of the OpenSSL codebase, aimed at improving understandability (which implies improved security), be organized?
There will need to be many rounds of small incremental improvements. Who could propose these steps? Who can lead the process, and how could contributions be vetted? This is a huge challenge, but unless people start thinking about it and proposals are made, it just won't happen and nothing will change. Just thinking about setting up such a project boggles the mind, which would like nothing better than to go back to bikeshedding.
I'm not sure the "community as a whole was happy" about trusting such critical features to OpenSSL. Were there any other FLOSS solutions that did everything OpenSSL does when a lot of this stuff was developed?
The revealed preference is that the industry doesn't really care about security. We follow "industry best practice" - that is, we do the minimum to stop users complaining - but users don't buy based on security, and so it's not worth putting marginal effort into.
As this is open-source software, the best thing you can donate is your time: If you know even just a little C, download the source and start reading through it. Heartbleed is somewhat more subtle, but you don't need to be much of an expert to catch things like the "goto fail;" bug. The more eyes looking at the code, the better.
and, ideally, a rigorous test suite; for something that is essentially implementing nested state machines, that probably involves an adversarial state machine aware tester (ie get to state 1; attempt to break; get to state 2, attempt to break, etc), a fuzzer, and a serious unit test suite
I get that part of this is a community problem, etc, but a lack of the above on a security product is amazing
One may consider Mozilla's NSS ("Network Security Services") library, which descends from Netscape (who invented SSL), as an alternative to OpenSSL. It has a compatible API layer (extra package), is used by Firefox, (Chrome), and OpenOffice, and has saner default settings. Check out the comparison tables: http://en.wikipedia.org/wiki/Comparison_of_TLS_Implementatio...
It seems to be even more complex and harder to use (when not using the compatibility layer) than OpenSSL, though. Ease of use for programmers is also very important, since a mistake by the application programmer can be as bad as an error in the SSL library.
Chrome is planning to move to OpenSSL for everything; that doesn't mean it uses OpenSSL "now all around".
According to the planning doc linked earlier this week, it might take some time to get OpenSSL into Chrome. The plan is for it to occur over 4 "milestones". I'm not sure how long each milestone will take, but suffice it to say, it's not "now".
And it was the other way around: only Chromium on Android used OpenSSL. Everywhere else (Linux, Mac desktop, Windows) it used NSS.
"Currently, Chromium supports two different SSL/cryptographic backends. On Windows, OS X, iOS, Linux, and Chromium OS, Chromium uses NSS. On Android, Chromium uses OpenSSL. "
Please check your sources carefully and don't post inaccurate information. Especially in the realm of crypto, it's important to get the details right.
It would be even better if you edited your post so future readers don't have to read my post at all in order to get the correct information.
I wish Theo and his colleagues would create a fork of OpenSSL that was up to OpenBSD/OpenSSH standards. It would be a huge level of work, but I'd happily donate to help fund it.
Yes, yes a million times. Wonder if this will prompt them to. For a rewrite to have any traction, it would need implementing by a team with some credibility, and they're probably one of the few teams with that credibility
Some anecdotes from recent discussion suggest it's not so easy to work with the original team. Also, if their standard for quality is what it is, it might be hard for another project to start changing things.
"Some anecdotes from recent discussion suggest it's not so easy to work with the original team."
'k, certainly forking is better than letting poor quality persist, and it may be better than reimplementing from scratch. I just don't like resorting to it when unnecessary.
"Also, if their standard for quality is what it is, it might be hard for another project to start changing things."
Possibly, though their "standard for quality" is probably at least a little malleable (particularly right now).
The scale of the project management involved would be tied to it, and that would take project buy in from the OpenSSL leadership far beyond anything I'd expect them to take, and also a much stricter leadership to keep things conformant to those higher standards. The current code base is mind bogglingly, shockingly bad, and the current leaders let that happen. Given Theo's leadership style on top of that I could only imagine a fork as even possible, he's slagged on the OpenSSL team already (justifiably).
I'd rather see a fork, honestly, to see more competent and disciplined standards and leadership, though that's a personal opinion. If the current project leaders stepped down and handed it off to people actually able to handle such an important software project, that'd be great, but I can't imagine it based on their past behavior.
Sure, I agree that it affects level of buy-in required, and that higher levels of buy-in are less likely. I don't think that it's clearly the dominant factor though - particularly in the face of an event like this, high levels of buy-in may very well be attainable. Maybe not, of course, but that would have more to do with intransigence on the part of the team - which would itself be the reason for the fork.
This is not the Heartbleed bug. I am not certain this is actually an immediately exploitable vulnerability (aka a sexy bug). What this looks like is an instance where the code calls free() on a buffer, then assumes the buffer is still available (uses it in some way). The patch seems to make it free the buffer only after it is empty, preventing this behavior. I think this is related to OpenSSL's issue with malloc and the way it handles memory allocation.
No, this was written about a couple of days ago. It frees the memory, and because OpenSSL uses a LIFO memory allocator, it can "safely" assume that whatever was in there is still in there. I believe that in order to exploit this you would need to exhaust its internal allocator (so that it requests more from the OS), and your payoff would be... having your connection dropped.
This was discovered in the course of someone's attempt to figure out why OpenSSL randomly drops connections when it's using a sane/OS-supplied allocator.
Not a silly question. Not all of us know which bug report is which sensationalised bug name. I know I don't.
Admittedly, some bugs deserve to be sensationalised.
To be fair, the linked patch is very sparse, and even so, not everyone reads C or knows what the Heartbleed code looked like. I can guess at what this is because some kind people provided a few good links about how badly memory management in OpenSSL is handled.
The patch also comes with an explanation of the bug which obviously does not align at all with either the short- or the long-winded explanations of Heartbleed.
It seems like someone should start a "SourceForge for security" project: a place that tracks and does high-quality static analysis of open source projects, and makes the reports readily available.
I think that this is a good sign. I know everyone has been saying that OpenSSL code is terrible (can't say I have looked myself), but if this is the worst bug found since heartbleed then maybe it is better than it appears.
This isn't a bug found in a thorough audit of the entire OpenSSL code base. This is a bug that was discovered while trying to understand why applications using OpenSSL would run into trouble after disabling the code that made it impossible to detect Heartbleed with OpenBSD's malloc safety features.
Beware of the chilling effects of collecting Google bounties, they will claim a reward is invalid if you've blogged about the vuln outside of their timetable.
I believe that most other implementations are unsuitable for displacing OpenSSL for licensing reasons alone. I didn't look at every other implementation since I was looking for one that supported a specific mode, but the majority of them are GPL with the option to license for proprietary code. GnuTLS is LGPL, at least.
Wikipedia has a pretty good list of possible implementations:
Because now your command isn't /path/-independent. The original assumed you downloaded the patch to your current directory, so you could just copy & paste the command. Your example requires me to modify your line, making this "process" more error-prone.