
And it's getting worse: SGX[1] allows third-party encrypted binary blobs to run on your CPU without being inspectable.

It's sold as a way to protect your secrets from malware. But it will more likely be used to run DRM code on the user's computer while treating the user as a hostile entity.

[1] https://software.intel.com/en-us/sgx



SGX has the potential to be amazing though. With it you can build "trusted" applications. For example, a Bitcoin mixer that's provably secure. (Well, as secure as trusting Intel and assuming users can't break the chip.)
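The "provably secure" part rests on remote attestation: before handing the mixer any coins, a client checks a hardware-signed measurement of the enclave code. Here's a toy Python sketch of that idea, using a hash for the measurement and an HMAC as a stand-in for Intel's attestation key; this is not the real SGX API, and real attestation uses asymmetric keys and Intel's verification service.

```python
import hashlib
import hmac

# Toy stand-in for the CPU's attestation key (real SGX: an asymmetric
# key rooted in hardware and Intel's infrastructure, never a shared secret).
CPU_ATTESTATION_KEY = b"toy-fused-hardware-key"

def measure(enclave_code: bytes) -> bytes:
    """Hash of the loaded enclave, analogous to SGX's MRENCLAVE."""
    return hashlib.sha256(enclave_code).digest()

def quote(enclave_code: bytes) -> tuple[bytes, bytes]:
    """The CPU produces a signed 'quote' over the enclave measurement."""
    m = measure(enclave_code)
    sig = hmac.new(CPU_ATTESTATION_KEY, m, hashlib.sha256).digest()
    return m, sig

def verify(expected_code: bytes, m: bytes, sig: bytes) -> bool:
    """Client side: trust the enclave only if the measurement matches the
    code the client audited and the signature checks out."""
    expected = measure(expected_code)
    good_sig = hmac.new(CPU_ATTESTATION_KEY, m, hashlib.sha256).digest()
    return hmac.compare_digest(m, expected) and hmac.compare_digest(sig, good_sig)

mixer_code = b"audited bitcoin mixer v1"
m, sig = quote(mixer_code)
print(verify(mixer_code, m, sig))           # True: the audited code is running
print(verify(b"backdoored mixer", m, sig))  # False: the code was swapped
```

The point is that the host OS can't forge the quote, so the client's trust reduces to trusting Intel's key and the audit of the enclave code.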


As it is right now you're giving up your liberty (debugging/inspecting/tinkering) in exchange for security.

That's generally a bad trade. Sadly one that many people are willing to make until it bites them.

It would be a lot better if secure mode had its own supervisor mode that worked through a master key that could be installed at boot time.


It's really a question of who you trust. There are lots of scenarios where you might trust the developer of a particular piece of software more than you trust the entire software stack running on your PC. This is especially true for a nontechnical / casual / grandma user, who has no hope of ever auditing or even having more than a vague idea of what's running on their computer at a given time, and probably is running (or at least needs to be assumed to be running) six different kinds of malware all the time. To someone like that, the PC itself is a hostile environment which they don't want to share certain information (e.g. their banking details, crypto keys, etc.) with. SGX allows you to ensure that.
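The "don't share secrets with the host" property comes from sealing: the enclave encrypts data under a key derived from a per-CPU secret plus its own measurement, so neither the OS nor a different program can unseal it. A rough stdlib-only Python sketch of the key-binding idea follows; real SGX uses the EGETKEY instruction and AES-GCM, not this toy XOR cipher.

```python
import hashlib
import hmac

# Toy stand-in for the per-CPU root secret (real SGX: fused into the
# silicon and never exposed to software).
CPU_ROOT_SECRET = b"per-cpu-fused-secret"

def sealing_key(enclave_code: bytes) -> bytes:
    """Key bound to this CPU *and* this exact enclave, like EGETKEY
    with the MRENCLAVE policy."""
    mrenclave = hashlib.sha256(enclave_code).digest()
    return hmac.new(CPU_ROOT_SECRET, mrenclave, hashlib.sha256).digest()

def seal(enclave_code: bytes, data: bytes) -> bytes:
    """Toy 'encryption': XOR with a hash-derived keystream, so applying
    it twice with the same key round-trips. Only here to show the binding."""
    key = sealing_key(enclave_code)
    stream = hashlib.sha256(key + b"stream").digest()
    return bytes(a ^ b for a, b in zip(data, stream))

banking_pin = b"1234"
blob = seal(b"banking enclave v1", banking_pin)

# The same enclave on the same CPU recovers the secret...
print(seal(b"banking enclave v1", blob) == banking_pin)  # True
# ...but malware (a different "enclave") derives a different key.
print(seal(b"malware.exe", blob) == banking_pin)         # False
```

So even if grandma's machine is running six kinds of malware, the sealed banking details are opaque to everything except the one measured enclave.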

If you take the premise that the PC is not safe and under your control, but is instead hostile and compromised, basically an outpost of the Internet in your house, then SGX and similar start to make sense. For many people, their computer is always going to be hostile; it was never "theirs" to begin with, so SGX doesn't really cost them anything, and the ability to let a single application basically force its way down to the hardware and elbow everything else in the stack out of the way is an improvement over having to trust the OS, browser, etc.

In a way it represents an abject failure on the part of the dominant OS developer (Microsoft) to produce a consumer computing platform that the average user can trust, as well as the failure of most other alternatives (e.g. DoD-style smartcards) to take off in the consumer market.


None of that justifies the absence of an ultimate user override.

There is no need to give up freedoms for that security.

Only when DRM comes into play can you really explain why the user is not in control here.


Last I spoke to Intel representatives, SGX enclaves couldn't be taken out of debug mode without having a contract and signing key from Intel.

In other words, those amazing applications appear to require Intel to approve the software author. Their keying mechanism allows revocation too.

I hope this changes or that the information I received was in error, but if not then SGX is mostly only useful for DRM. A shame because there really are a lot of productive applications.


What's their justification? I've heard that too, but it sounds too stupid to be true. "Here's an amazing feature built in to all our CPUs. Except you can't use it."


SGX is a major point and one I thought the linked post would deal with from its title.

From a user-owner point of view, I agree with your assessment of SGX. I imagine that, once it becomes used for things like media DRM and games copy protection, users will start turning it off in their BIOS, or managing the signing key whitelist manually. And I wouldn't blame them.

But from a user-not-owner point of view (ie, cloud computing), SGX offers the user more security, and a degree of protection against some cloud computing risks.



If you don't trust your cloud provider, I'm not sure whether SGX is the solution. Consider all those side-channel attacks.

It might provide an additional defense barrier, but you'd still want to run on trusted hardware. And if you have trusted hardware then it should be ok to use user-provided signing keys, just as you can do with secure boot configurations (at least the acceptable kind).

So as long as you're the exclusive user of a machine it should be sufficient to also hand your public key to the cloud provider so they can put it in the BIOS.

The only reason for SGX not to support that is DRM & Co.



