> Passkeys will be importable and exportable, cross-device, and across passkey managers.
Yes, but how will the users be able to do this in a way that they can understand and that is secure? The usability of stored secrets is a big outstanding issue at this point. You can't just blandly state you are going to solve the problem without any indication of how it is to be done. Just because something is technically possible does not mean that most people will be able to do it in practice.
Start with protocols for handling the data well. Then figure out what flows work.
Start at the bottom with what already is, with what is concrete & true, and build forward from that.
I want us to be careful & considerate, but I also find assaults on progress like this impossible to accept. We can keep battering efforts to advance forever, create ever more gates, raise ever more concerns about any topic forever. I used to be able to process this kind of cross-examination as tempered concern & moderation, but increasingly I feel like grace is lost, that, whatever the intent, we are suffering from a large amassed anti-will, a will against.
Having the technical standards & options to enable these capabilities doesn't imply that every system has to offer them. Most should, & there should be good options for managing them (and the concern here is valid that the options indeed be good & usable). But security keys have been a miserable failure, in huge part because users have never been given options to manage their contents, because the device has been the sole authority & holder. It's non-negotiable in my mind that exporting/importing files needs to happen ASAP, with low barriers. We keep roadblocking progress with concern that some users will make awful mistakes.
If we had forever we could perhaps make your perfectly safe & secure utopia, where power users and the lowliest end user both happily & safely operate, but we do not have forever. Given this, we ought to create powerful technical underpinnings that make user capabilities possible & evolvable, then let products iterate & hone how they can best marshal & expose subsets of those capabilities in ways catered to their users. What we cannot do is gate progress on imagined self-threatening users; what we cannot do is forever pick the lowest common denominator of options, out of fear.
> It’s something that’s being defined and designed
In other words, it’s not finalised. Whatever solution is implemented will depend not only on Apple but on the other players as well. It would make no sense to share those details on Mastodon, they’ll save them for WWDC or a press release like they do with everything else.
I think this addresses most of my concerns regarding passkeys.
The biggest remaining one is still that fingerprint locked information is not protected under the fifth amendment in the US. Even PIN-protected information/logins have been ruled as both protected and not protected by different courts. The "key vs combination" debate hasn't been decided one way or another. I can imagine a government push to not count fingerprint locked passkeys as equivalent to passwords, but that's not a weakness of passkey technology.
I am also interested in seeing how phone-secured passkeys will work; iOS currently forces the use of Keychain, excluding users who opt for a third-party password manager like Bitwarden. I also wonder if phones will be able to be used to log in without enabling Bluetooth.
On iPhones, pressing the home/lock button five times in quick succession or pressing and holding the lock button and one of the volume buttons for a couple of seconds will disable biometric unlock for the next use, i.e. the passcode will be required.
> The biggest remaining one is still that fingerprint locked information is not protected under the fifth amendment in the US.
Is this comment more about the ongoing existential concern for everyone, or wondering if this is a specific concern for yourself?
The Apple implementation protects with either a biometric or passcode; you should be able to still use it if you disable the biometric features of the phone.
A third-party implementation might instead support requiring a PIN rather than accepting a passcode for their users with such concerns.
This is very close to the existing situation with password managers.
> I am also interested in seeing how phone-secured passkeys will work; iOS currently forces the use of Keychain, excluding users who opt for a third-party password manager like Bitwarden.
The system implementation uses iCloud Keychain, but a password manager can ship a web extension to supplant the existing browser API with its own implementation.
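Roughly, that works by the extension injecting a page script that wraps the WebAuthn entry points. A minimal sketch, where askVault is a hypothetical stand-in for whatever messaging a real extension actually uses:

    // Minimal sketch of a page script wrapping the WebAuthn entry points (TypeScript).
    // askVault is hypothetical: a real extension would message its background
    // page / native vault here and return a PublicKeyCredential.
    async function askVault(kind: "create" | "get", publicKey: object): Promise<Credential | null> {
      window.postMessage({ source: "my-vault", kind, publicKey }, "*"); // placeholder hand-off
      return null;                                                      // placeholder response
    }

    // Keep references to the platform implementation so everything that
    // isn't a passkey request still goes through untouched.
    const nativeCreate = navigator.credentials.create.bind(navigator.credentials);
    const nativeGet = navigator.credentials.get.bind(navigator.credentials);

    navigator.credentials.create = async (options?: CredentialCreationOptions) =>
      options?.publicKey ? askVault("create", options.publicKey) : nativeCreate(options);

    navigator.credentials.get = async (options?: CredentialRequestOptions) =>
      options?.publicKey ? askVault("get", options.publicKey) : nativeGet(options);

The point is only that the site keeps calling the standard API; which vault answers is up to whatever gets injected into the page.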
On Android, there is a system-level API in beta that allows third-party password managers to offer passwords and passkeys to native apps and browsers.
> I also wonder if phones will be able to be used to log in without enabling Bluetooth.
There are complications to supporting USB on multi-purpose devices, and unfortunately nowhere near enough NFC reader adoption for such support to be prioritized. NFC also has some UX complications.
The Bluetooth support is used for proximity checks; a pure network-based approach would be open to lower-sophistication MITM attacks.
> The biggest remaining one is still that fingerprint locked information is not protected under the fifth amendment in the US.
Fortunately, that's not a threat vector that is a real concern for most people. If it is for you, then yes -- solely protecting access through biometrics is probably inadequate.
It's not, until it is. There have been cases of everyday people being forced to turn over their passwords at borders when entering or leaving the country. I fundamentally disagree about whether or not this is a real concern for most people. I have no problem with people using biometrics; I understand people will always choose the most convenient option, and that's why it's important. I think the iOS "cop mode" feature to disable the fingerprint with 5 presses of the power button shows companies are concerned about it too.
Neat, but that also means their value as MFA is undermined.
It's no longer "something you have" when it can freely be copied, including by a casual passerby or a virus. You're basically reducing it to a password stored in a file called "MY PASSWORDS" on your desktop.
Great for user friendliness, terrible for security. So now when you use Passkeys we'll need to add another 2FA instance to properly secure your account...
An SSH private key isn't "something you have" - it's not a physical object, you can make copies of it.
To me it falls into "something you know". You might not be able to type it from memory into a file but in practical terms it's no different to a password, it just usually happens to reside on disk.
A physical key to your front door is a physical object and something you own, but you can still make copies of it. Someone else can even make a copy from a photo of the key. Google Authenticator was (until recently) the only app I know of that made it very hard to copy the TOTP secret to another device; lots of other TOTP apps made it a feature to sync/back up your keys.
I can't think of a lot of security factors (besides perhaps biometrics) where copying is really impossible. It's just not economical.
Very little is "impossible". The point of developing a good second factor is to have a design that makes it economically infeasible to copy. Like a good HSM.
This seems like a pretty significant semantic shift from what multi-factor originally meant. I think the real problem here is that the original idea of factors came from an environment where a person is being authorized for physical access to a controlled facility and the factors are being checked by a trusted deputy of the relying party. The different factors really mean something when you are being checked at a facility gate.
What you "know" is held in your mind and up to your will whether to divulge or not. You might not have a choice if put under duress.
What you "have" is possessed on your person and you may choose to present it but you may be forced to reveal it in a search. You may have had a choice as to which possessions to carry with you to the facility or which to hide and disavow.
What you "are" are passively observable characteristics of yourself that the facility may choose to measure and which may be difficult for an imposter to replicate. You would probably be unable to withhold these characteristics.
Once we water this down into a remote access to a website, these factors become vague analogies. In the end, the website is only able to observe information from the local user agent. How much can the website deputize end-user equipment and trust it to make any of these distinctions on its behalf?
What you know and what you have start to blur together as different grades of information. There may be no real way for the website to distinguish whether it came from your mind or from some storage or communication device in your possession.
What you are and what you have start to blur together as well. There may be no way for the website to distinguish how you protected your possessions (i.e. via biometrics or PIN), or whether you involved other parties who helped with this.
It is something you have: the machine where the file is. It's only something you know if the only way for me to steal it from you is to threaten you into saying it. If I can just yank your open laptop from your lap in a cafe, it's something you have.
If you are able to copy it off the laptop it becomes something you know. One important property of something you have is that if you, the owner of the account, have that thing in your possession, then no one can hack you. If someone is able to copy it then that invariant is not true.
Writing your password on a piece of paper doesn't turn that password into something you have.
It literally does. Now whoever _has_ that piece of paper can get to your account. A physical piece of paper with a password written on it is no different from a physical key. Unless you're going to say that if you memorise the indentations on your house key then your key is "something you know" too.
Correct, unless you store it solely in an HSM like a YubiKey.
One of the critical parts of "something you have" is that it is something only you have. If you can copy it, there is no longer a guarantee that you are the sole person possessing it.
This is also why Google Authenticator did not allow export for a very long time. If you want multiple TOTP tokens, you should enroll them separately, so that it is at all times possible to determine which exact token is being used to authenticate and revoke them individually if required.
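A rough sketch of why that matters, with standard RFC 6238 parameters (SHA-1, 6 digits, 30-second step, no clock-skew window); the per-device map is just for illustration:

    import { createHmac } from "node:crypto";

    // Standard RFC 6238 TOTP for one secret.
    function totp(secret: Buffer, now = Date.now()): string {
      const counter = BigInt(Math.floor(now / 1000 / 30));
      const msg = Buffer.alloc(8);
      msg.writeBigUInt64BE(counter);
      const hmac = createHmac("sha1", secret).update(msg).digest();
      const offset = hmac[hmac.length - 1] & 0x0f;                       // dynamic truncation (RFC 4226)
      const code = (hmac.readUInt32BE(offset) & 0x7fffffff) % 1_000_000; // 6 digits
      return code.toString().padStart(6, "0");
    }

    // With one secret enrolled per device, a valid code identifies which device
    // produced it, and that one secret can be revoked without touching the rest.
    function whichDevice(code: string, enrolled: Map<string, Buffer>): string | null {
      for (const [device, secret] of enrolled) {
        if (totp(secret) === code) return device;
      }
      return null;
    }

If every device instead shares one exported secret, a valid code tells you nothing about which copy was used.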
It's usually considered something you know, since it's used as a password replacement (first factor). You can create schemes where it could be considered a second factor (used together with a password, mandatory smartcard/fob storage, etc.), but that's not the regular case.
At the same time, it's easy to turn password+TOTP into a single factor by storing both in the same password manager… the theoretical MFA definitions are not the most practical.
I wrote a 1-page article about this in a small, obscure, local-language magazine. Its title, translated into English (OMG, I should have also published in English), was "Re-factor the factors".
I think the 3-factor authentication (3FA) model of "know, have and are" is up for a revision. Knowing that all models are wrong, I hope my suggestion is at least useful. These were really rough thoughts, and I was hoping to spark some discussion, but unfortunately that never happened.
Anyway, I think the 3FA model has several problems.
1) There is no clear category for recent "ambient" auth factors that are getting very popular, like when one logs into a website on a browser whose fingerprint is unknown and "step-up" authentication is required.
2) Authentication is proving that you are who you say you are (you prove an identity claim); a factor called "are" makes the cognitive shortcut of equating the inherence factor with the identity proof very, very tempting (and I think many people in the field have fallen for it, given the prevalence of biometrics as sole authentication).
3) Behavioural biometrics like keystroke dynamics do not neatly fall into one of the three categories (they are a little bit of knowledge and a little bit of "inherence").
I also take major offense at the statement that follows from it, that I AM my fingerprint, or I AM my iris. NO, I can _prove_ who I am by presenting my fingerprint or iris.
To fix this, in the article I suggested moving to a 2-dimensional model of authentication (2DA), that is reminiscent of but not quite the same as the 3FA model.
The two dimensions were:
Dimension 1: Knowledge vs Possession. A factor sits somewhere on this dimension, and my thought was that which is which could be tested by the colloquial use of "know". I can always say I have a password, or I have a way of using my phone, but additionally I "know" my password, and I "know" my way of using my phone (even if implicitly). On the other hand, I have a smartcard, but I do not know it, nor do I know my fingerprint.
The SSH private key would in my opinion fall in the "have" part of this dimension.
Dimension 2: Transferability. The degree to which the factor is transferable to someone else, to which ownership of the factor can be "bestowed" on someone else. Smartcards are highly transferable, as are passwords; however, fingerprints are generally not expected to be transferable, and neither are the way I use my phone or the browser and location I log in from.
Naturally, 4 relatively distinct quadrants in this space arise, which I tried to label independently with a single word:
possession, high transferability: carry
possession, low transferability: show
knowledge, high transferability: tell
knowledge, low transferability: do.
So, the catchy phrase for the 2DA model would be, things that you "carry, show, tell and do".
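If it helps, the same idea as a toy data structure, using the examples from above:

    // A toy encoding of the 2DA idea: every factor gets a position on both axes.
    type Dimension1 = "knowledge" | "possession";
    type Transferability = "high" | "low";

    interface Factor2DA { name: string; d1: Dimension1; transfer: Transferability; }

    // carry / show / tell / do, per the quadrant labels above.
    const examples: Factor2DA[] = [
      { name: "smartcard",          d1: "possession", transfer: "high" }, // carry
      { name: "fingerprint",        d1: "possession", transfer: "low"  }, // show
      { name: "password",           d1: "knowledge",  transfer: "high" }, // tell
      { name: "keystroke dynamics", d1: "knowledge",  transfer: "low"  }, // do
    ];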
If you were using webauthn as a second factor, I believe the passkey changes implemented by Apple and others mean your device-bound webauthn keys are now synchronized to their cloud services.
That hurts, and my enterprise had to halt its rollout of webauthn for MFA while we figure it out.
The problem is that U2F/FIDO2/Webauthn was 100% used as a second factor. Passkeys are pretty much just a variant of Webauthn, so you are now replacing token+password with solely a token.
Actually, I'm not even sure your memory is safer. You would have to remember hundreds if not thousands of passwords, including remembering which domain they belong to, as well as be able to generate unique ones for every new domain you encounter for the rest of your life.
Yours may be, but that isn't true for the majority (dare I say >95%) of users. The problem is that people tend to simplify when forced to remember things, or re-use. A Passkey, even handled in the manner described in this post, is better for most people.
> It's no longer "something you have" when it can freely be copied, including by a casual passerby or a virus
Yup, it's very sad. For regular people it's better than passwords, but it's a complete step backwards compared to actual physical secrets stored behind an HSM (like inside an actual physical FIDO2 device).
It is pure insanity that FIDO2/webauthn was, at last, getting somewhere, that we had something secure gaining traction, and that the standard has been embraced and extended by the likes of Microsoft (and Apple and Google) only to then greatly lower its security.
While still pretending it's all still FIDO2/webauthn.
And as I understand it, it's really bad because companies using passkeys will be able to force their users onto the less secure form of passkeys by actually refusing passkeys from vendors whose secrets never leave physical hardware.
For a start, it is pretty obvious that the less secure Google/Apple/Microsoft "password stored in a file on your desktop" approach you mention (and backed up in the (i)Cloud: what a concept for security keys!) will be accepted everywhere, while it's highly likely that at some point, "because convenience", they'll cut off passkey vendors offering actually secure devices.
And it's likely that at the same time we'll have "security experts" explaining it's a godsend security-wise.
> So now when you use Passkeys we'll need to add another 2FA instance to properly secure your account...
As long as it's actually possible to do so.
I fear that at some point passkeys will be the only auth option and, by the way, only "software passkeys backupable to the cloud" will be accepted.
It's just sad.
When you think that the original implementation of FIDO allowed users to detect if their security key had been cloned, thanks to the monotonic counter (I know for a fact because I verified this myself, with cloned security keys, both before and after Google removed that monotonic counter)... But since then Google (and others) have removed that check (so it's already impossible to detect if your key has been cloned).
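For context, that check lives on the relying party side and is tiny. A rough sketch of what it looks like against the WebAuthn authenticator data layout (the function names are mine, not any library's):

    // Authenticator data layout: rpIdHash (32 bytes) | flags (1 byte) | signCount (4 bytes, big-endian) | ...
    function readSignCount(authData: Uint8Array): number {
      const view = new DataView(authData.buffer, authData.byteOffset, authData.byteLength);
      return view.getUint32(33, false); // big-endian
    }

    // A counter that fails to strictly increase suggests a cloned authenticator.
    function counterLooksSane(storedCount: number, authData: Uint8Array): boolean {
      const received = readSignCount(authData);
      if (received === 0 && storedCount === 0) return true; // authenticator doesn't implement the counter
      return received > storedCount;
    }

Synced passkeys typically report a counter of 0, which is exactly the case where this signal disappears.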
This all makes me think there's more to it than "just" Apple/Microsoft/Google. I think it's highly likely some three-letter agency was involved in the "enhancement" of the standard.
The point is that if you can export the key from the HSM, then it no longer requires that HSM, as the exported key and MYPASSWORDS.txt can be bundled together. Knowing the contents of that bundle is a single factor of authentication.
There is no notion that "something is secure". It literally doesn't exist. There is a notion that something is secure under a specific threat model, and there are many threat models for which keys are secure and passwords are not.
When though? Because at the minute all I see is vendor lock-in and no actual advantages over my password manager, other than not having to trust the security practices of a company.
I'm definitely not up for tying anything to a device or biometrics.
At the moment, though, passwords are good enough for most people who aren't specifically interesting targets, so I'm not super concerned or holding my breath. I just don't see people who use password managers and any kind of 2FA, even SMS, getting hacked in real life.
I expect the main value of passkeys will just be that people will be forced to transition from memorizing to letting the computer manage credentials.
Google Authenticator added exporting a while ago, although it is annoying on some platforms except in the case where you are exporting to another device running Google Authenticator.
That’s because it exports via QR codes containing information for a batch of accounts. On Apple devices you can screenshot that QR code, but I’m told that on Android it disables screenshots so you need another device to save a copy (or you use a photocopier to make a paper copy).
Once you have the QR code, use a QR code reader to extract the data. There is open source software that can turn the batch data from the QR code into individual account data and individual QR codes so you can use them with other TOTP software.
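As a sketch of what those tools do, assuming the scanned string is the usual otpauth-migration://offline?data=... form (the protobuf decoding of the payload itself is omitted, and the account fields below are just illustrative):

    // Pull the base64 payload out of a scanned Google Authenticator export QR code.
    // The payload is a protobuf "migration" message; decoding it needs a protobuf
    // parser plus the community-documented schema, which is not shown here.
    function extractMigrationPayload(scanned: string): Uint8Array {
      const url = new URL(scanned); // e.g. otpauth-migration://offline?data=...
      const data = url.searchParams.get("data");
      if (!data) throw new Error("not an otpauth-migration URI");
      const base64 = data.replace(/ /g, "+"); // searchParams turns literal '+' into spaces; undo that
      return Uint8Array.from(atob(base64), (c) => c.charCodeAt(0));
    }

    // Once an account is decoded, it can be turned back into a standard per-account
    // Key URI that any TOTP app (or QR code generator) understands.
    interface OtpAccount { issuer: string; name: string; base32Secret: string; }

    function toOtpauthUri(a: OtpAccount): string {
      const label = encodeURIComponent(`${a.issuer}:${a.name}`);
      return `otpauth://totp/${label}?secret=${a.base32Secret}&issuer=${encodeURIComponent(a.issuer)}`;
    }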
That's just how those vendors implemented it. There's nothing stopping you from using the TOTP apps that support the import/export functionality, as it's just strings, no secret sauce.
Not entirely sure why this is getting downvoted. Passkey is the only non-phishable authentication mechanism that works out-of-the-box on billions of devices. That is huge value from a security perspective.
So I was trying to figure this out, but basically passkeys are a private-public key pair, as opposed to passwords, which are a shared secret? And I guess authenticating with a passkey is basically doing a signature or some other proof that doesn't expose the private key.
Since passkeys are generated using an algorithm, there's no risk of people reusing them or choosing weak ones (and it will force people to use password managers).
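Spelling my understanding out with the browser API (the challenge and user ID below are stand-in random bytes; in reality the server supplies and later verifies them):

    // Stand-ins for values the relying party's server would normally supply.
    const challenge = crypto.getRandomValues(new Uint8Array(32));
    const userId = crypto.getRandomValues(new Uint8Array(16));

    // Registration: the authenticator generates a key pair and hands back only the
    // public key; the private key never leaves the authenticator / passkey manager.
    const created = await navigator.credentials.create({
      publicKey: {
        challenge,
        rp: { id: "example.com", name: "Example" },
        user: { id: userId, name: "alice@example.com", displayName: "Alice" },
        pubKeyCredParams: [{ type: "public-key", alg: -7 }], // ES256
      },
    });

    // Authentication: the authenticator signs a fresh challenge with the private key,
    // and the server checks the signature against the stored public key.
    const assertion = await navigator.credentials.get({
      publicKey: {
        challenge: crypto.getRandomValues(new Uint8Array(32)),
        rpId: "example.com",
        // allowCredentials can be omitted for discoverable credentials (passkeys).
      },
    });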
And at the root you still have a shared secret password/pin to unlock your password manager or device containing the password manager.
"that's what she said"...
It was good to allay the fears of users so that they switched, but big tech doesn't really have to do it and follow through on their word once it becomes common.
Just look at Google Authenticator, which is very simple but did not have the feature for years, until recently, when they were afraid that bad communication about it would jeopardize their passkey initiative!
OK so it will be _possible_, but will Google, Apple, Microsoft, et al actually implement this functionality? There's really no guarantee of that. I certainly wouldn't bet on it.
According to this [0] post by a Mozilla dev, no sooner than 8 versions from now.
> Linux
AFAIK, the furthest effort so far is the implementation of a FIDO2 portal in GNOME [1].
The field is currently dominated by major centralised players (Google, MS, Apple), so it really depends on the particular Linux community to introduce any FIDO2 support. For example, Chrome has no plans for Linux passkey support. [2]