Summary: Germany is moving away from a centralized approach (PEPP-PT = Pan-European Privacy-Preserving Proximity Tracing) to a decentralized one (like DP-3T = Decentralised Privacy-Preserving Proximity Tracing).
(But is the Apple-Google protocol identical to DP-3T, or does the headline mention “Apple-Google approach” just for simplicity? I heard there are some minor differences…)
There are some differences. DP-3T proposes two different systems, one with linkable tokens and one without. The first system is similar to Apple-Google in the sense that your tokens for a day are derived from a key which is uploaded to a central distribution server when you test positive. In the second system the tokens are not linkable, and they propose the use of a Cuckoo Filter to reduce the space complexity. A Cuckoo Filter is a probabilistic data structure that can tell you whether an item is definitely not in a set or might be in it. As a result there are some false positives.
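To make the "linkable" idea concrete, here is a minimal sketch assuming a simple HMAC-based derivation; the real specs use their own key schedules and token sizes, and the names and parameters below are made up for illustration:

    import hashlib, hmac, os

    # Sketch: derive a day's ephemeral tokens from one daily key. If the key is
    # published after a positive test, anyone can recompute (and thus link) all
    # of that day's tokens.
    def daily_tokens(daily_key, tokens_per_day=96):
        stream = b""
        counter = 0
        while len(stream) < 16 * tokens_per_day:
            stream += hmac.new(daily_key, counter.to_bytes(4, "big"), hashlib.sha256).digest()
            counter += 1
        return [stream[i * 16:(i + 1) * 16] for i in range(tokens_per_day)]

    daily_key = os.urandom(32)        # stays on the phone unless the user tests positive
    tokens = daily_tokens(daily_key)  # broadcast one 16-byte token per ~15-minute slot

In the unlinkable variant each token would instead be generated independently, so there is no single key that ties a day's tokens together.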
DP-3T also explains how records are uploaded to a central server and the interactions with health-care providers. Apple-Google omit this part and focus on proximity data collection.
I'd like to mention the TCN Protocol here (https://github.com/TCNCoalition/TCN), another very similar specification. I bring it up because the readme goes into quite a bit of (easily understandable!) detail regarding the trust assumptions of such a protocol and associated rationale.
Ultimately I think Apple and Google are right to omit record upload and authentication concerns from the base protocol. The low level implementation should be as interoperable and generalized as possible in order to facilitate immediate uptake and maximum reusability. Higher level concerns such as who to trust and how to interact with users can be handled by the various app implementations.
It seems unlikely that anyone will deploy a version of DP-3T that differs significantly from the approach built into Android and iOS, due to the need for apps to obtain special permissions to run in the background. So the alternative variants that go under that brand are probably a dead letter.
"Those privacy principles are not going to change," said Gary Davis, Apple's global director of privacy. "They are fundamental privacy principles that are needed to make this work."
Can an app not simply ask the user for, and subsequently be granted, the necessary permissions? At least on Android I had understood it to work that way in theory, although in practice perhaps it doesn't always behave ideally (https://support.google.com/pixelphone/thread/6068458?hl=en).
Edit: I see now that it's specifically iOS that doesn't provide for granting the required permissions. I find such lack of control over a device that one supposedly owns highly concerning at best.
Guess: many people would tap OK without thinking about it (they don't understand or care what "background" means), then would be unhappy that their battery drains.
It seems to me that restricting freedoms to combat ignorance is unlikely to have a desirable outcome. To your specific example, I suspect that bluntly warning that granting the permission has the potential to lead to significant loss of battery life would get even the most technically illiterate user's attention.
More generally, how are background streaming services supposed to work on the iPhone? Does Apple have to individually approve every app that wishes to do so (e.g. Spotify, Pandora, ...)?
No, they would just click anything that makes that dialog that stands between them and their goal of installing the app go away without reading the text and then be unhappy that their battery drains. Any design that relies on a confirmation dialog is fundamentally broken. Even technically competent users will read most confirmation dialogs as „Let me do what I want [Abort] [OK]“ no matter what you actually write there.
In many situations we may not have better solutions, but that doesn't change the fact that this is terrible.
> Any design that relies on a confirmation dialog is fundamentally broken
I'm having trouble interpreting this in any way other than a claim that granting users control over their devices is a fundamentally broken idea. I won't dispute that users often choose to do dumb things in practice, but it seems the two of us have a fundamental disagreement in our underlying worldviews.
> Even technically competent users ... no matter what you actually write
I'd argue that such users aren't actually technically competent then, despite the high opinion they might have of themselves. On the other hand, perhaps the users are technically competent and it's actually the relevant software developers that have done a poor job of communicating? If an actual technically competent user is experiencing significant difficulties using a program, then perhaps the program doesn't work as well as the developers thought it did.
The issue is that because of some "bad" users you restrict all users. When I design a prompt dialog that gates a dangerous operation, I make the user type something; you could even have the user type a different phrase each time, so you can confirm they actually read the prompt text. So there are technical solutions. IMO the claim that Apple is taking away your freedom to protect a subgroup of users is not the reality; the reality is that restrictions make Apple more money. If lifting the restrictions made them more money, you would see a lot of praise for how smart the tech behind Apple's dialog prompts that let you lift restrictions is.
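As a hedged example of what I mean by a technical solution, here is a minimal command-line sketch of a type-to-confirm gate; the action name and wording are just placeholders:

    import secrets

    # Sketch of the "type to confirm" pattern: the user must retype a freshly
    # generated phrase, so reflexively hitting OK/Enter is not enough.
    def confirm_dangerous_action(action):
        challenge = "allow " + action + " " + secrets.token_hex(2)
        typed = input('Type "' + challenge + '" to continue (anything else cancels): ')
        return typed.strip() == challenge

    if confirm_dangerous_action("background Bluetooth scanning"):
        print("Permission granted.")
    else:
        print("Cancelled.")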
Could this be a feature rather than a bug? It could help keep tracing statistically very effective while still giving plausible deniability for personal privacy.
This is definitely by design. The Cuckoo Filter relies on hashes of the input, so there is a chance of collisions. My understanding is that a Cuckoo Filter is a more recent alternative to a Bloom Filter, if you're familiar with those: the same approximate-membership idea, but it also supports deleting items.
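For intuition, here is a toy sketch (not an actual cuckoo filter, just the underlying idea of storing short hashed fingerprints) showing where the false positives come from; names and sizes are illustrative:

    import hashlib

    # Store only a short hash ("fingerprint") of each published token. Two
    # different tokens can share a fingerprint, which gives false positives.
    def fingerprint(token, nbits=16):
        return int.from_bytes(hashlib.sha256(token).digest()[:4], "big") % (1 << nbits)

    published = {fingerprint(b"token-from-an-infected-user")}

    def might_have_seen(token):
        # False means "definitely not in the set"; True means "possibly in the set".
        return fingerprint(token) in published

    print(might_have_seen(b"token-from-an-infected-user"))  # True (genuine match)
    print(might_have_seen(b"some-unrelated-token"))         # usually False, occasionally True by collision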
Would you be okay with me reposting your paper on my site? I'm working on a piece about the regulatory implications of contact tracing apps in the U.S. -- you've done a better job outlining the pros and cons of the various approaches than I could.
Feel free to hit me up henriquez AT protonmail if you'd like to discuss!
Can someone explain who invents these abbreviations?
The PP-PT and P-3T parts stand for exactly the same words, yet are shortened completely differently.
In P-3T the "3" refers to the "P"s, not the "T", so I would think it should be either DPP-PT or maybe D3PT.
Then the hyphens in the first abbreviation are in completely different positions than when it is spelled out: PEPP-PT vs. Pan-European Privacy-Preserving Proximity Tracing.
Too bad that (at least on Android) an app that needs to use Bluetooth also needs the location (GPS) permission. So there will still be a privacy concern. I don't understand why these two permissions need to be tied together.
If you don't trust the app to not localize/track you, you also don't want to give it BT permissions. BT on its own is enough to localize/track you pretty well in cities, which was one of the reasons people didn't like the idea of a centralized database of BT contacts ...
That's a good comic for an overview, but it comes with a lot of assumptions and open questions:
1. That Alice can limit what she reports.
2. That 30 seconds, or 5 minutes, or any particular amount of time constitutes a "contact"; as far as I know this is entirely undefined.
3. That Alice could be compelled to release this data (4th and 5th Amendment cases are possible).
4. That there will likely be no open-source alternatives to the centralized Apple-Google plan, since both will want to build it into the OS.
5. It doesn't say that just because the "public data" was X, there isn't also metadata Y that a manufacturer could add and supplement (like GPS data).
And the whole idea of opt-out might be questionable now and impossible later.
It's a good idea to show how it could be privacy "OK", but it doesn't even hint at the reality that this might not be a road we want to go down; paved with good intentions and all that.
1. Alice limiting what she reports is a function of her phone software; this is trivially doable by just not broadcasting beacons, or after the fact by not reporting your beacons (or reporting false beacons) for certain times (see the sketch after this list).
2. The number of minutes is a threshold decided by epidemiologists, programmed into people's phones, as a public-policy heuristic. It doesn't need to be perfect, just to catch a lot of the actual contacts in its filter while avoiding too many false positives; even quarantining 50 or 60% of contacts can reduce spread by a lot!
3. You could be compelled to release data, but all you would get is a list of beacons; you'd then have to also subpoena/steal everyone else's records of whose beacons they saw (this data stays on local devices!) and correlate encounters with actual locations. This is HARD; like, nation-state intelligence agency hard, probably beyond the reach of your average criminal-justice apparatus.
5. If your threat model is "my phone/OS manufacturer will publish code that doesn't follow the protocol", then you're screwed anyway. (And in fact, Apple and Google aren't following this protocol, they're coming out with their own joint system that they claim is DP-3T-inspired.)
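On point 1, here is a rough sketch of how an app might redact its own tokens for certain time windows before uploading; the data layout and names are invented for illustration:

    from datetime import datetime

    # Sketch: before uploading, drop any of your own broadcast tokens that fall
    # inside time windows you'd rather not report.
    def redact_report(my_tokens, excluded_windows):
        # my_tokens: list of (timestamp, token); excluded_windows: list of (start, end)
        def excluded(ts):
            return any(start <= ts <= end for start, end in excluded_windows)
        return [(ts, tok) for ts, tok in my_tokens if not excluded(ts)]

    tokens = [(datetime(2020, 4, 27, 9, 0), b"tok-a"),
              (datetime(2020, 4, 27, 13, 0), b"tok-b")]
    private_afternoon = [(datetime(2020, 4, 27, 12, 0), datetime(2020, 4, 27, 18, 0))]
    print(redact_report(tokens, private_afternoon))  # only the 9:00 token remains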
1. You clearly didn't reference the panel where Alice chooses to upload her data while exempting sections from it.
3. Yeah, that's great and all. The point is that you may be compelled to upload your contacts, NOT your beacons. You are collecting the people you met. You likely do understand some of the ways contact tracing could be abused. Think about police trying to prove contact with someone they do have beacons for.
5. So why bother with passcodes, or any security at all anywhere for any new or existing feature? This is a slight evolution on “I have nothing to hide”.
1. That's exactly what I was referencing? i.e. I interpreted your saying "with a lot of assumptions. Like that Alice can limit what she reports" as saying "a feature of the protocol, explicitly mentioned in the comic, is an assumption not to be trusted." Whereas that's an option which is already in the preliminary demo apps.
3. Indeed, they can use this to prove that two people who are both under surveillance/investigation were in the same area as each other. That's a risk, but not as severe of a risk as the general fear I've seen around this protocol.
5. That's my point. If you don't trust your implementer to actually implement the security feature, you're screwed anyway. You have to trust someone, and by using a modern phone you are implicitly trusting Apple and Google. Or at least trusting their employees to make a big stink if lines are crossed.
A nice comic version of DP-3T, by Nicky Case, is available here: https://ncase.me/contact-tracing/