
Apple and Google’s contact-tracing system risks your privacy, says EFF

UPDATED May 4 with additional information regarding location tracking and limitations on the number of apps that can use the Apple-Google system. This story was originally published April 30.

The smartphone-based contact-tracing system being developed by Apple and Google carries substantial privacy risks, the Electronic Frontier Foundation (EFF) said in a report last week.

Law enforcement might use the system to reveal meetings between criminal suspects, the EFF said. Criminals themselves might pluck temporary Bluetooth identifier tokens out of the air and use them to build up a record of people’s movements. 

The EFF also said that pranksters or crooks could rebroadcast those temporary IDs en masse, undermining the entire contact-tracing system with a flood of misleading data. And the EFF urged that the system be “sunsetted” once the coronavirus crisis is over, as Google and Apple have promised to do.

“The apps built on top of Apple and Google’s new system will not be a ‘magic bullet’ technosolution to the current state of shelter-in-place,” EFF staffers Bennett Cyphers and Gennie Gebhart wrote in the report. 

“Their effectiveness will rely on numerous tradeoffs and sufficient trust for widespread public adoption. Insufficient privacy protections will reduce that trust and thus undermine the apps’ efficacy.”

Private by design

To be fair, the Apple-Google contact-tracing system was designed with privacy in mind. Those temporary Bluetooth IDs, known as rolling proximity identifiers (RPIDs), change several times an hour to minimize the potential for tracking. RPIDs will be stored only on the phones themselves.
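
To make the rotation idea concrete, here is a rough sketch of what a rolling-identifier scheme looks like. It is not Apple and Google’s actual cryptography (the published specification derives identifiers with HKDF and AES); the function names and the 10-minute window below are purely illustrative.

# Illustrative sketch of rotating Bluetooth identifiers. This is a
# simplification, not the real Apple-Google Exposure Notification crypto.
import hashlib
import hmac
import os
import time

def make_daily_key() -> bytes:
    # A fresh random key the phone generates each day and keeps to itself.
    return os.urandom(16)

def rolling_proximity_id(daily_key: bytes, interval: int) -> bytes:
    # Derive a short-lived identifier from the daily key and the current
    # time window, so the broadcast ID changes several times an hour but
    # can later be regenerated by anyone who holds the key.
    digest = hmac.new(daily_key, interval.to_bytes(4, "big"), hashlib.sha256).digest()
    return digest[:16]  # BLE advertisements are small, so truncate to 16 bytes

if __name__ == "__main__":
    key = make_daily_key()
    window = int(time.time()) // 600  # 10-minute windows
    print(rolling_proximity_id(key, window).hex())

Because each identifier is derived deterministically from the key, a phone never has to store every ID it has broadcast; it, or anyone holding the key, can recompute them on demand, which is what the diagnosis-key sharing described below relies on.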

Users have to opt into the system in the first place, and then have to opt into their data being shared in the form of “diagnosis keys” if (and only if) they test positive for the coronavirus. 

The diagnosis keys can be used to regenerate the RPIDs that an infected person’s phone previously broadcast, and other users’ phones can then compare those RPIDs against the ones they’ve recorded to see whether they’ve had a recent close encounter with someone who’s tested positive.
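
Continuing the sketch above, and with the same caveat that the real protocol’s cryptography differs, the matching step could look roughly like this: a phone downloads the shared diagnosis keys, regenerates the RPIDs those keys would have produced, and checks them against the RPIDs it overheard.

# Sketch of the matching step, using the simplified scheme from the
# previous example. All names are illustrative.
import hashlib
import hmac

def rolling_proximity_id(daily_key: bytes, interval: int) -> bytes:
    digest = hmac.new(daily_key, interval.to_bytes(4, "big"), hashlib.sha256).digest()
    return digest[:16]

def find_exposures(diagnosis_keys, observed_ids, intervals):
    # diagnosis_keys: keys shared by users who tested positive.
    # observed_ids: set of RPIDs this phone overheard in the last two weeks.
    # intervals: the time windows to check.
    # Returns (key, interval) pairs where an overheard RPID matches one
    # derived from a shared key, i.e. a possible exposure.
    matches = []
    for key in diagnosis_keys:
        for interval in intervals:
            if rolling_proximity_id(key, interval) in observed_ids:
                matches.append((key, interval))
    return matches

The key privacy property is that the comparison happens on each user’s own phone, and only people who test positive upload anything.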

Google and Apple aren’t building the apps that can use this system. Instead, they’re making public an application programming interface (API) so that third parties can use the system and make it accessible to ordinary users. The EFF links to half a dozen apps already in development that plan to use that API.

Of course, you can avoid the entire system by keeping Bluetooth turned off on your phone, or if you have an older phone that can’t use the modern Bluetooth Low Energy (BLE) protocol.

But not without risk

In that light, the tone of the EFF report may strike some as unnecessary hand-wringing, especially since the Google-Apple system hasn’t been finalized and no apps that can use the system have been fully developed. But the EFF report does make some good points.

First, RPIDs won’t be verified as coming from any particular device, so you could record the RPIDs from other people’s phones and then rebroadcast them later to mess things up.

That’s arguably a feature, not a bug, because it also means you can’t easily track a particular RPID back to a particular device. However, any anonymized system can be “de-anonymized” with enough data. 

The EFF imagines that someone might set up Bluetooth receivers at regular points in public spaces, similar to the tracking “beacons” found in shopping malls, to harvest RPIDs from passing phones. 

You could then cross-reference the collected RPIDs with the public diagnosis keys uploaded to the contact-tracing central database by people who have tested positive for the coronavirus, the EFF points out, and thereby get a map of movements of many people who are infected. 

A fairly small map of an individual phone’s daily routines can easily identify where that phone’s user lives and, in many cases, the user themselves.
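
To see how that cross-referencing might work in practice, here is a sketch in the same simplified terms as the earlier examples. It assumes the attacker holds a log of (RPID, time window, location) entries harvested by fixed Bluetooth receivers, plus the publicly shared diagnosis keys; all names are illustrative.

# Sketch of the correlation attack the EFF describes, using the simplified
# scheme from the earlier examples. All names are illustrative.
import hashlib
import hmac
from collections import defaultdict

def rolling_proximity_id(daily_key: bytes, interval: int) -> bytes:
    digest = hmac.new(daily_key, interval.to_bytes(4, "big"), hashlib.sha256).digest()
    return digest[:16]

def movement_trails(beacon_log, diagnosis_keys, intervals):
    # beacon_log: list of (rpid, interval, location) tuples harvested by
    # Bluetooth receivers placed around a city.
    # Returns, for each diagnosis key, the (interval, location) sightings
    # of the phone that used it: a rough movement map of an infected person.
    lookup = {}
    for key in diagnosis_keys:
        for interval in intervals:
            lookup[rolling_proximity_id(key, interval)] = key

    trails = defaultdict(list)
    for rpid, interval, location in beacon_log:
        key = lookup.get(rpid)
        if key is not None:
            trails[key].append((interval, location))
    return dict(trails)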

The EFF doesn’t mention it, but we imagine that if a public place wasn’t too crowded and it was easy to pick out individual devices, then you could also correlate RPID movements with those of the regular Bluetooth and Wi-Fi identifiers, aka MAC addresses, which phones normally broadcast. You could also correlate them with the advertising IDs assigned to each smartphone, which many apps send to advertisers along with location data.

A record of everyone you met up with

Second, the contact-tracing system may be used by police to prove that two or more suspects met up with each other, the EFF said. 

Law enforcement would need access to each suspect’s phone, which could be seized with a warrant or otherwise. But the contact-tracing system on each device would, in theory, have logs of every RPID it encountered for the previous two weeks. We imagine that divorce lawyers might also want access to such data to establish evidence of adultery.

Third, if a malicious app has access to the contact-tracing system — which is very possible, because the API will be public — then the malicious app could steal the data and upload it to a central server controlled by criminals or intelligence services. 

Get enough stolen contact-tracing data from enough phones, and you won’t need to place Bluetooth beacons in public places or search individual devices at all.

Whom do you really trust?

The EFF report says that the entire system relies on trust — trust that the underlying Apple-Google system is solid and not prone to attack, trust that the apps being developed to use the system are secure and private, and trust that criminals, police and ordinary users won’t abuse the system. 

“There’s a lot that can go wrong, and too much is at stake to afford rushed, sloppy software,” the report says. “Public-health authorities and developers should take a step back and make sure they get things right.”

Update: No location tracking, and only one app per country

Reuters reported May 4, apparently drawing from a FAQ Apple posted online, that apps using the Apple-Google contact-tracing API would not be able to use phones’ location data. 

Reuters specified that this referred to GPS-based location data, but the Apple document said all location tracking was barred, implying that tracking via cell towers and Wi-Fi networks is also forbidden.

The Reuters story also said that only one app in each country would be able to use the API. We couldn’t find that information in the FAQ Apple posted, but the FAQ does say that the apps will be developed by public health authorities in each region.

The implication there is that no private apps developed without the participation of public health authorities will be allowed to use the Google-Apple API. The FAQ goes on to say that “Google and Apple will disable the exposure notification system on a regional basis when it is no longer needed.”
