The UK’s latest COVID-19 Test and Trace app update failed to follow terms of use the nation had already agreed to.



Apple and Google have been forced to reject the UK’s latest COVID-19 Test and Trace app update because it failed to follow privacy rules the nation had already agreed to follow in order to use the frameworks the tech firms provide.

Keeping deals

In line with World Health Organization (WHO) advice to test widely and act fast in the event of COVID-19 outbreaks, Apple and Google moved quickly at the beginning of the pandemic to develop a private-by-design Exposure Notifications system the world’s health authorities could use to build digital track-and-trace systems.

Both firms explained the need to prevent these systems from eroding privacy, built privacy safeguards inside the system, and insisted nations using it respect people’s privacy. These requirements are crystal clear in the terms and conditions of the software.
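The core of that privacy-by-design approach is that exposure matching happens entirely on the device. The sketch below is a simplified illustration of that idea only, not the actual Exposure Notifications protocol (which derives rotating Bluetooth identifiers cryptographically from daily keys); the class and function names are invented for illustration. Phones broadcast random, rotating identifiers, record the identifiers they hear nearby, and compare them locally against a published list from users who tested positive and consented to share.

```python
import secrets

# Simplified illustration of on-device exposure matching. The real
# Apple/Google system derives rotating Bluetooth identifiers from
# daily keys; plain random tokens are used here to show the privacy
# property: no names or locations ever leave the device.

def new_identifier() -> bytes:
    """A random identifier the phone broadcasts; rotated frequently."""
    return secrets.token_bytes(16)

class Device:
    def __init__(self) -> None:
        self.broadcast = [new_identifier() for _ in range(4)]  # our own IDs
        self.heard: set[bytes] = set()  # IDs observed from nearby phones

    def encounter(self, other: "Device") -> None:
        # Each phone records the other's current identifiers; nothing
        # is uploaded at this stage.
        self.heard.update(other.broadcast)
        other.heard.update(self.broadcast)

    def check_exposure(self, published_ids: set[bytes]) -> bool:
        # Positive users consent to publishing their identifiers; every
        # device downloads that list and matches it locally.
        return bool(self.heard & published_ids)

alice, bob, carol = Device(), Device(), Device()
alice.encounter(bob)  # Alice and Bob were near each other

# Bob tests positive and consents to sharing his identifiers.
published = set(bob.broadcast)

print(alice.check_exposure(published))  # True: Alice was near Bob
print(carol.check_exposure(published))  # False: Carol never met Bob
```

Because the comparison runs on each handset, health authorities learn only that an anonymous exposure occurred, never who met whom or where.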

The companies recognized that the state of emergency should not be used as a way to sneak surveillance technologies in through the back door. Both Apple and Google are paying increasing attention to the consequences of that.

In the UK, at least, the government chose instead to try – and fail – to build a less private system. Now, the UK is back with a second attempt to use track and trace in a manner that erodes privacy. At the same time, many other nations now have functional systems built using the Apple/Google foundation that have cost less to develop and are now in use. The UK’s system cost billions, but has made little difference.

Why the ban?

Apple and Google have rejected the latest NHS app update because it includes functions that have been banned from the start. That means UK users seeking the government’s own contact-tracing app can only download an older version.

The updated version included a tool that required users to check-in to venues they visited using a QR code and the app. If they subsequently tested positive for the virus, the app would upload logs of those check-ins and warn others.

[Also read: Fueled by pandemic shifts, mobile is now even more critical]

While this almost sounds reasonable, it actually isn’t, because it effectively means authorities collect personally identifiable location data in direct contravention of the conditions of use Apple and Google have always required for their contact-tracing framework.

It’s also a little unnecessary, given the system already includes ways in which others who may have been exposed to infection can be warned in a way that protects the privacy of all parties.

It is interesting that this problem does not affect users in Scotland, which uses a different Check In Scotland app in conjunction with its contact-tracing effort. The UK Department of Health has not explained why it chose to indulge in yet another failure in this important tool at a time of crisis.

In tech, all for one is also all for all

The terms of use of the Apple/Google system are clear:

“The goal of this project is to assist public health authorities in their efforts to fight COVID-19 by enabling exposure notification in a privacy-preserving manner, and the system is designed so that the identities of the people a device comes in contact with are protected.

“Access to the technology will be granted only to public health authorities. If they create an app, it must meet specific criteria around privacy, security, and data control. The public health authority will be able to access a list of beacons provided by users confirmed as positive for COVID-19 who have consented to sharing them. The system was also designed so that Apple and Google do not have access to information related to any identifiable individual.”

Given that many less democratic governments may otherwise choose to exploit the need for such apps, it’s a protection that makes sense.

The tech firms cannot make an exception for one government, or they would be required to make an exception for all. Apple CEO Tim Cook recently observed that, “Once you have a back door, you have a back door for everybody.”

A battle for our times

Critics argue that this is another misuse of tech industry power. That’s not correct.

This is an illustration of tech firms taking a stance against mission creep, working to prevent governments and private data firms from eroding privacy. It is also precisely in line with the arguments coming primarily from Apple, and increasingly from Google and other tech firms, that recognize mobile devices are woven into every part of life, making it essential to ensure those devices are secure.

Coronavirus has already deeply impacted global society and exposed systemic inequalities. Abandoning personal digital privacy in response to that existential threat would only deepen the damage to our ways of life.

“In terms of privacy…, I think it is one of the top issues of the century,” Cook has said. “We’ve got climate change, that is huge. We’ve got privacy, that is huge. And they should be weighted like that and we should put our deep thinking into that and to decide how can we make these things better and how do we leave something for the next generation that is a lot better than the current situation.”

Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.