Digital rights groups are pushing for more robust digital privacy regulations as Australia moves into the next phase of the pandemic, warning that existing rules around personal data collection are not up to scratch.
The blowback is directed at South Australia’s home quarantine app, which works by contacting people in quarantine at random and requesting proof of their identity and location within 15 minutes.
The app uses facial recognition and smartphone geo-location as verification tools.
Failing a check-in – which happens when the person misses their 15-minute window, is located outside their home or cannot be recognised by the app’s facial recognition software – prompts a visit from SA Police.
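In rough outline, the check-in logic amounts to a simple series of pass/fail tests. The sketch below is a hypothetical illustration of that flow only; the names, fields and structure are assumptions, not the app’s actual code.

```python
from dataclasses import dataclass

CHECK_IN_WINDOW_SECONDS = 15 * 60  # the app's 15-minute response window


@dataclass
class CheckIn:
    response_delay_seconds: float  # time taken to answer the random prompt
    at_home: bool                  # geo-location inside the quarantine address
    face_matched: bool             # facial recognition verified the person


def assess(check_in: CheckIn) -> str:
    """Decide the outcome of a single random check-in.

    Any failure -- a missed window, a location outside the home, or an
    unrecognised face -- escalates to an in-person police visit.
    """
    if check_in.response_delay_seconds > CHECK_IN_WINDOW_SECONDS:
        return "escalate: missed 15-minute window"
    if not check_in.at_home:
        return "escalate: located outside quarantine address"
    if not check_in.face_matched:
        return "escalate: face not recognised"
    return "check-in passed"
```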
New South Wales, Western Australia, the Northern Territory and Victoria are in different stages of rolling out similar apps for home quarantine. Queensland is a notable exception, in that its app uses only geo-location data.
In response to this increased uptake, the Human Rights Law Centre and Digital Rights Watch co-wrote an open letter to the country’s various health ministers outlining their concerns with the technology.
Digital Rights Watch project lead Samantha Floreani said that while the organisation supported the use of technology in home quarantine, sensitive biometric data was being left vulnerable.
“What we’re concerned about is that there aren’t appropriate protections in place to ensure that the data collected via these apps isn’t later misused or used for other purposes,” she told Wild Health.
Facial recognition data, she said, was particularly problematic: if there were a data breach, then – unlike with a leaked password or stolen licence – there would be no way for an individual to re-secure their identity.
“You can’t change your biometrics, at least not easily,” Ms Floreani said.
Complicating this further is the fact that SA’s data, while encrypted, would be stored in a central repository, to be destroyed at the conclusion of the pandemic.
“There are other ways that we can approach this,” Ms Floreani said. “We could, for example, take a decentralised approach where you don’t have all of that biometric information in one central location – a practice which can raise all kinds of privacy and security risks.
“What we would prefer to see is a system where you can meet the needs of checking in via the app, but have that information never leave your personal device.”
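A decentralised design of the kind Ms Floreani describes might keep the biometric template and the face-matching step on the handset, transmitting only a signed pass/fail verdict to the server. The following is a minimal sketch of that idea under stated assumptions – the function names, similarity threshold and signing scheme are illustrative, not drawn from any of the apps in question.

```python
import hashlib
import hmac
import json
import time

MATCH_THRESHOLD = 0.8  # assumed similarity cut-off for a face match


def check_in_on_device(live_score: float, at_home: bool,
                       device_key: bytes) -> bytes:
    """Verify the check-in locally and emit only a signed verdict.

    live_score is the similarity between the live selfie and the
    enrolment template; both images stay on the handset, so the
    server never receives any biometric data.
    """
    verdict = {
        "passed": live_score >= MATCH_THRESHOLD and at_home,
        "timestamp": int(time.time()),
    }
    payload = json.dumps(verdict, sort_keys=True).encode()
    signature = hmac.new(device_key, payload, hashlib.sha256).hexdigest()
    return payload + b"." + signature.encode()
```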
There are also no standalone privacy protections in place, meaning the stored data could potentially be accessed by law enforcement down the track.
Ironically, the federal government’s failed COVIDSafe app had strong baseline privacy protections.
“COVIDSafe hasn’t been very useful on a practical level, but because of concerns that were raised at the time, it has robust privacy protections built into the authorising legislation,” Kieran Pender, a senior Human Rights Law Centre lawyer, told Wild Health.
“The data that will be captured by home quarantine apps are just as sensitive, if not far more sensitive, so it’s particularly important that those same safeguards are there.”
Another concern stemmed from the fact that facial recognition systems often fail to recognise the faces of people with darker skin tones, sparking fears of discrimination.
“It’s not too far-fetched to imagine the app failing to recognise someone and police being sent to check on them, just because of these proven and well-demonstrated shortcomings with facial recognition technology,” Mr Pender said.
“The fact that the government is considering imposing any sort of requirements using this technology, knowing its shortcomings and without adequate safeguards to address those, is very alarming.”
Like Ms Floreani, Mr Pender stressed that his organisation was not opposed to quarantine requirements or a shift to home quarantine, but was concerned about the specific methods being used.