From contact tracing applications to facial recognition, technology has become part of the arsenal used to protect public health.
While this may have helped save lives, rights advocates say intrusive solutions may already be so entrenched that eroded personal privacy is a long-term price many people will end up paying.
“Once a large system enters society, it is difficult to fix it fundamentally, even if there is a problem,” said Chang Yeo-kyung, executive director of the South Korean Institute for Digital Rights.
The country has largely been a COVID-19 success story, thanks in part to aggressive testing and contact tracing.
This year, as cases of the highly contagious but less deadly Omicron variant surged, it scrapped contact tracing and mandatory isolation for the vaccinated in favor of self-testing and home treatment, in order to free up medical resources.
However, in December, it announced a nationally funded pilot to use artificial intelligence, facial recognition and thousands of CCTV cameras to track the movements of infected people – a move that raised privacy concerns.
The project was scheduled to begin in January in Bucheon, one of the country’s most densely populated cities, on the outskirts of Seoul, but is said to have been delayed.
“There is concern that surveillance will become the ‘new normal’ for our society after COVID-19,” Chang told the Thomson Reuters Foundation via email.
For example, he said, people have already grown accustomed to showing proof of identity before entering venues.
The government has also pushed the boundaries, Chang said, in one instance using cell-tower data to identify thousands of people at a given location – and faced only modest resistance.
QR codes and body heat
Elsewhere in Asia, countries ranging from Singapore and India to Thailand and Taiwan continue to use contact tracing apps to track local residents as well as keep tabs on tourists.
Singapore, Thailand and others also make extensive use of QR codes for check-in at malls, restaurants, airports and other sites.
Last year, Singapore said it would allow police to use personal data from its contact tracing application in “serious” criminal investigations, and introduced a bill that provided penalties, including jail time, for data misuse.
The Indian state of Jammu and Kashmir said last year that it had shared data from a contact tracing application with local police.
App-based food delivery companies, such as Zomato and Swiggy, began sharing workers’ names and body temperatures with customers.
Some Indian cities have also made it mandatory for municipal workers to wear tracking devices, while teachers in New Delhi have filed lawsuits to stop the use of biometrics in attendance applications, which they say invades their privacy.
Pushback
The increase in surveillance has sparked heated debate and some legal action, digital rights experts say, amid growing fears that it has already gone too far.
“We’ve been asked to provide a lot of data for the purposes of controlling the virus. Sometimes it was necessary, sometimes it wasn’t,” said Carissa Veliz, a professor at the Institute for Ethics in AI at Oxford University in Britain.
“On the other hand … we are seeing more pushback and more awareness than before. I think people are tired of the feeling of being spied on.”
Estelle Masse, global data protection lead at rights group Access Now, said contact tracing apps in Europe have been relatively good in terms of privacy protection, mostly due to public discourse.
“Most of the potential privacy threats have not materialized,” she said.
European apps, for example, store data on people’s phones instead of in a central database, she said, and limit the information logged to what is strictly needed.
Rollback?
But not everything has gone according to plan – not when authorities have used data collected to curb the virus for other purposes.
In Germany, prosecutors in Mainz apologized when it emerged that police had secretly obtained data on people collected by Luca, a privately developed contact-tracing application, as part of an investigation into a man’s death.
Luca said the breach was possible because police gained access to data on visitors to a restaurant the man had been to after the health department agreed to pretend it was the site of an infection.
Similar cases have caused a stir in Australia, where two states have trialled facial recognition software that lets police check whether people are at home during quarantine.
And in Britain, reports that the terms and conditions of some QR-code check-in apps used by pubs and restaurants allowed customer data to be held for years raised eyebrows.
This underscores the importance of minimizing the amount of data that can be collected and putting a strong legal framework around its use, Masse said.
But as the world moves from a pandemic to an endemic phase, it is time to start discussing what happens next, she said.
“We are entering a phase where questions arise, like ‘How long will we need those applications?’” she said.
If they are no longer considered necessary, governments have a duty to help phase them out and to ensure that companies do not repurpose the tools for other uses.
“It’s kind of the nature of the internet: platforms disappear and people forget that they have an account somewhere. But these are apps that were pushed by governments to be used by millions of people,” Masse said.
“The way governments were involved in their rollout and use, they have to be involved in their deletion from users’ devices.”
However, some developers believe their applications will have a life after COVID because people now see the benefits of digitized services – as long as their data is kept secure.
Patrick Heinig, director of the Luca app in Germany, said his firm’s experience of tracking the virus at venues could easily be applied to streamlining restaurant payments or hotel check-ins.
“(People) are very willing to share their data if they really see the benefits,” he said. “If things are done right, the general public will have no problem accepting it.”