Australia’s Big States Trial Facial‑Recognition Tech During Quarantine
In a move that could raise eyebrows across the nation, New South Wales and Victoria are testing facial‑recognition software to make sure folks stuck at home during Covid‑19 are actually staying put.
What’s Going On?
Genvis – a small, Perth‑based startup – is supplying its software for the trials on a “voluntary” basis. The tech was born in 2020, when the company partnered with Western Australia Police to keep people from wandering during the early pandemic lockdowns. Now the company is expanding to NSW and Victoria, home to Sydney and Melbourne and more than half of Australia’s 25 million residents.
While Genvis is quietly boasting about its trials, the whole approach has already drawn global attention. South Australia kicked off a similar system last month (but not powered by Genvis), and privacy advocates worldwide have been sounding the alarm about “surveillance overreach.” With NSW and Victoria stepping up, worries are only growing.
How the System Works
- When a random check‑in is sent, residents must snap a quick selfie at their quarantine address.
- The software also captures GPS data. If it can’t match the selfie to its database of “facial signatures,” police will swing by in person to confirm the resident’s presence.
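Genvis hasn’t published technical details of how the matching works. Purely as an illustration, the check‑in flow described above might look something like this minimal sketch – the function names, embedding vectors, thresholds, and 200‑metre radius are all hypothetical, not anything Genvis has confirmed:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def cosine_similarity(a, b):
    """Similarity between two face-embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def evaluate_checkin(enrolled_signature, selfie_embedding,
                     quarantine_loc, reported_loc,
                     face_threshold=0.8, radius_m=200):
    """Return 'ok' if the selfie matches the enrolled facial signature AND the
    GPS fix is within the allowed radius of the quarantine address; otherwise
    return 'escalate', which in the article's description triggers an
    in-person police visit."""
    face_ok = cosine_similarity(enrolled_signature, selfie_embedding) >= face_threshold
    gps_ok = haversine_m(*quarantine_loc, *reported_loc) <= radius_m
    return "ok" if face_ok and gps_ok else "escalate"

# Toy usage: same face, same Sydney address -> passes the check-in.
print(evaluate_checkin([1.0, 0.0], [1.0, 0.0],
                       (-33.87, 151.21), (-33.87, 151.21)))  # ok
```

The key design point is that the two signals are independent: a matching face with the wrong GPS fix, or the right GPS fix with a non‑matching face, both fall through to the manual police check rather than being auto‑rejected.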
Queensland’s two‑week hotel‑quarantine rules are being eyed for replacement, as the new tool could make it easier for Australia to reopen borders without keeping arrivals in hotels under police supervision.
Why the Concerns?
Rights groups and civil‑liberties advocates have warned that the same tech could be dragged into everyday policing – a step that might disproportionately target ethnic minorities and other vulnerable groups. The idea that one of the world’s most democratic nations could mimic the opaque surveillance model used in China is, to many, deeply unsettling.
With NSW Premier Gladys Berejiklian and Victoria Police diverting inquiries straight to their respective health departments, official information is scarce, leaving citizens in the dark.
Bottom Line
While Genvis claims the trials are free‑choice and only a safety measure, the scale of the rollout underscores the need to keep a close watch on how technology is used in the name of public health.
Genvis CEO Talks Home Quarantine Tech, Leaves the Verdict to the Folks
Genvis chief executive Kirstin Butcher stayed tight‑lipped on the trial details, aside from the bits already posted on the product’s own website.
Reached by phone, however, she reminded people that:
You can’t have home quarantine without compliance checks, if you’re looking to keep communities safe.
And because no one can physically check every quarantined face around the clock, she added:
You can’t perform physical compliance checks at the scale needed to support (social and economic) re‑opening plans, so technology has to be used.
Rights Advocates & AI Skeptics Raise Red Flags
Not everyone sings the same tune. Rights groups fear the tech could misidentify people, and that police might repurpose the data for other, less obvious ends.
“I’m troubled not just by the use here, but by the fact that this is an example of the creeping use of this sort of technology in our lives,” explained Toby Walsh, an AI professor at the University of NSW.
Walsh also pointed out that facial recognition isn’t a foolproof system – it can be hacked to provide false location information.
“Even if it works here… then it validates the idea that facial recognition is a good thing,” he added. “Where does it end?”
Western Australia’s Stance on Data Use
Western Australia says it has banned police from using Covid‑related data for anything else. The WA police, for their part, say they have processed 97,000 home‑quarantine checks using facial recognition without incident.
Edward Santow, a former Australian Human Rights Commissioner who now heads an AI ethics project at the University of Technology Sydney, weighed in:
“The law should prevent a system for monitoring quarantine from being used for other purposes.”
“Facial recognition technology might seem like a convenient way to monitor people in quarantine, but if something goes wrong with this technology, the risk of harm is high.”
So while the tech might help keep quarantine honest, it’s still a slippery slope that could turn everyday check‑ins into raw material for endless surveillance. Whether we’re ready for that is a question we’re still figuring out.
