Snapshot
- Inaccuracies in facial recognition algorithms and technology can produce false negatives and false positives, leading to misidentification of individuals.
- Entities outside Australia may be subject to the Privacy Act 1988 (Cth) if certain conditions are met.
- Use or provision of trials, including free trials, of software subscriptions by APP entities may be subject to the Privacy Act.
In Commissioner initiated investigation into Clearview AI, Inc. (Privacy) [2021] AICmr 54 (‘Clearview AI’), the question for Australian Information and Privacy Commissioner Falk (‘Commissioner’) was whether Clearview AI Inc. breached the Privacy Act 1988 (Cth) (‘the Act’) through use of its biometric recognition tool (‘the Tool’), which scrapes images from public web pages and stores them in a database of 3 billion digital images for image-matching purposes. On 14 October 2021, the Commissioner ordered Clearview AI Inc. to destroy the images it had collected of individuals in Australia and to collect no more.
Clearview AI signals increasing scrutiny of entities doing business in Australia that collect, use and trade in biometric data about Australian individuals.
Context
In 2001, attendees at the Super Bowl in Florida had their faces scanned and the images compared against a database of criminal mug shots. The catch? According to the swift media outcry, the attendees didn’t know that face scanning and database matching were taking place. That was over 20 years ago, but biometric technology is still attracting attention. Since the New York Times revealed in 2020 that Clearview AI Inc. held a database of 3 billion images used for biometric recognition technology, the UK Information Commissioner’s Office (‘ICO’) and the Commissioner have jointly investigated Clearview AI Inc.’s data processing practices, the ICO has announced its provisional intent to impose a fine on Clearview AI Inc., and in 2021 the Commissioner determined both Clearview AI and Australian Federal Police (Privacy) [2021] AICmr 74 (‘AFP’).