Low-cost tracking devices are being used to stalk and harass victims. With domestic violence at record levels, can technology companies be held liable for the foreseeable misuse of their devices?
For those of us prone to leaving our keys in all manner of inconvenient places, or for the sake of tracking adventurous pets or our luggage, tracking devices are a Bluetooth-enabled means of easily locating things we have lost.
But tracking products such as the Apple AirTag, which is small, inexpensive and can be attached discreetly, are proving to be a sinister tool for those who want to track people without their knowledge or consent. The AirTag, according to reports, has become a tool of choice for abusers and stalkers, largely because of its accuracy.
Lawsuit against Apple seeks compensation
A class-action lawsuit, Hughes et al. v. Apple, Inc., was filed against Apple by two women in California in the first week of December 2022, claiming that Apple’s AirTags were used to stalk and endanger them. Subsequently, Los Angeles attorney Gillian Wade said her firm had been inundated with hundreds of calls from people who said they had been tracked without their permission using an Apple AirTag.
In December, Wade told US media, “Unfortunately, the AirTags are being used really as a means to perpetuate domestic violence in this country.”
The two plaintiffs in the lawsuit against Apple are victims of stalking by AirTag. Both women say former partners used the AirTags to track their whereabouts.
Among the 12 claims Apple faces, the plaintiffs allege Apple breached their privacy by geolocating them, violated state privacy laws, and used fraudulent marketing to deceive the public into believing AirTags were safe. The lawsuit states that “each Plaintiff continues to be at risk of unwanted and unlawful tracking via an AirTag device.” Apple is accused of unjust enrichment, negligence, negligence per se, intrusion upon seclusion and violations of New York General Business Law and California’s Constitutional Right to Privacy.
Apple cannot claim to be unaware of the potential for its AirTags to be used maliciously. In February 2022, the company issued a statement that read, in part:
“Apple has been working closely with various safety groups and law enforcement agencies. Through our own evaluations and these discussions, we have identified even more ways we can update AirTag safety warnings and help guard against further unwanted tracking. We condemn in the strongest possible terms any malicious use of our products.”
Apple also said it had created a sound alert to be sent to Apple products such as the iPhone or Apple Watch if an AirTag is detected nearby. It was this alert that enabled one of the class-action plaintiffs, Lauren Hughes, to discover an AirTag, hidden in a plastic bag, fitted to the wheel well of her car’s rear tyre. At that point, Hughes had received numerous threatening calls from her ex-partner and had sought refuge in a hotel to escape him. She is one of the estimated 13.4 million victims of stalking in the US. According to the US Stalking Prevention, Awareness, and Resource Center, approximately one in three women and one in six men experience stalking in their lifetime.
Delia Donovan, CEO of Domestic Violence NSW (DVNSW), says, “The battle is between convenience and preservation of life.”
Donovan says, “A legal precedent enforcing accountability would be a positive step for the future of digital violence. When it comes to innovations in technology, we constantly need to weigh up whether the convenience outweighs the risk. Yes, I don’t want to lose my car keys, but I also don’t want to lose my right to privacy.
“At the moment, we are seeing significant global conversations around slowing down and regulating the advancement of AI, due to the dangers it poses. We need the same care and consideration taken when the danger is posed to victim survivors of domestic and family violence. Technology companies should have a responsibility to ensure the products they are designing cannot readily be used in a way that will harm people.”
A coordinated effort
In early May this year, Apple and Google announced a coordinated effort to mitigate “unwanted tracking”, detailing preliminary guidelines after consulting with industry and advocacy groups.
Under the guidelines, alerts from Bluetooth location-tracking devices like AirTags will no longer be specific to particular platforms, enabling users of both iOS and Android to detect and locate unknown trackers; previously, AirTag alerts were incompatible with non-Apple devices. Apple and Google have also provided manufacturers with best-practice principles and instructions “if they were to build these capabilities into their products”.
The plan has been submitted to the Internet Engineering Task Force, a leading standards development organisation, with a three-month period for comment and review.
“Bluetooth trackers have created tremendous user benefits, but they also bring the potential of unwanted tracking, which requires industrywide action to solve,” said Dave Burke, Google’s vice president of Engineering for Android, in the statement. “Android has an unwavering commitment to protecting users, and will continue to develop strong safeguards and collaborate with the industry to help combat the misuse of Bluetooth tracking devices.”
Following the comment period, Apple and Google will partner to address feedback and, by the end of 2023, will release a production implementation of the specification for unwanted tracking alerts, which will then be supported in future versions of iOS and Android.
Apple had previously introduced measures to deter predatory misuse of AirTags. In June 2021 it added an alert for iPhone, iPad and iPod Touch users running iOS 14.5 or later, notifying them if an unknown AirTag was detected moving with them. The warning appears on the device with the message: “AirTag Found Moving With You. The location of this AirTag can be seen by the owner.”
Tapping on the alert takes a user to the Find My app, which displays a map of the places travelled with the unknown AirTag, along with instructions on how to disable the device. The major problem with the alert is that Android users cannot receive it automatically, a gap the coordinated guidelines from Apple and Google are intended to close.
At present, Android users can scan for unknown AirTags with an app called Tracker Detect, but the app must be open to receive an alert.
‘At the moment, we are seeing significant global conversations around slowing down and regulating the advancement of AI, due to the dangers it poses. We need the same care and consideration taken when the danger is posed to victim survivors of domestic and family violence.’
Apple also implemented an audible alert: an AirTag separated from its owner for a period of time emits a sound, which may alert a potential stalking victim that they are being tracked.
Apple’s other security measures, as revealed in a statement in February 2022, include a unique serial number printed on every AirTag. Paired AirTags are also associated with an Apple ID, and the company will reveal the identity of the owner to whom an AirTag is registered only if required to do so by court order or a valid request from police. The statement says Apple has made updates to its Law Enforcement Documentation.
“We have been actively working with law enforcement on all AirTag-related requests we’ve received,” Apple said.
In relation to the beeping and alert features, Donovan says, “These are certainly helpful features, but by no means are they sufficient. There are so many situations where this will not assist victims, including not knowing what an AirTag is when they find it beeping. At the point that they find the tag, they have already been tracked. If they’re fleeing, this could prove fatal.”
She adds, “Accessibility is also a factor here – not everyone is digitally literate when it comes to technology. Then add language and accessibility barriers on top of that and you’ve got a huge issue. If technology doesn’t design with vulnerable communities in mind, then it will end up facilitating dangerous power dynamics.”
Too little, too late?
According to ABS data for the 2021–2022 financial year, an estimated 8 million Australians (41 per cent) have experienced violence (physical and/or sexual) since the age of 15; this includes 31 per cent of women and 42 per cent of men who have experienced physical violence, and 22 per cent of women and 6.1 per cent of men who have experienced sexual violence. An estimated 2.7 million people aged 18 years and over (14 per cent) have experienced stalking since the age of 15, including 20 per cent of women (2.0 million) and 6.8 per cent of men (653,400).
Donovan says, “Domestic violence is about control, and new technology like Apple AirTags provides perpetrators with more ways to control and monitor victim-survivors. AirTags in particular provide a new challenge because they are so small, easy to conceal and easy to place – from slipping into a handbag or a child’s favourite toy. Technology is so integrated into everyday life that victim-survivors of domestic violence can now be tracked through their social media, their phone, their car, their laptop as well as the technology of those around them. In 2020 the Wesnet National survey showed a 131 per cent increase in the use of GPS tracking apps since 2015. This is not just on devices belonging to women: the survey indicated a 346.6 per cent increase in perpetrators using devices [they have] given to children to monitor victim-survivor movements.”
‘Technology is so integrated into everyday life that victim-survivors of domestic violence can now be tracked through their social media, their phone, their car, their laptop as well as the technology of those around them.’
Donovan is also a board member of Wesnet, a national network representing organisations and individuals including women’s refuges, shelters, safe houses and information/referral services. In that capacity, she has heard of many cases of the abuse of surveillance technology.
“I have heard practitioners tell stories of women [being traced to] shelters because of eTag accounts and tracking devices. Practitioners talk about technology like AirTags actually replacing spyware because they are so easy to buy in shops and then slip into belongings. It’s also terrible to see 34 per cent of frontline workers say they see increased use of children in those dynamics of control, with stories of children being given tracked devices as ‘presents’ and told to hide these from the victim-survivor.”
Asher Flynn is an Associate Professor of Criminology at Monash University and a Chief Investigator with the Australian Research Council Centre of Excellence for the Elimination of Violence Against Women.
Flynn was instrumental in the joint Monash University and RMIT research and report Technology-Facilitated Abuse: Extent, Nature and Responses in the Australian Community, completed in July 2022. Based on the first nationally representative survey of victim-survivors and perpetrators of technology-facilitated abuse (TFA), the report revealed that the practice is widespread. Of the 4,562 people who took part in the survey, half had experienced at least one TFA-type behaviour (emotional abuse and threats, coercion, tracking and monitoring, or harassment) in their lifetime. Rates of victimisation were higher amongst those aged 18–34, Indigenous people, those with a disability, and those who identified as LGBTQIA+.
One in three victim-survivors kept their experiences to themselves, and the majority didn’t report to police, seek legal advice or contact the eSafety Commissioner.
“Victim-survivors reported experiencing a range of harms, including physical, emotional and mental health distress, as well as feelings of fear, paranoia and hypervigilance,” says Flynn.
Donovan observes, “We certainly need more regulation than we currently have, and there is significant difficulty in prosecuting these crimes. Unfortunately, the development of technology occurs far quicker than we can legislate and regulate. We cannot expect government and regulators to be able to foresee technological advancements and regulate for them.
“This is why we need responsibility put back on developers to make reasonable efforts to accommodate for the abuse of their products during design and development.”