Imagine a world where a simple brain scan could predict future criminal behaviour. Sounds like science fiction, right? But with rapid advancements in neurotechnology, this chilling scenario is inching closer to reality. While the potential benefits of preventing crime are undeniable, the ethical and legal implications are a minefield.
So, should brain scans ever be used to predict criminality?
Some might argue that identifying individuals at high risk of recidivism could revolutionise criminal justice. Resources could be targeted for preventive measures, potentially reducing crime rates, improving rehabilitation outcomes, and ultimately reducing the pressure on our courts.
Dr Allan McCay, an Academic Fellow at the Sydney Law School and Deputy Director of the Sydney Institute of Criminology, says predictions of recidivism are of interest to courts at sentencing, and that neurotechnology is already adding a predictive dimension to those assessments.
Occasionally, expert opinion drawing on brain scans obtained in a hospital facility is used to inform the court about an offender’s mental condition, McCay says.
However, current brain scanning technology is far from perfect and prone to misinterpretation, potentially leading to unfair profiling and biases.
McCay says there has been much academic discussion about the kind of inferences that are or should be made from brain scans in the field known as “neurolaw”, and their predictive reliability remains a matter of controversy.
The future of brain scans
While brain-scanners have been largely confined to hospitals and research facilities, this is rapidly changing.
In January this year, Elon Musk announced that his company Neuralink had successfully implanted a brain-computer interface (BCI) into a human brain. In a follow-up post on X (formerly Twitter) this week, Musk said the patient was able to control a computer mouse by thinking.
“We’re trying to get as many button presses as possible from thinking, so that’s what we’re currently working on,” Musk continued.
There is also NeuroPace’s Responsive Neurostimulation (RNS) System, a neural device designed to help manage seizures in people with epilepsy who haven’t responded adequately to medication.
McCay explains that the chip is implanted under the scalp and continuously monitors the brain. Then, using “a machine-learning approach (a form of AI), it acts to stimulate the brain when it detects the immediate neural precursors to an epileptic fit. Once it detects a certain neural pattern, it predicts a fit is impending and it acts to prevent it.”
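In engineering terms, what McCay describes is a closed-loop system: continuously monitor a signal, score the risk that a dangerous pattern is emerging, and intervene when that score crosses a threshold. A minimal sketch of one such cycle is below; the band-power features, logistic scorer, threshold and function names are all illustrative assumptions, not NeuroPace’s actual algorithm.

```python
import numpy as np

SAMPLE_RATE_HZ = 250   # assumed implant sampling rate (illustrative)
RISK_THRESHOLD = 0.8   # assumed score that triggers stimulation (illustrative)

def extract_features(window: np.ndarray) -> np.ndarray:
    """Summarise one window of neural signal as two band-power features."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(window.size, d=1.0 / SAMPLE_RATE_HZ)
    low = spectrum[(freqs >= 1) & (freqs < 30)].sum()     # low-frequency power
    high = spectrum[(freqs >= 30) & (freqs < 100)].sum()  # high-frequency power
    return np.array([low, high])

def risk_score(features: np.ndarray, weights: np.ndarray) -> float:
    """Stand-in for a trained classifier: a logistic function of band powers."""
    return 1.0 / (1.0 + np.exp(-features @ weights))

def deliver_stimulation() -> None:
    """Hypothetical hardware call; here it just logs the event."""
    print("stimulation pulse delivered")

def closed_loop_step(window: np.ndarray, weights: np.ndarray) -> bool:
    """One monitor-predict-act cycle: stimulate if the predicted risk is high."""
    if risk_score(extract_features(window), weights) >= RISK_THRESHOLD:
        deliver_stimulation()
        return True
    return False
```

The structure, rather than the particular maths, is the point: detection, prediction and intervention happen in one continuous loop inside the device.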
He believes that by using a neurotechnological approach, similar to the RNS System, it might be possible to predict and prevent angry outbursts.
Studies have shown a correlation between poor anger management and increased risk of criminal behaviour, especially violent crimes. While anger alone doesn’t necessarily predict crime, uncontrolled anger can impair cognitive function, decrease impulse control, and increase aggressive behaviour, all of which can contribute to criminal acts.
“It seems possible in the future that an offender with an impulsivity or anger management problem might, between offending and sentencing, have a predictive and interventionist neural device implanted in order to try to get some form of bond rather than be sent to jail,” McCay says.
Ethical dilemma
Technological advancements are pushing the boundaries of brain scanning, promising deeper insights into the human mind. However, peering into someone’s brain raises concerns about accuracy, privacy, and discrimination. Can we truly hold individuals accountable if their actions are deemed biologically predetermined?
Today, the courts already engage in electronic monitoring of offenders, but in the future, McCay says, “an offender with this neural device might argue that it will reduce their prospects of recidivism and so a jail sentence is unnecessary.”
“They might argue that an Intensive Corrections Order with a condition that the neural device remain active might avert the need for a sentence of imprisonment.”
While this sounds like a promising outcome, striking a balance between harnessing the potential of neurolaw and protecting fundamental human rights is crucial.
“Such an order raises human rights concerns, and it may be that one day there might even be demands for this approach to become a more standard part of sentencing,” McCay says.
“Thinking of possibilities like these, it is not hard to understand why the Australian Human Rights Commission has prioritised neurotechnology, or why the United Nations Human Rights Council has asked for a report on neurotechnology to be prepared.”
As neurotechnology continues to advance, there are signs of proactive responses abroad. Last year, the Chilean Supreme Court ordered Emotiv, the neurotech company behind an external headset known as Insight, to delete one citizen’s brain data from its portals and the cloud. The Insight headset monitors users’ brainwaves and can be used to track cognitive performance, including levels of attentiveness or stress, or to control devices. The court also required some of Chile’s regulatory bodies to further investigate Emotiv’s device.
Given Australia’s significant history in neurotechnology and its expertise in biomedical engineering, robust legal frameworks are needed now to ensure responsible use, safeguarding the privacy of our innermost selves while harnessing the potential benefits of this emerging field.
“Neurotechnology raises issues for many areas of law, and it seems that the attention of law reform bodies is now needed and engagement by the Australian Law Reform Commission would be a good start,” McCay suggests.