Tuesday, August 5, 2025


UK Plans AI Experiment on Children Seeking Asylum


The United Kingdom’s announcement on July 22 that it would use AI face-scanning technology to evaluate whether an asylum seeker is under age 18 threatens to harm children seeking refuge. The asylum minister, Angela Eagle, said the decision was made because this experimental technology is likely to be the cheapest option.

Experimenting with unproven technology to determine whether a child should be granted protections they desperately need and are legally entitled to is cruel and unconscionable.

Facial age estimation has not been independently evaluated in real-world settings. Companies that tested the technology in a handful of supermarkets, pubs, and websites set their systems to predict whether a person looks under 25, not under 18, allowing a wide error margin for algorithms that struggle to distinguish a 17-year-old from a 19-year-old.

AI face scans were never designed for children seeking asylum, and risk producing disastrous, life-changing errors. Algorithms identify patterns in the distance between nostrils and the texture of skin; they cannot account for children who have aged prematurely from trauma and violence. They cannot grasp how malnutrition, dehydration, sleep deprivation, and exposure to salt water during a dangerous sea crossing might profoundly alter a child’s face.

These AI systems’ inability to explain or reproduce results further erodes a child’s right to redress and remedy following wrongful assessment, while creating new privacy and non-discrimination risks.

The UK government has repeatedly and illegally subjected children seeking asylum to abusive conditions by wrongly classifying them as adults. Just last week, the UK’s chief inspector of borders and immigration wrote about “young people who felt disbelieved and dismissed by the Home Office, whose hopes have been crushed, and whose mental health has suffered.”

The government plans to tender for contracts in August and to deploy the technology in 2026. It should stop and instead follow the chief inspector’s recommendations to fix flawed age assessment processes. These processes should adhere to international standards, be used only as a last resort to resolve serious doubts about a person’s declared age, and be conducted by professionals trained in child protection and trauma.

And the government should listen to the young person who, after surviving her difficult journey to the UK, told inspectors that the government “should not be judging people on their appearance on the first day they meet them.”

Story from www.hrw.org

Disclaimer: The views expressed in this article are solely those of the author(s), expressed in their private capacity.
