International treaty to curb military use of AI 'urgent and essential', Oireachtas committee to hear

A military weapons expert will argue that it would be difficult to hold human operators or companies accountable 'for the unpredictable actions of a machine they cannot understand'
An international treaty to restrict the use of artificial intelligence (AI) for military decision-making and autonomous weapons is now “urgent and essential”, the Oireachtas AI Committee is set to hear.

Bonnie Docherty, a Harvard law lecturer and military weapons expert, is set to tell the committee that the use of autonomous weapons systems in peacetime would likely contravene international human rights law, in that humans could be “arbitrarily deprived of life” by a system operating without human reason.

Ms Docherty will tell the committee — which has been considering the impact of AI on international defence for several weeks — that autonomous weapons and AI are problematic for several reasons, legal and moral, and because “they could lower the threshold to war and lead to an arms race”.

“Machines cannot understand the true value of a human life because they are not themselves living beings. In addition, they would instrumentalise and dehumanise their targets by relying on algorithms that reduce people to data points,” she is due to say.

She will say that autonomous weapons risk being discriminatory by default, given that — as has been seen with other AI technology such as X’s Grok chatbot — “algorithmic bias can disproportionately and negatively affect already marginalised groups and discriminate against people based on such categories as race, sex, or disability”.

She will add that the various problems afflicting the use of AI in international defence are exacerbated by “a gap in accountability” for any harms that may be caused by autonomous weapons.

Ms Docherty will argue that it would be difficult to hold human operators or companies accountable "for the unpredictable actions of a machine they cannot understand", and that holding individual programmers or developers accountable would likewise be legally problematic.


“A legally binding instrument is needed to adequately address this plethora of problems. It should prohibit the autonomous weapons systems that inherently operate without ‘meaningful human control’ and those that target people,” she will say.

Informal discussions between countries should give rise to full official talks as soon as possible, she will argue, given the rapid pace at which AI technology is developing and changing.

“Given the speed of technological development and gravity of the threat these systems pose, moving diplomatic discussions to formal treaty talks is urgent and essential,” Ms Docherty will say.

'Killer robots'

Separately, AI advisor to Unesco, Rosanna Fanni, will tell the committee that AI in warfare is not limited to the cultural idea of “killer robots”.

“AI facilitates many non-kinetic processes, including intelligence analysis, decision-support systems, and logistics operations,” Ms Fanni will say. 

“These uses still have significant ethical impact and raise important governance considerations,” she will add.


© Examiner Echo Group Limited