INTERPOL hosted an event in Singapore bringing leading experts together with the aim of examining how AI will affect crime and prevention.
The event, organised by INTERPOL and the UNICRI Centre for AI and Robotics, was held at the former’s Global Complex for Innovation. Experts from across industries gathered to discuss issues and several private sector companies gave live demonstrations of related projects.
Some technological advances in AI pose a threat. In a recent interview, Irakli Beridze gave us an example of AI potentially being used for impersonation, which could eventually enable completely automated fraud.
Speaking about the Singapore event, Beridze said:
“I believe that we are taking critical first steps to building a platform for ‘future-proofing’ law enforcement.
Initiatives such as this will help us to prepare for potential future types of crime and capitalize on technological advancements to develop new and effective tools for law enforcement.”
Bringing policing up to date on these emerging threats is vital. Some 50 law enforcement participants from 13 countries attended the event to exchange expertise with the private sector and academia.
Some of the potential use cases for AI in law enforcement were fascinating. Discussions covered topics such as conducting virtual autopsies, predicting crime to optimise resource allocation, detecting suspicious behaviour, combining AI with blockchain technology for traceability, and automating patrol vehicles.
Anita Hazenberg, Director of INTERPOL’s Innovation Centre, commented:
“Innovation is not a matter for police alone. Strong partnerships between all stakeholders with expertise is necessary to ensure police can quickly adapt to future challenges and formulate inventive solutions.”
Naturally, there are many obstacles to overcome both technologically and socially before such ideas can be used.
One major concern is AI bias. Where facial recognition and behaviour detection are concerned in particular, there's potential for automated racial profiling.
A 2010 study by researchers at NIST and the University of Texas at Dallas found that algorithms designed and tested in East Asia are better at recognising East Asians, while those designed in Western countries are more accurate at recognising Caucasians.
Private sector companies gave several live demonstrations at the event, including virtual communications, facial recognition, and incident prediction and response optimisation systems.
Police forces are planning to invest heavily in AI. Singapore Police, for example, has deployed patrolling robots and shared its experience with them during the conference.
Next on INTERPOL’s agenda is drones. The organisation will hold a drone expert forum in August to further assist police in understanding how drones can be a tool, a threat, and a source of evidence.
What impact do you think AI will have on crime and policing? Let us know in the comments.