Google is aiming to prevent societal disasters caused by its AI technology with the creation of a dedicated ethics panel.
The panel is called the Advanced Technology External Advisory Council (ATEAC) and features a range of academics and experts from around the world.
Eight people are currently on the panel, hailing from as far afield as Hong Kong and South Africa. Among them are former US deputy secretary of state William Joseph Burns and University of Bath associate professor Joanna Bryson.
Bryson published a high-profile paper titled ‘Robots Should Be Slaves’, in which she argued against treating robots like people.
“In humanising them,” she wrote, “we not only further dehumanise real people, but also encourage poor human decision making in the allocation of resources and responsibility.”
ATEAC’s diverse makeup helps ensure a broad range of backgrounds is represented. The council will focus on areas where AI could have a harmful societal impact, such as facial recognition.
Just yesterday, AI News covered a report that found 94 percent of IT leaders want a greater industry focus on the ethical development of artificial intelligence.
Google has struggled to convince people its AI developments will not be harmful. Even some of its own employees resigned over the infamous Project Maven contract with the Pentagon to supply AI technology for drones.
The announcement suggests Google is attempting to ensure its own AI developments are ethical. That’s a welcome step, although it won’t convince everyone.
ATEAC’s first meeting will be held in April, with plans for three more over the course of 2019. A summary will be published after each, with the aim of improving ethical AI development across the whole industry.
Interested in hearing industry leaders discuss subjects like this and their use cases? Attend the co-located AI & Big Data Expo events with upcoming shows in Silicon Valley, London, and Amsterdam to learn more. Co-located with the IoT Tech Expo, Blockchain Expo, and Cyber Security & Cloud Expo.