When the “world’s first robot artist” was announced as giving evidence to a parliamentary committee, the House of Lords was probably hoping to shake off its dormant reputation.
Unfortunately, when the Ai-Da robot arrived at the Palace of Westminster on Tuesday, the opposite appeared to be the case. Perhaps because of the packed atmosphere, the machine, which resembles a sex doll with a pair of egg whisks, shut down halfway through the evidence session. As its creator, Aidan Meller, scrambled with power sockets to restart the device, he placed a pair of sunglasses on the machine. “When we reset her, she can sometimes pull very interesting faces,” he explained.
The headlines that followed were unlikely to be what the Lords Communications Committee had hoped for when Meller and his creation were invited to give evidence as part of an inquiry into the future of the UK’s creative economy. But Ai-Da is part of a long line of humanoid robots that have dominated the conversation around artificial intelligence, even though the technology that underpins them is anything but cutting edge.
“The committee members and the roboticists all knew they were part of the fraud,” said Jack Stilgoe, an academic at University College London who researches the governance of emerging technologies. “It’s an evidence hearing, and all we’ve learned is that some people really like puppets. There’s little intelligence on display – artificial or otherwise.
“If we want to learn about robots, we need to get behind the scenes; we need to hear from the roboticists, not the robots. Ask roboticists and computer scientists what computers can’t do, rather than marvelling at their pretensions.
“There are really important questions about AI and art – who really benefits? Who owns creativity? How can those who provide AI’s raw material – like the millions of previous artists in Dall-E’s dataset – get the credit they deserve? Ai-Da clouds this discussion rather than helping it.”
Stilgoe is not alone in lamenting the missed opportunity. “I can imagine that Ai-Da has many purposes, and many of them may be good,” said Samuel Kaski, a professor of AI at the University of Manchester. “The unfortunate problem is that this time the public stunt failed and gave the wrong impression. And if the expectations were really high, those who saw the demo might generalise: ‘Oh, this field doesn’t work, this technology doesn’t work in general.’”
In response, Meller told the Guardian that Ai-Da “is not a fraud, but reflects our own current human efforts to decode and imitate the human condition. The artwork encourages us to reflect critically on these social trends and their ethical implications.
“Ai-Da is part of a Duchampian discourse in contemporary art and follows in the footsteps of Andy Warhol, Nam June Paik and Lynn Hershman Leeson, all of whom explored the human form in their art. Ai-Da is considered in the tradition of Dada, which challenged the concept of ‘art’; Ai-Da challenges the concept of ‘artist’. While good contemporary art can be controversial, our overall goal is to stimulate a broad and considered conversation.”
As peers on the Lords committee heard before Ai-Da arrived on the scene, AI technology is already having a significant impact on the UK’s creative industries – and not just in the form of humanoid robots.
“There has been very clear progress, especially in the last two years,” said Andres Guadamuz, an academic at the University of Sussex. “Compared with what was possible seven years ago, the capabilities of artificial intelligence are at a completely different level. Even in the last six months, things have been changing, especially in the creative industries.”
Guadamuz appeared alongside representatives from Equity, the performers’ union, and the Publishers Association, as the trio discussed the ways in which recent advances in AI capability are having real effects on the ground. Equity’s Paul Fleming, for example, raised the possibility of synthetic performances, where AI is already “directly influencing” actors’ working conditions. “For example, why would you engage multiple artists to put together all the moves that go into a video game if you can just mine the data? And it’s very complicated for a person to break away from that.” If an AI could watch every performance of a given actor and create character models that move like them, that actor might never work again.
Dan Conway from the Publishers Association said the same risks apply to other creative industries, and that the UK government is making them worse. “There is a research exemption in UK law … and at the moment, the legal provision allows any of those businesses, of any size, anywhere in the world, to freely access all of my members’ data for text and data mining purposes … between a large US tech company and an AI micro startup in the north of England – no difference.” Technologist Andy Baio has called this process “AI data laundering”, describing how a company like Meta can train its video-creating AI using 10m video clips scraped for free from a stock photo site.
The Lords inquiry into the future of the creative economy continues. No robots, physical or otherwise, are scheduled to testify.