A new research initiative at the University of Melbourne aims to determine whether robots equipped with artificial intelligence can deliver genuine comedy. Led by Dr Robert Walton, a dean’s research fellow in the faculty of fine arts and music, the project has received a grant of approximately $500,000 from the Australian Research Council. Unlike existing AI models, which largely treat humor as a text problem, Walton’s approach focuses on non-verbal, physical comedy.
Walton believes that while robots often elicit laughter through accidental mishaps, such as falling over, their attempts at intentional humor tend to fall flat. “Robots are good at making people laugh… they are humorous because they break and bump into things,” he explains. “However, when they try to do something funny on purpose, it ain’t so funny anymore.”
The project involves training a group of around ten robots: not humanoids, but ground vehicles ranging in height from 40 centimeters to two meters. Initially the focus will be on visual comedy, with the robots learning through human interaction. They will be designed to detect subtle movements and cues, such as a tilted head or laughter, so they can respond to an audience in comedic contexts.
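The article does not describe the team’s software, but the loop it hints at (perform a bit, read non-verbal cues such as laughter or a tilted head, then adjust the next attempt) can be sketched in a few lines. The Python below is purely illustrative: the detect_laughter and detect_head_tilt stubs stand in for real audio and vision models, and the reward-weighted choice of gag is an assumption made for the sake of the example, not the project’s actual method.

```python
import random

# Hypothetical repertoire of visual gags; the names are illustrative only.
GAGS = ["slow double take", "exaggerated wobble", "sudden freeze"]

def detect_laughter() -> float:
    """Stub for an audio model scoring audience laughter (0 = silence, 1 = roar).
    Returns a random value here so the sketch runs end to end."""
    return random.random()

def detect_head_tilt() -> float:
    """Stub for a vision model scoring engaged head tilts in the crowd."""
    return random.random()

def audience_reward() -> float:
    """Combine non-verbal cues into one feedback signal (the weights are assumed)."""
    return 0.7 * detect_laughter() + 0.3 * detect_head_tilt()

def run_set(rounds: int = 20) -> dict[str, float]:
    """Perform gags, keep a running average reward per gag, and favour what lands."""
    scores = {gag: 0.5 for gag in GAGS}   # optimistic prior so every gag gets tried
    counts = {gag: 1 for gag in GAGS}
    for _ in range(rounds):
        # Mostly repeat the best-scoring gag, occasionally try another at random.
        gag = max(scores, key=scores.get) if random.random() > 0.2 else random.choice(GAGS)
        reward = audience_reward()
        counts[gag] += 1
        scores[gag] += (reward - scores[gag]) / counts[gag]  # incremental mean
    return scores

if __name__ == "__main__":
    print(run_set())
```

In effect this is a tiny multi-armed-bandit loop; a real system would presumably replace the stubs with trained perception models and a much richer learning signal.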
Dr Walton describes the robots as akin to infants who are just beginning to understand the world around them. “That’s partly what we’re trying to do with machine learning and AI—giving it more ways to sense and build a more holistic understanding of what it means to be in the world,” he states. He emphasizes that standup comedy provides a unique platform for clear interaction between the robot and the audience, allowing for immediate feedback.
When asked if vocal capabilities would eventually be integrated into the robots, Walton responded, “Potentially. It depends on how we go.” This project raises important questions about the role of AI in creative fields, particularly in an industry where job security is increasingly a concern due to automation.
Beyond exploring whether robots can understand and perform comedy, Walton’s research also investigates the implications of humor in human-robot interaction. He notes that humor can be a tool for connection, but it can also be used manipulatively. “While I’m looking into building belief in comedy performance by machines, I’ve got this other eye on what does it mean and how might this be used coercively,” he explains.
Despite the ambitious goals of the project, skepticism remains about the feasibility of making robots genuinely funny. At the recent G’Day USA arts gala, Australian comedian Tim Minchin expressed doubts, stating that audiences are drawn to the human experience behind comedy. “AI might come for the perfectible stuff but never for our flaws. Our flaws are our humanity,” he remarked.
Similarly, Susan Provan, director of the Melbourne comedy festival, highlighted the essence of human originality in comedy. “A performer is bringing something only they can bring, because they are bringing their individual lived experience to the material,” she said. Provan added that the humor derived from a robot’s failure to perform could be amusing, but it would not equate to genuine comedy.
As the project progresses, Dr Walton and his team remain committed to understanding not only the mechanics of humor but also the broader implications of AI in creative industries. By exploring the interplay of comedy and technology, they hope to shed light on the evolving relationship between humans and machines.