In the bustling Roppongi district of Tokyo’s Minato ward sits the Mori Art Museum, a contemporary art space nestled in the 54-story Roppongi Hills Mori Tower.
A recent exhibit at the Mori Art Museum, titled “MACHINE LOVE: Video Game, AI and Contemporary Art,” presented approximately 50 works of contemporary art that use game engines, AI, and virtual reality (VR). Among the works exhibited was German-American artist Diemut Strebe’s “El Turco/Living Theater,” featuring code written by Kahlert School of Computing Assistant Professor Ben Greenman.
The piece presents two character puppets on screen. The puppets speak aloud as their lips move in sync, and their words appear on screen like a chat history. One puppet portrays an inventor of smart home devices being interviewed by the other. The conversation can take different turns, so each performance is unique.
One puppet is controlled by a human; the other is controlled by Anthropic’s Claude AI. The audience is faced with a challenge: which puppet is the AI, the inventor or the interviewer? Does it matter?


Photos of “El Turco/Living Theater” courtesy of elturco.diemutstrebe.com
Behind the scenes, the project combines several technologies, including the Claude API, Azure text-to-speech, Amazon speech-to-text, and Unreal Audio to Face. The piece uses the Racket programming language, co-developed by Kahlert School Professor Matthew Flatt, to synchronize these components in an event-based framework. For example, Audio to Face can sleep until Claude has written the next part of its puppet’s script.
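The exhibit’s actual code is not shown here, but a minimal sketch of that kind of event-based coordination, using Racket’s built-in threads and channels with hypothetical placeholder names (script-channel, claude-thread, lip-sync-thread), might look like the following: one thread stands in for Claude producing lines of script, and another, standing in for the lip-sync stage, stays blocked until the next line arrives.

```racket
#lang racket

;; Two threads coordinate over a channel. The producer stands in for Claude
;; writing the next part of its puppet's script; the consumer stands in for
;; the lip-sync stage, which stays asleep until a new line is ready.
;; (Illustrative sketch only; names and script lines are made up.)
(define script-channel (make-channel))

(define claude-thread
  (thread
   (lambda ()
     (for ([line (in-list '("I build smart home devices."
                            "Ask me anything about them."))])
       (sleep 1)                     ; stand-in for waiting on the Claude API
       (channel-put script-channel line))
     (channel-put script-channel 'done))))

(define lip-sync-thread
  (thread
   (lambda ()
     (let loop ()
       ;; `sync` blocks until the channel (an event) delivers the next line.
       (define line (sync script-channel))
       (unless (eq? line 'done)
         (printf "animating puppet for: ~a\n" line)
         (loop))))))

(thread-wait lip-sync-thread)
```

In the actual installation the events would presumably come from the Claude API and the speech and animation services rather than a hard-coded list, but the blocking pattern is the same: each stage wakes only when the previous one has produced something for it.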
Select performances of “El Turco/Living Theater” are available on the artist’s YouTube channel.
Greenman would like to extend special thanks to Varun Shankar for providing access to a machine used for software development.