A musical performance through AI eyes

How do machines “see” a musical performance? The artist Trevor Paglen—whose probing work has included locating underwater cables and charting US military bases in the desert—explores this in a new performance piece, Sight Machine, to be staged at Pier 70 in San Francisco today (14 January) at 8pm.

The San Francisco-based Kronos Quartet will play a live 12-song set ranging from Bach to blues, to an audience of both people and machines. During the first song, Bach’s Contrapunctus II, a live video feed of the performance will be projected above the musicians. For the rest of the set, however, machines will interpret the video feed, and the resulting algorithmic visuals will be projected above the musicians instead, allowing the (human) audience to see the performance through artificial intelligence “eyes”.

The performance, a collaboration with the creative studio Obscura Digital, is a commission by Stanford University’s Cantor Arts Center, where Paglen is currently the institution’s first artist in residence.

Paglen will also discuss Sight Machine, artificial intelligence and ethics today at 11am at the FOG Design + Art Fair in San Francisco with Kate Crawford of Microsoft Research in New York; Alison Gass, the Cantor Arts Center’s associate director of collections, exhibitions and curatorial affairs; and Jennifer Granick, the director of civil liberties at the Stanford Center for Internet and Society.