Larkhall is thrilled to introduce Otto, our AI system designed to listen to and follow along with a live performance. This “following” ability feeds into a live-visuals generation module that can support or elevate the performance in a number of ways: creating reactive visuals for music, or providing automatic, expressive captioning for speech-based performance.
Otto can segment temporal events (e.g. section or scene changes) as well as separate out textural layers (e.g. is a given note melody or harmony? Which performer is speaking right now?). This is made possible by several approaches, deployed depending on the type of performance being followed: music vs speech, improvised vs scored or scripted.
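To give a flavour of the kind of analysis involved (this is an illustrative sketch rather than Otto’s actual code, and it runs offline on a file where Otto works on a live input), a basic section-boundary detector can be built by clustering audio features over time with an off-the-shelf library such as librosa:

```python
# Illustrative sketch only: estimate section-change times in a recording by
# clustering timbre + harmony features into contiguous segments.
import librosa
import numpy as np

def section_boundaries(path, n_sections=6):
    """Return estimated section-change times (in seconds) for an audio file."""
    y, sr = librosa.load(path)
    mfcc = librosa.feature.mfcc(y=y, sr=sr)          # timbre
    chroma = librosa.feature.chroma_cqt(y=y, sr=sr)  # harmony
    n = min(mfcc.shape[1], chroma.shape[1])          # align frame counts
    features = np.vstack([mfcc[:, :n], chroma[:, :n]])
    # Agglomerative clustering of contiguous frames into n_sections segments.
    bounds = librosa.segment.agglomerative(features, n_sections)
    return librosa.frames_to_time(bounds, sr=sr)

print(section_boundaries("performance.wav"))
```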
Through this we aim to increase accessibility by translating a performance’s audio experience into visual elements, in a creative, artistically compelling and immediate way.
The Challenge
Thanks to support from MyWorld’s Challenge Call led by Digital Catapult, we have been able to accelerate and deepen our development of three strands of Otto’s capabilities:
– Otto’s existing 2D projection ability
– Onstage pixel-mapping with custom LED lighting towers
– Automatic stage lighting control via live DMX generation
2D Projection
Prior to the MyWorld Challenge, this was Otto’s most highly developed capability. We worked with technical visual artist Rod Maclachlan and stage lighting designer Jen Roxburgh to refine this strand and push the quality of our production further.
Pixel-Mapping Towers
Industrial designer and maker Drew Batchelor came on board to develop our concept for taking visuals beyond a 2D screen. Our goal was to have portable, addressable pixel-mapping towers onstage with the performer, bridging the gap between the 2D visuals and the performer on the 3D stage. With Drew, we built a series of prototypes and a control system through which Otto can address individual pixels across a group of 1-12 towers. Initial testing showed that this is indeed an exciting experience, and we are continuing to develop it into a more advanced prototype.
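As a rough illustration of what “addressing individual pixels” means in practice (the tower count, pixels per tower and data layout below are assumptions for the sketch, not our hardware’s actual specification), the control layer can be thought of as a frame buffer indexed by tower and pixel:

```python
# Illustrative sketch, not our production control system: a frame buffer that
# lets a caller set any pixel on any tower, then flattens to per-tower RGB
# byte strings ready to hand to the LED drivers (transport omitted).
import numpy as np

N_TOWERS = 12          # Otto can drive groups of 1-12 towers
PIXELS_PER_TOWER = 60  # assumed pixel count per tower

# frame[t, p] = (r, g, b) for pixel p on tower t
frame = np.zeros((N_TOWERS, PIXELS_PER_TOWER, 3), dtype=np.uint8)

def set_pixel(tower, pixel, rgb):
    """Set one pixel on one tower, e.g. in response to a detected onset."""
    frame[tower, pixel] = rgb

def render():
    """Return one packed byte string per tower for the LED driver."""
    return [frame[t].tobytes() for t in range(N_TOWERS)]

set_pixel(tower=0, pixel=10, rgb=(255, 80, 0))
payloads = render()
```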
DMX control
Beyond what a performer can tour with, venues often have substantial lighting rigs which accept control signals via DMX (the standard protocol for digital control of theatre lighting). Normally this is achieved by programming lighting states into software such as QLab and having a technician cue changes during the performance.
Our goal here had two parts: first, generate DMX control signals live from Otto, allowing the lighting changes to be triggered automatically. Second, create an “adapter” layer allowing easy plug-and-play compatibility between predefined lighting states in Otto and the DMX required by specific models of lighting fixture which might be available at a venue.
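To sketch the first part (the packet layout follows the published Art-Net ArtDmx format, but the universe, channel assignments and levels below are placeholders rather than anything Otto actually sends), a full 512-channel DMX universe can be broadcast from Python over a plain UDP socket:

```python
# Illustrative sketch of sending one DMX universe over Art-Net (ArtDmx packet,
# UDP port 6454). The channel values below are placeholders, not a real cue.
import socket
import struct

def artdmx_packet(channels, universe=0, sequence=0):
    """Build an ArtDmx packet carrying up to 512 DMX channel values (0-255)."""
    data = bytes(channels[:512])
    if len(data) % 2:                        # DMX payload length must be even
        data += b"\x00"
    packet = b"Art-Net\x00"                  # packet ID
    packet += struct.pack("<H", 0x5000)      # OpCode: ArtDmx (little-endian)
    packet += struct.pack(">H", 14)          # protocol version 14
    packet += struct.pack("BB", sequence, 0) # sequence, physical port
    packet += struct.pack("<H", universe)    # SubUni + Net
    packet += struct.pack(">H", len(data))   # data length (big-endian)
    return packet + data

# Example: channel 1 (e.g. a dimmer) to full, channel 2 to half.
levels = [0] * 512
levels[0], levels[1] = 255, 128

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
sock.sendto(artdmx_packet(levels), ("255.255.255.255", 6454))
```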
While the first goal was surprisingly easy to achieve, the second turned out to be unrealistic: there are simply too many types of fixture, with no overarching standard for control, for us to develop our “adapter” layer in the time available. While this could be a good goal for future development, this new knowledge led us to park the strand for the rest of the Challenge in order to focus on developing our other strands further.
Additional Discoveries
In developing live speech-to-text and speech-following prototypes, we learned that there is a great deal of interest in, and enthusiasm for, the potential of this strand. We are therefore pivoting to focus more on developing it both for customers who want to use it in a purely functional way and for those who want to integrate expressive captioning into their stage show at a more fundamental, creative level.

Additionally, our collaboration with The Colour of Dinosaurs (Created by Lloyd Coleman & Otic. An Unlimited and Polka Theatre Partnership Commission, with support from Bristol Old Vic Ferment) has introduced a new input stream in the form of the Empatica wristband, which livestreams pulse, activity level, skin conductance and temperature from audience members. This provides a crowdsourced input stream that can be visualised, creating a feedback loop that performers can react to over the course of the performance.
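To illustrate how such a stream becomes something visualisable (the field names, units and pooling method here are assumptions for the sketch, not the wristband’s actual data format), per-wearer readings can be pooled into a single rolling “crowd” signal:

```python
# Illustrative sketch only: pool per-audience-member biometric samples into a
# single rolling "crowd" signal that a visuals module could respond to.
# Field names and units are assumed, not the Empatica wire format.
from collections import defaultdict, deque
from statistics import mean

WINDOW = 30  # keep the last 30 samples per wearer

# wearer id -> recent heart-rate samples (beats per minute)
pulse = defaultdict(lambda: deque(maxlen=WINDOW))

def on_sample(wearer_id, bpm):
    """Called whenever a new pulse reading arrives from a wristband."""
    pulse[wearer_id].append(bpm)

def crowd_pulse():
    """Mean of each wearer's recent average: one number to visualise."""
    per_wearer = [mean(samples) for samples in pulse.values() if samples]
    return mean(per_wearer) if per_wearer else None

on_sample("audience-01", 72)
on_sample("audience-02", 88)
print(crowd_pulse())  # -> 80.0
```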
Next Steps
Working with collaborators and mentors over the remainder of the Challenge Call will enable us to refine our prototypes and develop a go-to-market strategy for one or more of these strands. In part, we will perform our own show using all of these strands as a way of demonstrating Otto’s capabilities. Additionally, we will document our work with our project partners for use in further outreach and development conversations.
Experience Otto
Larkhall will be performing and offering workshops across the South West in 2023 and beyond. Shows will be announced via Larkhall’s website and social media. Please come and say hello at one of the shows; we’d love to meet you!
—
By Charlie Williams, Director, Larkhall, MyWorld Challenge Call project