Unexpec

A forum where art, science, business, creative industries and philosophy come together to shape the future.

AI can now read sheet music -- and trigger a video recording

Published
February 27, 2025
Now the new concert venue at Science Village Hall has been inaugurated — and already in its first month it hosted a test of software that tracks where the musicians are in a piece. The software makes it possible to create events that are triggered when the musicians reach a certain place in the score — in this show it was about switching camera angles for a recording, but in the future it could be used for other things as well. “I see the possibility of integrating it into our work to create new digital concert experiences and using it to initiate text messages, display websites or start other activities,” says Jesper Larsson, project manager in Kalaudioscope.

Image: Musicians from the Malmö Opera Orchestra under the direction of Anton Lasine at Science Village Hall.

In the Kalaudioscope/LUDICH project, several actors work together to create opportunities for individualized streaming of concerts and performing arts. The idea is that audience members will be able to have their own specially designed visual and sound experience: you can, for example, sit at home in peace and quiet, follow a digital concert live, and choose for yourself which camera angle and sound you prefer. This means that more people would be able to access concert experiences and that the industry would reach a new audience with partly new content.

Two of the project's three years have now passed, and several tests have been conducted of how to combine microphones and cameras to create a really good result. A new opportunity for testing arose when the Malmö Opera Orchestra in January 2025 performed in the newly opened concert venue Science Village Hall in The Loop, next to ESS at Brunnshög. During the evening, the string ensemble gave a programme including Vivaldi's The Four Seasons. The company OnStageAI had the opportunity to demonstrate its software before the concert. The software can listen to music and, by comparing sound waves, determine where in a score the musicians are. Based on this, a producer can, for example, give advance instructions for when it is time for camera zooms and movements.
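The article does not describe OnStageAI's actual method, but "comparing sound waves to determine where in a score you are" is the classic audio-to-score alignment problem. A minimal sketch of one standard approach is open-end dynamic time warping: the live audio and a reference rendition are each turned into per-frame feature vectors (the toy one-hot vectors below stand in for real features such as chroma), and the cheapest alignment path tells you which reference frame — and hence which score position — best matches the live stream so far. All names here are illustrative, not from the project.

```python
import numpy as np

def score_position(reference, live):
    """Estimate the current score position with open-end dynamic time warping.

    reference, live: 2-D arrays (frames x features), e.g. chroma vectors.
    Assumes the live stream starts at the beginning of the piece; the end
    is left open, since the musicians are usually mid-piece.
    Returns the 0-based reference frame matched to the last live frame.
    """
    n, m = len(reference), len(live)
    # Pairwise Euclidean distance between every reference and live frame.
    cost = np.linalg.norm(reference[:, None, :] - live[None, :, :], axis=2)
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0  # fixed start: both sequences begin together
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            acc[i, j] = cost[i - 1, j - 1] + min(
                acc[i - 1, j],      # reference advances (musicians rushing)
                acc[i, j - 1],      # live advances (musicians slowing down)
                acc[i - 1, j - 1],  # both advance in step
            )
    # Open end: the reference frame whose path to the last live frame is cheapest.
    return int(np.argmin(acc[1:, m]))

# The musicians play the first half of the piece at half tempo: each of the
# first four reference frames is heard twice in the live stream.
reference = np.eye(8)                             # 8 score frames
live = np.repeat(np.eye(8)[:4], 2, axis=0)        # 8 live frames, frames 0-3 doubled
print(score_position(reference, live))            # -> 3: halfway through the score
```

Because only the end of the path is open, the tracker tolerates tempo changes (the asymmetric steps absorb stretching and compression) but not jumps to another place in the score; handling repeats and cuts, as the article says the demonstrated software does, requires a more elaborate follower.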

Jesper Larsson, Head of Malmö Opera's Production, Planning and Audience Section and Project Manager for Kalaudioscope/LUDICH

- Software has long been able to make a recording from reference material and predetermined instructions, and to control cameras according to a predetermined plan. What is new about this software is that, by using AI, it knows where in a score the musicians are and can adapt if the musicians play at a different tempo, or perhaps even jump to another place in the score, says Jesper Larsson, head of Malmö Opera's production, planning and audience section and project manager for Kalaudioscope/LUDICH.

The test was conducted during the musicians' rehearsal for the concert. Six network cameras were deployed to feed different images into the system. OnStageAI had also prepared the production by marking camera switches, zooms, and other camera changes in the software that follows the score. The musicians played the same piece twice at different tempos. In both cases the software was able to keep track of where in the score the musicians were and, based on this, switch cameras according to the instructions embedded in the program, thus creating a pre-cut recording of the concert.

- The main purpose is to help an image producer, but you could of course imagine placing markings in the score that trigger completely different things: alarms, text messages or displayed texts. It could help a sound or lighting technician know when a particular moment in a performance is coming, or support the stage manager's cueing of scene changes and so on. Everything that can be triggered by a data signal can then be controlled by the musicians' playing. This is what we hope to use in our project, says Jesper Larsson.
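The mechanism Larsson describes — markings in the score that fire arbitrary actions once the follower reaches them — can be sketched as a small cue sheet keyed to score positions. The `CueSheet` class below is a hypothetical illustration, not the project's software: the score follower reports its current bar, and every cue crossed since the last report fires, with a rewind on backward jumps so repeats can re-trigger their cues.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class CueSheet:
    """Map positions in a score (here bar numbers) to arbitrary actions."""
    cues: dict[int, list[Callable[[], None]]] = field(default_factory=dict)
    _last_bar: int = -1

    def add_cue(self, bar: int, action: Callable[[], None]) -> None:
        """Register an action (camera switch, lighting cue, text message...)."""
        self.cues.setdefault(bar, []).append(action)

    def update(self, bar: int) -> None:
        """Feed in the score follower's current bar estimate.

        Fires every cue crossed since the last update; a backward jump
        (e.g. a repeat) rewinds so those cues can fire again.
        """
        if bar < self._last_bar:
            self._last_bar = bar - 1  # rewind after a jump back in the score
        for b in range(self._last_bar + 1, bar + 1):
            for action in self.cues.get(b, []):
                action()
        self._last_bar = bar

# Illustrative usage: log which cues fire as the follower advances.
fired = []
sheet = CueSheet()
sheet.add_cue(16, lambda: fired.append("switch to camera 2"))
sheet.add_cue(32, lambda: fired.append("dim house lights"))
sheet.update(10)   # nothing yet
sheet.update(20)   # crosses bar 16 -> camera cue fires
sheet.update(33)   # crosses bar 32 -> lighting cue fires
print(fired)       # -> ['switch to camera 2', 'dim house lights']
```

In this design the cue sheet is deliberately ignorant of what an action does; anything controllable by a data signal, as the quote puts it, can be wrapped in a callable and attached to a bar.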

OnStageAI has come far in developing its software. It has now begun live testing with different orchestras and is just about to launch its product commercially. In February, a successful test was conducted with the London Symphony Orchestra across two symphonic concerts at the Barbican. The software is suitable for everything from chamber music and small ensembles to full symphonic scale with soloists. Traditional recording techniques can be used, and the camera changes are planned with great precision. The sound used is the feed from the sound engineers' mixing desk. And even though this produces a finished video recording, the material from all the cameras is saved, so the cut can be changed afterwards and other mixes, perhaps even several versions, can be made.

The LUDICH project aims to create the best possible conditions for a new generation of digital live performances. In the project, the actors work with interactive streaming solutions and digital formats that allow users to choose and customize their concert experience. This is made possible by the latest technology and the combined expertise of the project's collaboration partners: six faculties at Lund University, Malmö Opera, Malmö Live Concert Hall/MSO, Axis Communications, Amazon Web Services, Cinfo, Capgemini, Future by Lund, Helsingborg Arena och Scen, Science Village Hall and others.

Image: OnStageAI had prepared the production by marking camera switches, zooms, and other camera changes in the software that follows the score. The screen shows the score with camera-switch markings on the right, the feeds from all cameras at the bottom left, and the camera currently selected for the video at the top left.