The players are seated in a grid (for instance, 3 rows by 4 columns, for a total of 12 people). Their mobile devices form a matrix of screens and loudspeakers that is used to spatialize sound and light.
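As a minimal sketch of this layout (the row-major indexing and the names `numRows` and `numCols` are assumptions for illustration, not taken from the actual application), each device can be addressed by its position in the grid:

```js
// Hypothetical grid addressing: devices are indexed row by row,
// so the device at (row, col) has index row * numCols + col.
const numRows = 3;
const numCols = 4;

function deviceIndex(row, col) {
  return row * numCols + col;
}

function devicePosition(index) {
  return { row: Math.floor(index / numCols), col: index % numCols };
}

console.log(deviceIndex(2, 3));  // 11 (the last seat of a 3-by-4 grid)
console.log(devicePosition(5));  // { row: 1, col: 1 }
```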

For now, the matrix is performed by one player at a time: a representation of the matrix appears on the screen of the player who becomes the performer. By moving a finger over this on-screen matrix, the performer controls which smartphone(s) the light and sound come from in the real world. (The sound changes with the speed of the finger trajectory.) In this way, the performer remotely plays the other players' instruments. After a fixed time, another player takes over control of the sound and light and becomes the new performer.
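A minimal sketch of what the performer's side might look like, assuming a socket.io client and a full-screen element for the on-screen matrix (the element id, the 'position' event name, and the payload shape are assumptions for illustration, not the application's actual protocol):

```js
// Hypothetical performer-side touch handling (browser).
// Assumes the socket.io client is loaded as `io` and the on-screen
// matrix fills an element with id "matrix".
const socket = io();
const matrix = document.getElementById('matrix');

let last = null; // previous touch sample, for speed estimation

matrix.addEventListener('touchmove', (event) => {
  event.preventDefault();
  const touch = event.touches[0];
  const rect = matrix.getBoundingClientRect();

  // Normalize the finger position to [0, 1] in both dimensions.
  const x = (touch.clientX - rect.left) / rect.width;
  const y = (touch.clientY - rect.top) / rect.height;
  const now = performance.now();

  // Finger speed in normalized units per second, used to vary the sound.
  let speed = 0;
  if (last) {
    const dt = (now - last.t) / 1000;
    if (dt > 0) {
      speed = Math.hypot(x - last.x, y - last.y) / dt;
    }
  }
  last = { x, y, t: now };

  // Send the position and speed to the server, which forwards
  // them to the devices of the matrix.
  socket.emit('position', { x, y, speed });
}, { passive: false });
```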

The video gives an idea of the technical setup. While the players are usually seated 1 or 2 meters apart, the smartphones are placed only a few centimeters apart for the purposes of this video.

The sound is generated locally on the mobile devices, which are connected to a WebSocket server (using node.js and socket.io). The server receives the finger position from the performer's device and controls the sound generators of all the devices in the matrix.
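A minimal sketch of such a relay server, assuming socket.io on node.js (the 'position' event name and the port are assumptions; the actual application presumably also handles assigning the performer role):

```js
// Hypothetical relay server: receives the performer's finger position
// and broadcasts it to every connected device of the matrix.
const http = require('http');
const { Server } = require('socket.io');

const httpServer = http.createServer();
const io = new Server(httpServer);

io.on('connection', (socket) => {
  // The performer's device emits 'position' messages; every other
  // device receives them and decides locally whether it should
  // produce sound and light (e.g. if the finger is over its cell).
  socket.on('position', (msg) => {
    socket.broadcast.emit('position', msg);
  });
});

httpServer.listen(3000);
```

On each device of the matrix, a socket.io client would then listen for these 'position' messages and drive its local sound and light, for instance by comparing the received position with its own cell in the grid.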

This mobile web application was developed in the context of Collective Sound Checks with Studio 13/16 at the Centre Pompidou.