An Apple Power Mac G5 and a cello are being combined to create a new kind of musical instrument which will be premiered in London this week.
Although not an instrument with a new physical shape like digital-age meta-trumpets or hyper-flutes, the cello/Mac combo marks an evolutionary step in the use of computers for music composition and performance.
The instrument comprises a cello and its player, both connected via sensors to the G5 and the virtual devices encoded in its software.
Instead of simply launching pre-programmed samples via the sensors, the entire instrument is played by the soloist.
Professor John Croft, a member of the development team that designed the instrument, and the composer of one of the first pieces to be played on it, said: "By using a number of sensor devices on the performer and instrument, and by using treatments which closely track the sound of the instrument, I create an environment in which the soloist plays the cello-plus-computer as a single extended instrument."
"No sound files are being triggered, which is 90 per cent of what people are doing when they use sensors with music performance. In music for solo instruments and electronics, one often finds the conventional relationship of soloist and accompaniment."
His colleague Professor Peter Wiegold shares this scepticism about the use of Midi in composition and performance, however impressive the sound it creates. "Midi-sequencing we all know is fixed. There is no life in it," he said.
The software used in the project, Max/MSP, analyses, records and responds to the live cello. It works in a similar way to the loop machines solo guitarists use to accompany themselves, but takes the concept further.
The sound from the cello is not merely played back, but transformed into a live, dynamic accompaniment.
Depending on the cellist's actions, the bowing speed and the way the cello responds to how it is played, different chains of reactions unfold, building a landscape of intertwined 'real' and computer-generated sounds.
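The principle described above, continuously mapping how the instrument is played onto effect parameters instead of triggering fixed sound files, can be sketched in a few lines of code. This is a hypothetical illustration, not the Brunel team's actual Max/MSP patch; the function names, the bow-speed range and the delay-feedback mapping are all invented for the example.

```python
# Hypothetical sketch of sensor-driven live processing (not the actual
# Max/MSP patch used in the project). Bow speed is mapped continuously
# to a delay-feedback amount, so the electronic accompaniment grows out
# of how the cello is played rather than from pre-recorded samples.

def map_bow_speed_to_feedback(bow_speed, max_speed=2.0):
    """Map a bow-speed sensor reading (m/s, invented range) to a
    delay-feedback amount between 0.0 and 0.9."""
    clamped = max(0.0, min(bow_speed, max_speed))
    return 0.9 * clamped / max_speed

def process_frame(samples, feedback, delay_buffer):
    """Mix each incoming cello sample with the delayed signal.

    The delay buffer is fed with the processed output, so past playing
    keeps echoing into the present accompaniment."""
    out = []
    for s in samples:
        delayed = delay_buffer.pop(0)   # oldest delayed sample
        y = s + feedback * delayed      # live sound plus transformed past
        delay_buffer.append(y)          # write the result back into the loop
        out.append(y)
    return out
```

A faster bow stroke raises the feedback amount, so for example `map_bow_speed_to_feedback(2.0)` returns the maximum of `0.9`, while a stationary bow returns `0.0` and the computer falls silent with the player.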
The inaugural performance is the result of an 18-month project at Brunel University where Professor Wiegold is head of music research and Professor Croft is head of music. They worked with music software programmer Carl Faia, and the world-renowned cellist Matthew Barley.
Wiegold's piece, entitled The Burden'd Air and inspired by William Blake's The Marriage of Heaven and Hell, and Croft's new work Sonata, which takes its inspiration from Renaissance pieces, will be performed by Barley on the opening night of the IF:06 contemporary music festival in central London on 26 February.
Wiegold said that the basic objective of the project was to create "a good piece of music". But it was also about "the live interaction between musician and technology, and it being dynamic rather than passive".