‘White Noise’ is the outcome of a residency undertaken at the Creative Computing Lab, funded by the Alan Turing Institute. The residency focused on utilising Louis McGallum’s research models into regenerative music.
The composition is a collage of outputs from resynthesis models trained on a database built from six hours of readings of Merlin Sheldrake’s writing. At its core, ‘White Noise’ serves as an exploration of the computer’s perception of the human voice, a particularly relevant inquiry as voice technology takes centre stage in contemporary AI research. The project’s primary goal is to delve into how humans perceive the processing of audio data.

The project explores how the generated sounds could be perceived as music, or non-music. The generated outcomes were outside the artist’s control and were only arranged together, without any filters or additional sound design. This offers insight into the nuanced relationship between human agency and machine-generated content, highlighting the need to strike a balance between the two in the creative process.

A selection of the composition is later remixed into a bass-heavy track to be played in a club, pushing the boundaries of sound design.

@malougistics / soundcloud / vanderveldmalou@gmail.com