University of York
iGGi PG Researcher
Available for post-PhD position
Kyle is a composer, programmer and researcher designing AI music tools for game composers that don't step on their toes. He is also an IMDb-accredited composer, a former business owner and a session musician. Kyle's PhD research first looked to understand why music generation has not been widely adopted in video games, compared with visual procedural generation, even though games, as a long-form medium, can cause listener fatigue in players through repeated exposure to the same music.
Through interviews with 11 professional composers, Kyle found that their concerns are multifaceted, including, but not limited to, low-quality generative output and loss of authorship. These interviews have helped to focus his further research on improving the expressive quality of generative music and MIDI mock-ups by developing an assumption-free pipeline that needs only the pitch, ontime and duration of MIDI notes to create expressive performances. In listening studies, his algorithm (CFE+P) has been shown to outperform an inexpressive baseline, a randomised baseline modelled on Logic Pro X's Humanise function, and the Basis Mixer, a machine learning model informed by score cues.
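As an illustration only (this is not Kyle's CFE+P algorithm), the minimal input representation described above — just pitch, ontime and duration per note — and a randomised Humanise-style baseline like the one used for comparison might be sketched as follows; the names, jitter ranges and velocity values here are all hypothetical:

```python
import random
from dataclasses import dataclass

@dataclass
class Note:
    pitch: int       # MIDI pitch number (0-127)
    ontime: float    # score onset, in beats
    duration: float  # score duration, in beats

def humanise(notes, onset_jitter=0.02, velocity_jitter=8):
    """Randomised baseline in the spirit of a 'Humanise' function:
    small random offsets to onset and velocity (values are illustrative)."""
    performed = []
    for n in notes:
        performed.append({
            "pitch": n.pitch,
            "onset": n.ontime + random.uniform(-onset_jitter, onset_jitter),
            "duration": n.duration,
            # Clamp velocity to the valid MIDI range 1-127.
            "velocity": max(1, min(127, 64 + random.randint(-velocity_jitter,
                                                            velocity_jitter))),
        })
    return performed
```

The point of such a baseline is that it adds surface-level variation without any musical context, which is exactly what an expressive-performance model must improve upon.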
Kyle is now working with a Bidirectional Encoder Representations from Transformers (BERT) model to generate variations of musical layers for existing pieces, in a way that does not step on composers' toes and could add variety to game music after a set period of in-game time. He plans to test both the generation and performance algorithms in a gameplay setting to evaluate quality and repetition, and potentially to evaluate them with professional composers to demonstrate their co-creative potential.