This work is licensed under a
Creative Commons Attribution
4.0 International License.
Automation of musically creative tasks generally requires inclusion of elements of syntactic and/or semantic information related to the specific task being automated. Such information is rational and meaningful and relates to both the task and its context. When this information is based upon subjective judgements, such as musical similarity, its suitability to the task may be unknown and thus needs validation. This paper outlines the design of a computationally creative musical performance system aimed at producing virtuosic interpretations of musical pieces. The case-based reasoning part of the system relies on a measure of musical similarity using the FANTASTIC and SynPy toolkits, which provide melodic and syncopated rhythmic features, respectively. A listening test based on pair-wise comparison was conducted to assess the extent to which the machine-based similarity models match human perception. The machine-based models were found to differ significantly from human judgements, owing in part to variability across participants' responses. The best performing model relied on features from the FANTASTIC toolkit, achieving a rank match rate with human responses of 63%, while features from the SynPy toolkit only achieved a rank match rate of 46%. These results suggest that features from the FANTASTIC toolkit can be used as a measure of similarity in creative systems: systems that, from a computationally creative perspective, both display creative behavior and are capable of reflection. Reflection is the ability of an agent (or, in the context of this paper, a computational system) to evaluate or reason about its own creative output and, in light of this evaluation, adapt or alter its behavior.
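The rank match rate reported above can be understood as the fraction of item pairs whose similarity ordering under a machine model agrees with the ordering derived from human pair-wise comparisons. The following is a minimal sketch of that idea; the scores, rankings, and function name are hypothetical illustrations, not the paper's actual data or implementation (the paper's features come from the FANTASTIC and SynPy toolkits):

```python
def rank_match_rate(model_scores, human_ranks):
    """Fraction of item pairs whose ordering under the model's similarity
    scores agrees with a human-derived ranking.
    Convention assumed here: higher score = more similar,
    lower rank number = judged more similar."""
    items = list(model_scores)
    matches, total = 0, 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            a, b = items[i], items[j]
            model_order = model_scores[a] - model_scores[b]
            # Rank order is inverted relative to scores (rank 1 = most similar)
            human_order = human_ranks[b] - human_ranks[a]
            if model_order == 0 or human_order == 0:
                continue  # ties are simply skipped in this sketch
            total += 1
            if (model_order > 0) == (human_order > 0):
                matches += 1
    return matches / total if total else 0.0

# Hypothetical model similarity scores for three candidate melodies
scores = {"A": 0.9, "B": 0.4, "C": 0.7}
# Hypothetical human ranking (1 = most similar to the reference melody)
ranks = {"A": 1, "B": 2, "C": 3}
print(rank_match_rate(scores, ranks))
```

Here the model agrees with the human ordering on two of the three pairs, so the sketch reports a match rate of 2/3.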
Goddard, Callum; Barthet, Mathieu; Wiggins, Geraint
Affiliations: Queen Mary University of London, London, UK; Vrije Universiteit Brussel, Brussels, Belgium
JAES Volume 66 Issue 4 pp. 267-276; April 2018
Publication Date: April 29, 2018