Generative Music Video Composition: Using automation to extend creative practice
File version
Accepted Manuscript (AM)
Author(s)
Brown, Andrew R
Abstract
This article outlines the design of a system for dynamic real-time editing of online music video sequences, utilising probabilistic parameters and algorithmic decision-making to determine progression. It explains how this system enables videos to differ every time they are accessed, providing viewers with an enhanced viewing experience and creators with a new tool for video composition. As advances in online technology reshape the habits of media consumers, user uptake of new technologies suggests that a transition away from video's prevalent mode of presentation, as a linear sequence of shots, is entirely possible. Factors such as viewer engagement and usability drive the need for ongoing exploration of the use of video for entertainment, information and advertising. The production of a music video with generative methods is described in this article as a semi-automated process: human tasks are not replaced by computational execution, but shift to become increasingly conceptual.
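To illustrate the kind of probabilistic, algorithmically sequenced playback the abstract describes, the sketch below shows one possible approach: a weighted (Markov-style) transition table that selects the next video clip at random, so each playback produces a different shot order. The clip names, transition weights and sequence length are invented for this example and are not drawn from the system described in the article.

```python
# Minimal sketch of probabilistic shot sequencing (illustrative only).
# Clip names and transition weights are hypothetical, not the author's system.
import random

# Hypothetical transition table: for each current clip, candidate next
# clips and their relative selection weights.
TRANSITIONS = {
    "intro":   {"verse_a": 0.6, "verse_b": 0.4},
    "verse_a": {"chorus": 0.7, "verse_b": 0.3},
    "verse_b": {"chorus": 0.8, "verse_a": 0.2},
    "chorus":  {"verse_a": 0.3, "verse_b": 0.3, "outro": 0.4},
    "outro":   {},  # empty options end the sequence
}

def next_clip(current, rng=random):
    """Pick the next clip by weighted random choice; return None to stop."""
    options = TRANSITIONS.get(current, {})
    if not options:
        return None
    clips, weights = zip(*options.items())
    return rng.choices(clips, weights=weights, k=1)[0]

def generate_sequence(start="intro", max_len=12):
    """Build one playback order; each call can yield a different edit."""
    sequence = [start]
    while len(sequence) < max_len:
        clip = next_clip(sequence[-1])
        if clip is None:
            break
        sequence.append(clip)
    return sequence

if __name__ == "__main__":
    # Each run prints a different generated shot order.
    print(generate_sequence())
```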
Journal Title
Digital Creativity
Rights Statement
© 2014 Taylor & Francis (Routledge). This is an Accepted Manuscript of an article published by Taylor & Francis in Digital Creativity on 15 Jul 2014, available online: http://www.tandfonline.com/doi/full/10.1080/14626268.2014.932289
Subject
Information and computing sciences
Built environment and design
Creative arts and writing
Interactive media