University of Plymouth
The House
Drake Circus
Plymouth PL4 8AA
United Kingdom
Tel: +44 (0)1752 232579

Seminars (2011/2012)

Seminars presented by visitors and by members of the Computer Music Research team. Each seminar will be followed by an informal discussion open to the audience. Members of the University's academic community, partner colleges and collaborating institutions are welcome. For more information contact Alexis Kirke.

Note: the programme may change, so please consult this web site regularly for updates.

 

TERM 1


06 October 2011

Topic: Advancing Concatenative Sound Synthesis Technology

Speaker: Noris Mohd Norowi

Venue: Babbage, top floor Plym Room

Time: 14:00 – 15:30

Abstract: Concatenative Sound Synthesis is a data-driven sound synthesis method that draws on a large corpus of source sounds; through stages such as feature extraction and segment selection, the best-matching sounds are found and synthesized. Factors that affect the distance of the match, such as the combination, order and weight of the features, are studied and presented. A robust method for automatically assigning consistent weights to all features through the use of the Analytic Hierarchy Process is also described.
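
As an illustration of the segment-selection stage described above, the following Python sketch computes a weighted distance between a target unit's features and those of each corpus segment and returns the closest match. The feature names, weights and corpus are hypothetical examples; in the research described, the weights would come from the Analytic Hierarchy Process rather than being set by hand.

    import math

    # Hypothetical per-feature weights; in the research these would be assigned
    # automatically with the Analytic Hierarchy Process rather than set by hand.
    WEIGHTS = {"centroid": 0.5, "rms": 0.3, "pitch": 0.2}

    def weighted_distance(target, candidate, weights=WEIGHTS):
        """Weighted Euclidean distance between two feature dictionaries."""
        return math.sqrt(sum(w * (target[f] - candidate[f]) ** 2
                             for f, w in weights.items()))

    def select_segment(target, corpus):
        """Return the corpus segment whose features best match the target unit."""
        return min(corpus, key=lambda seg: weighted_distance(target, seg["features"]))

    # Example: a toy corpus of analysed segments and one target unit.
    corpus = [
        {"file": "a.wav", "features": {"centroid": 1800.0, "rms": 0.21, "pitch": 440.0}},
        {"file": "b.wav", "features": {"centroid": 2400.0, "rms": 0.35, "pitch": 660.0}},
    ]
    target = {"centroid": 1900.0, "rms": 0.25, "pitch": 450.0}
    print(select_segment(target, corpus)["file"])  # -> "a.wav"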


20 October 2011

Topic: Pulsed Melodic Processing - Using Music for Natural Affective Computation and Increased Processing Transparency

Speaker: Alexis Kirke

Venue: Babbage, top floor Plym Room

Time: 14:00 – 15:30

Abstract: Pulsed Melodic Processing (PMP) is a computation protocol usable at multiple levels in data processing systems: for example, at the level of spikes in an artificial spiking neural network or a pulse processing system, or at the level of exchanged messages and internal communication between modules in a multi-agent or multi-robot system. The approach utilizes musically-based pulse sets ("melodies") for processing, capable of representing the arousal and valence of affective states. Affective processing and affective input/output are now considered key tools in artificial intelligence and computing. In designing processing elements (e.g. bits, bytes, floats), engineers have primarily focused on processing efficiency and power. Having defined these elements, they then investigate ways of making them perceivable by the user/engineer. However, the extremely active and productive area of Human-Computer Interaction, and the increasing complexity and pervasiveness of computation in our daily lives, support the idea of a complementary approach in which computational efficiency and power are more balanced with understandability to the user/engineer. PMP gives a person the potential to tap into the affective processing path and hear a sample of what is going on in that computation, as well as providing a simpler way to interface with affective input/output systems. This comes at the cost of developing new approaches to processing and interfacing PMP-based modules, this cost being part of the compromise of efficiency/power versus user-transparency and interfacing. In this seminar we introduce and develop PMP, and demonstrate and examine the approach using three example applications: a military robot team simulation with an affective subsystem, a text affective-content estimation system, and a stock market tracking tool.
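
As a rough illustration of the general idea, not the protocol's actual encoding, the sketch below generates a pulse stream whose rate reflects arousal and whose pitch material reflects valence: higher arousal gives faster pulses, and positive or negative valence selects a major-like or minor-like pitch set. The specific mappings are assumptions made for the example.

    import random

    # Illustrative pitch-class sets: major-like for positive valence,
    # minor-like for negative valence (an assumption, not the PMP specification).
    MAJOR = [0, 4, 7]
    MINOR = [0, 3, 7]

    def pulse_melody(arousal, valence, length=8, base_note=60):
        """Generate (midi_note, inter_onset_seconds) pairs for an affective state.

        arousal and valence are assumed to lie in [-1.0, 1.0].
        """
        ioi = 0.5 - 0.35 * arousal          # faster pulses for higher arousal
        scale = MAJOR if valence >= 0 else MINOR
        return [(base_note + random.choice(scale), round(ioi, 3))
                for _ in range(length)]

    # A tense, negative state: fast pulses over a minor-like pitch set.
    print(pulse_melody(arousal=0.9, valence=-0.8))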

 


03 November 2011

Topic: Writing Machine

Speaker: Hanns Holger Rutz

Venue: Roland Levinsky, 206

Time: 14:00 – 15:30

Abstract: The seminar develops ideas attached to my recent sound installation "Writing Machine" and tape piece "Leere Null", and functions as a progress report of my research. Writing as a generic operation of differentiation has been proposed by Jacques Derrida in his Grammatology: Here, the grapheme is the manifestation of the process of writing-as-trace, an infinite chain of signification (and an absence of "presence" or re-presence of an original signified). What I am interested in is to consider both the process of composition of a sound piece and its algorithmic unfolding (either while composing or in a real-time performance) as two interrelated forms of recursive writing-as-decision-making. And also to see this writing as an ever ongoing gesture with boundaries -- the delimitation of what "a" piece is -- that can only be established from an observer's viewpoint. Finally, writing machine as a description of the computer-based composer, of programming the music. Pro-gram, writing-before-the-letter, and the algorithm's execution as writing-after-the-letter.

 


17 November 2011

Topic: The Warren: A BCMI Composition

Speaker: Joel Eaton

Venue: Robbins SR3

Time: 14:00 – 15:30

Abstract: This seminar will present a unique Brain-Computer Music Interface (BCMI) that allows users to control music using information read from the brain, combining medical and creative music technologies. The Warren is a piece written specifically for this BCMI.

 


01 December 2011

Topic: Cellular Automata Sound Synthesis

Speaker: Jaime Serquera

Venue: Roland Levinsky, 209

Time: 14:00 – 15:30

Abstract: In this seminar I will present Histogram Mapping Synthesis (HMS), a sound synthesis technique based on cellular automata that I have developed in my PhD. I will first establish a framework for my research and state my research problem. Then, I will present HMS and its capabilities for sound design. To that end, I will describe a mechanism for controlling an automaton over time, based on DSP monitoring of the automaton's evolution. Such dynamic control is suitable for designing sounds with dynamic spectra and controlled complexity. Finally, I will present a new textural concept based on dynamic and complex sound beats.
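
The sketch below gives a rough flavour of the histogram-mapping idea under simplifying assumptions: a one-dimensional multi-state cellular automaton is evolved, the histogram of its cell values is computed after each step, and the normalised bin counts are read as the partial amplitudes of an additive synthesizer. The rule, the number of states and the mapping are illustrative only and are not taken from HMS itself.

    import math, random

    K_STATES = 16          # number of cell states (illustrative)
    N_CELLS = 256
    SAMPLE_RATE = 44100

    def step(cells, k=K_STATES):
        """One step of a toy totalistic CA: each cell becomes the sum of its
        neighbourhood modulo k (an illustrative rule, not an HMS rule)."""
        n = len(cells)
        return [(cells[i - 1] + cells[i] + cells[(i + 1) % n]) % k for i in range(n)]

    def histogram(cells, k=K_STATES):
        """Normalised histogram of cell states."""
        counts = [0] * k
        for c in cells:
            counts[c] += 1
        total = float(len(cells))
        return [c / total for c in counts]

    def render_frame(hist, f0=110.0, duration=0.05):
        """Additive synthesis: bin i of the histogram sets the amplitude of
        partial i + 1 above the fundamental f0."""
        samples = []
        for n in range(int(SAMPLE_RATE * duration)):
            t = n / SAMPLE_RATE
            samples.append(sum(a * math.sin(2 * math.pi * f0 * (i + 1) * t)
                               for i, a in enumerate(hist)))
        return samples

    cells = [random.randrange(K_STATES) for _ in range(N_CELLS)]
    audio = []
    for _ in range(40):                 # evolve the CA and render about 2 s of audio
        cells = step(cells)
        audio.extend(render_frame(histogram(cells)))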

 


15 December 2011

Topic: The techniques of string instruments

Speaker: Christian Dimpker

Venue: Roland Levinsky, 209

Time: 14:00 – 15:30

Abstract: Strings are versatile instruments, since numerous extended playing techniques can be executed on them. For some of these techniques a conventional method of notation has been established, but most articulations are notated in varying ways. Composers are hence forced to devise their own methods of depiction when they wish to employ augmented techniques. In the seminar a coherent method of notation for these articulations will be presented, using the example of the work »Tatsachen in zwei Sätzen« for solo violin. The piece may be described as the implementation of previous theoretical developments. An insight into this research will be given by demonstrating the notation systems utilised and playing a recording of the work.

 

 

 

TERM 2

 

12 January 2012

Topic: Subatomic Musical Instrument

Speaker: Alexis Kirke

Venue: Roland Levinsky 208

Time: 14:00 – 15:30

Abstract: A piece of physical apparatus called a Cloud Chamber is saturated with ethanol and cooled by liquid nitrogen. The chamber makes visible the tracks of subatomic particles in the cosmic radiation bombarding us from space, and a camera above the chamber will follow some of these tracks, converting them into synthesized music which accompanies a violin. The image from the camera is also magnified onto a screen for the audience to see. An advanced visual-recognition-to-sound interface called the Cloud Catcher will track the particle trails and use them to drive certain control “knobs” on a digital sound synthesizer in real time. To create more of a duet, an amplified violin sound will also be sent to an electrically-generated field system positioned near the visible cosmic rays. Thus the violin will create a variable force field in the chamber, influencing the way the ionized particle tracks behave; and during the performance the violinist’s playing will sometimes be influenced by the sounds that the particle tracks make as well. This is all designed to create a dynamic musical interaction between the violin and the particle tracks.
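
As an indication of the kind of mapping such an interface might perform (not the actual Cloud Catcher implementation), the sketch below converts simple geometric properties of a detected track into normalised control values that could be sent to a synthesizer, for instance as OSC or MIDI control messages. The choice of controls is hypothetical.

    import math

    def track_to_controls(x0, y0, x1, y1, frame_width=640, frame_height=480):
        """Map the endpoints of a detected particle track (in pixels) to
        normalised synthesis controls in the range 0.0 to 1.0.

        The choice of controls (pitch, brightness, pan) is hypothetical.
        """
        length = math.hypot(x1 - x0, y1 - y0)
        angle = math.atan2(y1 - y0, x1 - x0)                  # -pi .. pi
        max_len = math.hypot(frame_width, frame_height)
        return {
            "pitch": min(length / max_len, 1.0),              # longer track: higher pitch
            "brightness": (angle + math.pi) / (2 * math.pi),  # orientation: filter cutoff
            "pan": (x0 + x1) / (2.0 * frame_width),           # horizontal position: stereo pan
        }

    # A track running diagonally across the lower-left of the frame.
    print(track_to_controls(50, 400, 300, 200))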

 


26 January 2012

Topic: CANCELLED

 

 

 

09 February 2012

Topic: CANCELLED

 

 

 

23 February 2012

Topic: Shakuhachi as a noise and technology interface

Speaker: Mike McInerney

Venue: Roland Levinsky 303

Time: 14:00 – 15:30

Abstract: This seminar attempts to locate the Japanese shakuhachi flute as an optimal live performance interface with new technological resources, particularly at the noisier end of the spectrum. Metaphorical and practical connections can be drawn between the instrument's performance tradition, playing methodology and acoustic vocabulary on the one hand, and a number of new sonic resources on the other. Using these connections as an analytical basis, I plan to identify a possible performance practice and compositional language and to illustrate this with examples from my own recent work.

 

 

 

08 March 2012

Topic: Jamming with a slime mould

Speaker: Eduardo Miranda

Venue: Fitzroy Building 210

Time: 14:00 – 15:30

Abstract: Unconventional computing searches for new algorithms and computing architectures inspired by, or physically implemented in, chemical, biological and physical systems. Physarum polycephalum is a cellular slime mould which can be used as a biological computing substrate exhibiting the properties of a non-linear media computer. Physarum polycephalum is a huge single cell with thousands of nuclei, which behaves like a giant amoeba. During its foraging behaviour the plasmodium exhibits a wide range of electrical activities, which accurately reflect its basic physiological states. As a first step towards interacting with the plasmodium, we are developing techniques to sonify its behaviour. This project is being developed in collaboration with Prof. Andy Adamatzky at the Unconventional Computing Centre of the University of the West of England, Bristol.
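
A minimal sketch of one possible sonification strategy, assuming a stream of electrical potential readings in millivolts: each reading is clamped and scaled onto a MIDI note range. The voltage range and the voltage-to-pitch mapping are assumptions made for illustration, not the method used in the project.

    def sonify_readings(readings_mv, low_mv=-50.0, high_mv=50.0,
                        low_note=36, high_note=84):
        """Map each electrical potential reading (millivolts) to a MIDI note.

        Readings are clamped to [low_mv, high_mv] and scaled linearly onto
        the MIDI note range [low_note, high_note].
        """
        notes = []
        for mv in readings_mv:
            mv = max(low_mv, min(high_mv, mv))
            frac = (mv - low_mv) / (high_mv - low_mv)
            notes.append(int(round(low_note + frac * (high_note - low_note))))
        return notes

    # A short, made-up trace of plasmodium activity.
    print(sonify_readings([-12.0, -3.5, 0.0, 8.2, 21.7]))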

 

 

 

22 March 2012

Topic: The techniques of percussion instruments

Speaker: Christian Dimpker

Venue: Babbage Building 214

Time: 14:00 – 15:30

Abstract: In this seminar an introduction to the extended playing techniques of percussion instruments is presented, focussing on augmented modes of attack. These include the extended utilisation of the agents of attack, the problem of determining the point of impact, as well as rubbing and bowing motions and playing the snares of a drum. All the techniques discussed in this session lack a conventional method of notation. Therefore, it is shown how to depict these articulations in a coherent and consistent manner.


Details of the seminars in:

- 2011/2012 academic year.
- 2010/2011 academic year.
- 2009/2010 academic year.
- 2007–2009 academic years.
- 2006/2007 academic year.
- 2005/2006 academic year.
- 2004/2005 academic year.