Music expression with a robot manipulator used as a bidirectional tangible interface

Abstract

The availability of haptic interfaces in music content processing offers interesting possibilities of performer-instrument interaction for musical expression. These new musical instruments can precisely modulate the haptic feedback and map it to a sonic output, thus offering new possibilities for artistic content creation. With this article, we investigate the use of a robotic arm as a bidirectional tangible interface for musical expression, actively modifying the compliant control strategy to create a link between gestural input and music output. The user can define recursive modulations of music parameters by grasping and gradually refining periodic movements on a gravity-compensated robot manipulator. The robot learns the new desired trajectory on-line, increasing its stiffness as the modulation refinement proceeds. This article reports early results of an artistic performance that was carried out with the collaboration of a musician, who played with the robot as part of his live stage setup.

1 Introduction

Composition and performance of music are evolving radically as technology offers new paths and new means for artistic expression. When the earliest programmable music sequencers and drum machines were introduced in the mid 70's, musicians had, for the first time, the opportunity to operate devices able to play long music sequences on their own, without the need for continuous human interaction. Since then, the presence of controllable semi-autonomous machines in studios and on stage has been stimulating the imagination of many artists. Bands like Kraftwerk have been playing their music exclusively using these devices in conjunction with analog and digital synthesizers, fostering with their production a future where technology and robots could play an even more active role in musical expression [1]. Forty years have passed; Kraftwerk have since featured dancing robots on their stage for the first time, and music content processing by and for robots has become a feasible research topic and a realistic perspective.

Nowadays humanoid robots are able to accomplish complex tasks like playing musical instruments, improvising, and interacting with human and robot musical partners [2]. This kind of robot emulates human behavior and human functioning, thanks to fine mechatronic design and multimodal sensory systems. Other kinds of robots, which we could call "ad hoc mechatronic devices", have completely lost their anthropomorphic appearance, evolving towards shapes and models specifically created to optimize the execution of arbitrary scores on musical instruments. For example, these devices can be multi-armed automatic percussionists or motorized string exciters [3, 4].

Applications proposed so far with humanoid robots and ad hoc mechatronic devices operate directly on the musical instrument, making use of data coming from the remote human operator (on-line and off-line) and from the instrument itself. Typically, physical interaction with a user is not allowed, since the robot behaves as a completely autonomous musician rather than a musical interface.

The consideration of robots as both manipulators and actuated interfaces offers new perspectives in human-robot interaction, human-centered robotics, and music content processing. Such actuated interfaces can take various roles and will require expertise from various fields of research such as robot control, haptics, and interaction design.

This article aims to exploit these new hardware capabilities. Instead of considering separated interfaces to communicate and send commands to the robot, the proposal is to explore the use of the robot as a tangible interface. We adopt the perspective that the most intuitive communication medium for a human-robot interface is to transmit information directly through physical contact.

We take the perspective that, in the context of music playing, the musical instrument or interface should not restrict the artist but instead provide him/her with an intuitive and adaptive medium that can be used in the desired way. By using the motor capabilities of the robot, the interface can take on a new, active role, which moves the original perspective of the passive interface towards a human-robot collaborative tool for musical expression.

The object of this study is to explore the use of a robotic arm as a bidirectional compliant interface to control and create music. The user can define low frequency oscillators by gradually refining periodic movements executed on the robot. Through this process, the user can grasp the robotic arm and locally modify the executed movement, which is learnt on-line, modulating the current musical parameters. After releasing the arm, the robot continues the execution of the movement in consecutive loops. During the interaction, the impedance parameters of our robot controller are modified to produce a haptic feedback which guides the user during the modulation task. We think that this feature may enhance the modalities of artistic content creation, offering an unexplored approach to a very common task in music composition and performance.

We collaborated with an electronic musician to observe the real flexibility and the capabilities of such a system when handled by a user with deep musical skills but no robot interaction experience. To study the system in a practical scenario, we arranged a performance in which the robot became part of a live stage setup, fully connected with professional musical instruments and interfaces. The artist then created a brand new musical composition, specifically conceived to exploit the expressive possibilities of the system, and performed it live.

2 Compliant robot as tangible interface for music expression

Most of the commercially available robots are controlled by stiff actuators that accurately reproduce a predefined movement in a constrained environment, but these robots cannot be used close to people for safety reasons [5]. With the vibrant and promising advances in robot control, inverse dynamics, active compliance and physical human-robot interaction, the robot's articulations progressively become tangible interfaces that can be directly manipulated by the user while the robot is actuated [6–10].

Active compliance control allows the simulation of the physical properties of the robot in a controlled manner. For example, it is possible to send motor commands to compensate for the gravity and friction in the robot's joints in order to provide a backdrivable interface. In this way, the robot can be manipulated by the user without effort, since from the user's perspective the robot appears to be "floating" in space. The robot is controlled based on our previous study on the use of virtual dynamical systems in task space [9]. For example, the robot can move towards a virtual attractor in 3D Cartesian space as if its dynamics were equivalent to a virtual mass concentrated at its end-effector and attached by a virtual spring and damper.
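As a minimal illustration of this behavior, the following Python sketch simulates a virtual unit mass attached to a 3D attractor by a virtual spring and damper; the gains, time step, and initial displacement are illustrative values rather than those used on the robot.

```python
import numpy as np

def attractor_step(x, dx, target, kp, kv, m=1.0, dt=0.002):
    """One integration step of a virtual mass-spring-damper pulling the
    end-effector position x towards a virtual attractor (illustrative only)."""
    f = kp * (target - x) - kv * dx      # spring towards the attractor, damping on velocity
    ddx = f / m                          # virtual unit mass concentrated at the end-effector
    dx = dx + ddx * dt
    x = x + dx * dt
    return x, dx

# Example: end-effector released 10 cm away from a 3D attractor at the origin
x, dx = np.array([0.1, 0.0, 0.0]), np.zeros(3)
target = np.zeros(3)
for _ in range(1000):
    x, dx = attractor_step(x, dx, target, kp=200.0, kv=2 * np.sqrt(200.0))
print(np.round(x, 4))   # converges towards the attractor
```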

We propose to explore these control schemes in the context of music expression. The sophisticated sensing and manipulation skills humans have developed should be taken into account when designing novel interfaces [11, 12]; in particular, tangible user interfaces can fulfill many of the special needs brought by the new live computer music paradigms [13]. In general, haptic information is crucial for playing most musical instruments. For expert musicians, haptic information is even more important than vision. For example, expert pianists or guitarists do not need visual feedback of the hands to control their movements. This occurs because, at the expert stage, tactile and kinesthetic feedback are important to allow a high level of precision for certain musical functions [14]. In learning and music composition, the standard gestural relationship is bidirectional: it includes the transmission of our gestures to the instrument, but also reception, the perception of feedback, which is fundamental to achieving control finesse [15].

We explore in this article how robot interfaces could recreate similar human-instrument dynamics, with varying haptic properties employed by the user as an interface for musical expression. Compared to a standard musical instrument or passive musical interface, the robot introduces three additional features. The first is the capability to continuously change the behavior of the virtual dynamical systems, with stiffness and damping parameters varying during the interaction. This feature has been exploited in a vast number of previous studies and is one of the basic concepts in haptic interaction and haptic music research. The second consists of the capability to spatially redefine the types of movement and gesture required to interact with the virtual instrument. This is done actively, through real-time software control, which makes the robot different from a standard interface that has these capabilities embedded in its hardware structure. Although some interfaces that support software-based compliant control are available, the high dimensionality of the robot control parameterization makes it a unique platform, which could strongly support the study of unconventional and inspiring musical interactions. The last feature is the capability to use the interface for both haptic input and visual output processes. In other words, the instrument can be used to continue or replay the music without using an external interface or visualization tool. This is a powerful feature, which remains largely unexplored in hardware music interfaces.

Furthermore, such actuated interfaces offer new interaction capabilities where the robot becomes part of the choreography. The interface can replay a recorded sequence, which is interesting not only from an auditory perspective but also from a visual perspective by synchronizing the audio output with a movement. For example, the physical presence of the robot can complement the performer's presence on stage by temporarily adopting the role of a virtual music band player.

3 Related work

The use of haptics has often been exploited in music. Simulating the dynamics which characterize non-digital traditional instruments, haptic interfaces are used to make sound from a gestural interaction with an energetic coupling between the instrument and the player [16]. Both the studies of Cadoz et al. [15] and Gillespie [17] investigate the possibility of building a keyboard controller able to reproduce the force feedback of a piano and other key-based instruments. The motors driving the keys' behavior feed back to the user force information typically perceived while playing an instrument, such as inertia, damping, and compliance. Other important works address force feedback drifting away from traditional controllers, introducing brand new devices in terms of shape and functionalities. Some examples are the Plank [18], a one-axis force feedback controller used to explore methods of feeling and directly manipulating sound waves and spectra, and Michel Waisvisz's Web [19], which affects sound texture and timbre by changing the mechanical tension on the various segments that compose its reticular structure. In the study presented in [20], direct force feedback is replaced by vibrations. The system is meant to facilitate the composition and perception of intricate, musically structured spatiotemporal patterns of vibration on the surface of the body. This wide exploration of haptics applied to the music domain has also deeply influenced the way human-instrument interaction is taught, including haptic feedback in the list of the most interesting features which characterize the design of novel interfaces [21].

Haptic capabilities of reactive robots are currently exploited to transfer to and from humans important information linked to the learning of a task. Solis et al. present in [22] the use of a reactive robot system in which a haptic interface is employed to transfer skills from robots to unskilled persons. Different levels of interaction were implemented with Japanese handwriting tasks. While the first kind of interaction was mainly passive, since it used some pre-defined rules, the second type, an active interaction modality, showed the capability of the robot to dynamically adapt its behavior to user actions, respecting their intentions without significantly affecting their performance. Numerous researchers have dealt with the problem of robot learning of motion and force patterns. In particular, the field of robot programming by demonstration, also called learning by imitation or learning from demonstration, explores the transfer of skills from humans to robots with generalization capabilities [23]. Instead of replicating the exact same task, this line of research studies how the robot can extract the important features of the task and reproduce them in new situations that have not been demonstrated. In [10], Lee et al. present a physical human-robot interaction scenario in which human users transfer to robots, by means of demonstrations, several motor tasks, which can be learnt on-line. By physically guiding the robot, the user can initially demonstrate a movement which is then learnt and reproduced. During the execution of such movements, the user can refine/modify the skill by grasping and moving the robot and showing new trajectories that are learnt on-line. The robot controller adapts the behavior of the manipulator to the forces applied by the user. Schaal et al. [24] used dynamic movement primitives [25] to reproduce movements with adaptation to final goal changes arising either before the beginning of the movement or during it. We proposed in [26] the use of Gaussian mixture regression to learn the task constraints not only in the form of a desired trajectory, but as a probabilistic flow tube encapsulating variability and correlation information changing during the task. In [27], we extended the approach to tasks in which both motion and forces are required to perform a collaborative manipulation activity such as lifting an object, and where the robot shows, after learning, the capability to adapt to human motions and to learn both the dynamic and communicative features of the task. We started to explore in [28] the use of robot manipulators as both an input and output device during physical human-robot interaction.

Another category of relevant studies investigated the possibility of creating robots able to perceive and join a collaborative human activity such as playing music with an ensemble. Petersen et al. [2] presented a flutist robot employed in a music-based interaction system, using sensory information to generate a musical output from the robot during an interaction established with human partners. Two levels of interaction are implemented, beginner and advanced, which involve the use of different sensors and schemes for elaborating the relative information to influence the robot behavior. The study presented in [29] describes a system in which a robot theremin player interacts with a human drummer, introducing a novel synchronization method for the human-robot ensemble based on coupled oscillators. Such oscillators are used by the robot to predict the user's playing speed and adapt to it. The experiments showed the method's effectiveness in reducing the differences between the human's and the robot's onset timing and in obtaining better synchronized performances.

Particular interest is drawn to the creation of robots which can take part in live performances, as a means to create music or dance choreographies. For example, specific classes are available at the California Institute of the Arts, during which the history and art of musical robotics are taught [3]. In 2009, the Music Technology program and the Technical Direction program built four new ad hoc mechatronic devices, designed to perform with ten human performers in the Machine Orchestra. The study presented in [30] describes the use of four mobile robotic platforms/interfaces to create multimodal artistic environments for music and dance performance. These robotic interfaces are employed as instruments with the capability to move in a given space and display reactive motions on a stage while producing sound and music according to the context of the performance. These systems exhibited a "human-robot dance collaboration" where the robot moves in accordance with human performers through the perception of audio and visual information and the current performance context.

4 System setup

4.1 The musical interface

In the electronic music domain, low frequency oscillators are periodic functions used to modulate sound synthesis or effect parameters. In ordinary hardware and software music interfaces, they can be selected from a set of predefined common waveforms (e.g., saw tooth, triangle) that represent the trend of the function within its period T. Once triggered, the chosen shape is looped to create cyclic automations on the music parameter, according to the way the image of the periodic function is mapped onto the range of values of the music parameter. Typically, this is done linearly, mapping the minimum and the maximum of the image, respectively, to the minimum and the maximum parameter values.

Some devices include graphic and parametric editors that allow the user to create custom periodic functions. The waveform can be drawn within its period starting from a constant flat line, and then adding breakpoints to arbitrarily change the steepness of the curve. In other editors, the period domain is discretized into small intervals, on each of which a constant value for the function can be defined. At high discretization rates, this technique permits a good approximation of any waveform. Both breakpoint-based and interval-based techniques provide a graphical feedback of the resulting functions that is addressed only to the musician, since it is displayed on the devices she/he is operating on. By contrast, the audience can only perceive the sound that results from the choice of the low frequency oscillators. This lack of information does not play a crucial role in sound synthesis, while it is particularly strong when oscillators are used to modulate an effect parameter. In sound synthesis, indeed, the complex processing the oscillators take part in can make it difficult to understand the function shape and progression, hiding its contribution to the output. On the contrary, during effect modulation the sound-function mapping is often straightforward, making the oscillator's visual feedback--and its progression over time--a strong appeal for the audience's sensorial and emotive involvement. Furthermore, this decoupling of audio and visual feedback produces a gap between the sonic output and the gestures the artist performs to create or affect sounds, since the turning of knobs and the pressing of buttons can hardly be considered a clear metaphor for the drawing of periodic functions. This lack of a comprehensible connection can be easily perceived during both synthesis and effect modulation.
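As a minimal sketch of the two ingredients described above--a periodic waveform defined within its period and its linear mapping onto a parameter range--the following Python fragment evaluates a breakpoint-based waveform and maps it onto a filter cutoff; the breakpoint list and the parameter range are illustrative and not taken from any specific device.

```python
import numpy as np

def breakpoint_lfo(breakpoints, T, t):
    """Evaluate a custom periodic function defined by (time, value) breakpoints
    over one period T, with values in [0, 1]; linear interpolation between points."""
    phase = t % T
    times, values = zip(*breakpoints)
    return float(np.interp(phase, times, values))

def map_to_parameter(y, p_min, p_max):
    """Linear mapping of the normalized function value onto a parameter range."""
    return p_min + y * (p_max - p_min)

# A triangle-like shape over a 2 s period, modulating a hypothetical filter cutoff in Hz
shape = [(0.0, 0.0), (1.0, 1.0), (2.0, 0.0)]
for t in (0.0, 0.5, 1.0, 1.5):
    cutoff = map_to_parameter(breakpoint_lfo(shape, T=2.0, t=t), 200.0, 8000.0)
    print(f"t={t:.1f}s  cutoff={cutoff:.0f} Hz")
```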

Exploiting the dynamic features of our robotic arm, we designed a novel haptic interface to create and refine cyclic waveforms. This system permits the physical drawing of the periodic functions that compose oscillators, by directly grasping and moving the robotic arm around a predefined center, arbitrarily varying the radius to affect the chosen music parameter (Figure 1). This approach guarantees a continuous coupling between the visual and the audio output for both the musician and the audience, and a direct metaphor that clarifies the artist's gestures.

Figure 1. A user grasping the robot: a force feedback is perceived while the user performs a modification to the current robot trajectory. The trajectory directly affects the modulation of a related music parameter.

As previously introduced, in common devices the periodic waveform is shown on a 2D Cartesian coordinate system, where $f_t(x) \in [0,1]$ and $x \in [0,T)$. The interface we designed works, instead, on a 2D polar coordinate system, where $f_t(\vartheta) \in [0,R_{\max}]$ and $\vartheta \in [0,2\pi)$ (Figure 2). Compared to the use of Cartesian coordinates, this solution highlights the periodicity of the functions, which is represented by the continuous movement in space of the robot's hand; the hand can be grasped during each cycle to arbitrarily change its motion.

Figure 2. A waveform (for simplicity a straight line) shown in the Cartesian (top) and the polar (bottom) coordinate system. The red dot displays the current function value, while the dotted line shows the forthcoming trend.
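The change of coordinates can be sketched as follows: elapsed time within the period becomes an angle, and the normalized function value becomes a radius bounded by $R_{\max}$; the numeric values of $T$ and $R_{\max}$ below are illustrative assumptions.

```python
import numpy as np

T, R_MAX = 2.0, 0.3   # period [s] and maximum radius [m]; illustrative values

def phase_angle(t):
    """Map the elapsed time within the period onto an angle theta in [0, 2*pi)."""
    return 2.0 * np.pi * ((t % T) / T)

def radius(t, f):
    """Radius of the end-effector path at time t, for a normalized waveform f: [0,1) -> [0,1]."""
    return R_MAX * f((t % T) / T)

# A flat (constant) waveform corresponds to a circle of constant radius in polar coordinates
flat = lambda x: 0.5
print(phase_angle(0.75), radius(0.75, flat))   # angle advances with time, radius stays at R_MAX/2
```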

The interface is composed of two elements, a generic controller/input device (e.g., a computer keyboard, a MIDI controller) and the robotic arm. Initially, the robot is in gravity compensation mode, and a given central point in the robot workspace acts as a virtual attractor. A set of forces only allows the user to move the arm along a predefined direction, where $\vartheta = 0$, in order to select a suitable radius value. Once the desired value is reached, the user can trigger the robot movement by pushing the controller start button. The robot responds by starting to move around the center in a circular trajectory (initially with constant radius).

From now on, any local modification of the radius is learnt on-line by the robot, which gradually becomes stiffer during the progressive refinement of the user's trajectory. When the user is satisfied with the resulting trajectory and/or with the audio feedback generated by the related modulation, she/he can release the arm, which will continue moving by repeating the learnt loop.

A haptic interaction occurs between the robot and the user whenever the latter decides to apply a modification to the executed trajectory. By touching the robot, the user experiences a force feedback whose intensity depends on the amplitude of the introduced perturbation (i.e., trajectory modification), through the stiffness and damping parameters of the controller. Such a force reflects the effort the user has to produce in order to apply the desired perturbation. The introduced haptic feedback guides the user and his/her gestures during the musical task, connecting the performer's physical effort directly to the intensity and the velocity of the music output modifications. We believe this may increase the player's awareness of the interface and of its fine usage, and consequently pave the way to novel artistic expression.
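As a rough sketch of this relation, the force felt by the user can be modeled as a spring-damper term acting on the imposed perturbation; the stiffness values below correspond to the range reported in Section 5, while the damping values and the 5 cm displacement are illustrative assumptions.

```python
def perceived_force(displacement, velocity, stiffness, damping):
    """Restoring force felt when pushing the arm away from the learnt loop:
    it grows with the amplitude (and rate) of the imposed perturbation."""
    return stiffness * displacement + damping * velocity

# Pushing the arm 5 cm off the loop early (compliant) vs. late (stiff) in the refinement
print(perceived_force(0.05, 0.0, stiffness=100.0, damping=20.0))   # ~5 N
print(perceived_force(0.05, 0.0, stiffness=300.0, damping=35.0))   # ~15 N
```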

4.2 Audio/visual setup

We placed the robot in front of a Powerwall (a 4 × 2 m high-resolution display wall) to provide the user with a visual feedback. While the robot is moving, a stereoscopic trail is projected onto the screen to visually represent (with a 3D depth effect) the trajectory of the robot end effector. This superimposition of real and virtual elements in a Hybrid Reality music environment has been proposed in [31] to enhance gestural music composition with interactive visuals. The system records the trail in real time and displays it as a virtual trajectory in the background when the user decides to start modulating another parameter. When the user pushes the button to create a new modulation, the robot stops cycling and moves again towards the center, under the influence of the virtual attractor. While the trail from the previous loop continues to cycle as a virtual trajectory (still affecting the related sound parameter), the robot's current trail color changes. The user can now set the starting radius for the next parameter modulation, creating a new trajectory that dynamically overlaps and intersects with the previous ones. This procedure can be repeated over time, to layer multiple modulations of different parameters and to visually superimpose the related trajectories, each created using the robot (Figure 3). Each trajectory is associated with a virtual memory slot, where the trail is saved, and with a previously selected set of device parameters, which are modulated according to the radius length. Thus, the user can choose which parameters to modulate by selecting the proper slot on the controller. Virtual trajectories saved into virtual memory slots can be stopped or recalled through the controller.

Figure 3. The projected visual feedback for two trajectories. The blue virtual trajectory (continuous line) is automatically looping, while the violet trajectory (dotted line) is being defined by the robot movement. The two lines have been added in post processing for a better reading of the figure.

The precise alignment of the stereoscopic trails with the position of the robot's hand was made possible thanks to the bidirectional connection between the system dedicated to the robot control and the central workstation, which manages all the hardware and software devices that compose our setup. The main application running on the central workstation is VRMedia [32] XVR, flexible free software primarily meant for virtual environment design; quick to program and extendible with custom modules, XVR uses a UDP connection to receive from the robot the current 3D position of its hand, and works as an interface to convert and forward the control signals coming from the external controller.

One of the custom modules we developed for XVR allows receiving and transmitting OSC and MIDI signals from external hardware and software devices. The radius $r$ of both the robot trajectory and the virtual trajectories is translated into a numeric value according to the functions $g_z(r) = p_{\min}^w + m_z(r)\,(p_{\max}^w - p_{\min}^w)$, used for both OSC and MIDI, with $r \in [0, R_{\max}]$ and $m_z(r) \in [0,1]$. The inner functions $m_z(r)$ apply an arbitrary mapping between domain and image, $z$ is the index of the current trajectory, and $p_{\max}^w$ and $p_{\min}^w$ are, respectively, the maximum and the minimum value of the $w$-th parameter. Each trajectory is associated with up to three parameters, $w_{\max} = 3$, which are constantly updated and sent to predefined connected devices. By exploiting standard digital music communication protocols, the robotic interface can be easily integrated with more common electronic setups, making it possible to control the different hardware and software devices; an example of such a composite setup is shown during the performance described in Section 5.
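A minimal Python sketch of this mapping, assuming a linear inner function $m_z(r)$ and, for the MIDI case, a quantization to a 0-127 controller value (an assumption not stated above):

```python
R_MAX = 0.3   # maximum radius; illustrative value

def g(r, m_z, p_min, p_max):
    """g_z(r) = p_min^w + m_z(r) * (p_max^w - p_min^w), with m_z(r) in [0, 1]."""
    return p_min + m_z(r) * (p_max - p_min)

linear = lambda r: min(max(r / R_MAX, 0.0), 1.0)   # one possible inner mapping m_z

osc_value = g(0.21, linear, 0.0, 1.0)              # e.g. a continuous dry/wet amount sent over OSC
midi_value = int(round(g(0.21, linear, 0, 127)))   # e.g. a MIDI CC value (assumed 0-127 range)
print(osc_value, midi_value)
```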

4.3 Robot setup

The robot employed in this study is a Barrett WAM, a back-drivable arm with 7 revolute DOFs, controlled by inverse dynamics solved with a recursive Newton-Euler algorithm [33]. A gravity compensation force is added to the center of mass of each link. Tracking of a desired path in Cartesian space is ensured by a force command $F = m\ddot{x}$, where $m$ is a virtual mass and $\ddot{x}$ is a desired acceleration command.

Tracking is performed through a weighted sum of virtual mass-spring-damper subsystems, which is equivalent to a proportional-derivative controller with moving target $\hat{\mu}^\chi$:

$$\ddot{x} = K^P \left( \hat{\mu}^\chi - x \right) - K^V \dot{x}, \quad \text{with} \quad \hat{\mu}^\chi = \sum_{i=1}^{K} h_i\, \mu_i^\chi. \tag{1}$$

The virtual attractors $\mu_i^\chi$ are initially distributed along a circle, following a trajectory determined by a fixed center $x_c$, an orientation (direction cosine matrix) $R_c$ and a series of $K$ points parameterized in planar polar representation $\{r_i, \theta_i\}_{i=1}^{K}$.

$\mu_i^\chi$, $K^P$, and $K^V$ are defined as

$$\mu_i^\chi = x_c + R_c \begin{bmatrix} r_i \cos(\theta_i) \\ r_i \sin(\theta_i) \\ 0 \end{bmatrix}; \quad K^P = R_c \begin{bmatrix} \kappa^P & 0 & 0 \\ 0 & \kappa^P & 0 \\ 0 & 0 & \kappa_\perp^P \end{bmatrix}, \quad K^V = R_c \begin{bmatrix} \kappa^V & 0 & 0 \\ 0 & \kappa^V & 0 \\ 0 & 0 & \kappa_\perp^V \end{bmatrix}, \tag{2}$$

where $\kappa^P$ and $\kappa^V$ are adaptive stiffness and damping gains in the plane of the circle, and $\kappa_\perp^P$ and $\kappa_\perp^V$ are constant gains in the direction perpendicular to the circle.

The variable scalar gains $\kappa^P$ and $\kappa^V$ are defined as

$$\kappa^P = \begin{cases} \kappa_{\min}^P & \text{if } t = 0, \\ \kappa_{\min}^P + \left( \kappa_{\max}^P - \kappa_{\min}^P \right) \dfrac{t}{t_{\max}} & \text{if } t \leq t_{\max}, \\ \kappa_{\max}^P & \text{otherwise}, \end{cases} \qquad \kappa^V = 2\sqrt{\kappa^P}. \tag{3}$$
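A compact sketch of Eqs. (1)-(3) in Python, written in the frame of the circle (i.e., taking $R_c$ as the identity) and using the gain values reported at the end of this section; the damping is tied to the stiffness as written in (3).

```python
import numpy as np

def planar_gains(t, k_min=100.0, k_max=300.0, t_max=60.0):
    """Adaptive in-plane stiffness/damping of Eq. (3): the stiffness ramps linearly
    from k_min to k_max over t_max seconds, then saturates."""
    if t <= 0.0:
        kp = k_min
    elif t <= t_max:
        kp = k_min + (k_max - k_min) * t / t_max
    else:
        kp = k_max
    kv = 2.0 * np.sqrt(kp)          # damping tied to the stiffness, as in Eq. (3)
    return kp, kv

def acceleration_command(x, dx, mu_hat, KP, KV):
    """PD control with moving target mu_hat, Eq. (1): x_dd = KP (mu_hat - x) - KV dx."""
    return KP @ (mu_hat - x) - KV @ dx

# In-plane gains halfway through the refinement; perpendicular gains kept constant
kp, kv = planar_gains(t=30.0)
KP = np.diag([kp, kp, 169.0])
KV = np.diag([kv, kv, 26.0])
x, dx, mu_hat = np.zeros(3), np.zeros(3), np.array([0.02, 0.0, 0.0])
print(acceleration_command(x, dx, mu_hat, KP, KV))
```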

The weights $h_i$ in (1) are used to switch between the different subsystems by following a periodic sequence. To ensure smooth and parameterizable transitions, we use a weighting mechanism based on a variant of the variable duration Hidden Markov model representation [34]. The weights are defined at each iteration $n$ as $h_{i,n} = \frac{\alpha_{i,n}}{\sum_{k=1}^{K} \alpha_{k,n}}$, with initialization given by $\alpha_{i,1} = \pi_i$, and recursion given by $\alpha_{i,n} = \sum_{j=1}^{K} \sum_{d=1}^{d_{\max}} \alpha_{j,n-d}\, a_{j,i}\, p_i(d)$. $\pi_i$ is the initial probability of being in state $i$, $a_{i,j}$ is the transition probability from state $i$ to state $j$, and $p_i(d)$ is a parametric state duration probability density function defined by a Gaussian distribution $p_i(d) = \mathcal{N}(d\,\Delta t;\, \mu_i^D, \Sigma_i^D)$. In particular, the state duration is discretized in intervals indicated by the index $d$. The mechanism shares similarities with the forward variable of a Hidden Semi-Markov model [35] in which only state duration information would be used (i.e., spatial information is discarded).
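The recursion can be sketched as follows; the cyclic left-to-right transition matrix and the choice of starting in the first state are assumptions made here for a periodic sequence of attractors, while the duration parameters match those reported below.

```python
import numpy as np

def gauss_pdf(x, mean, var):
    """Univariate Gaussian density N(x; mean, var)."""
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def duration_weights(K=100, n_steps=200, dt=0.02, mu_d=0.06, var_d=0.02, d_max=5):
    """Forward-like recursion of the duration-based weighting mechanism:
    alpha[i,n] = sum_j sum_d alpha[j,n-d] * a[j,i] * p_i(d), then h[i,n] = alpha normalized."""
    a = np.zeros((K, K))                       # cyclic left-to-right transitions (assumed structure)
    for i in range(K):
        a[i, (i + 1) % K] = 1.0
    p = np.array([gauss_pdf(d * dt, mu_d, var_d) for d in range(1, d_max + 1)])   # p_i(d)
    alpha = np.zeros((K, n_steps))
    alpha[0, 0] = 1.0                          # pi: start in the first state (assumption)
    for n in range(1, n_steps):
        for d in range(1, min(d_max, n) + 1):
            alpha[:, n] += (alpha[:, n - d] @ a) * p[d - 1]
    return alpha / alpha.sum(axis=0, keepdims=True)    # h_{i,n}

h = duration_weights()
print(np.argmax(h, axis=0)[:10])   # the dominant attractor index travels along the chain over time
```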

Parameters $m = 1$ [kg], $\kappa_\perp^P = 169$ [N/m], $\kappa_\perp^V = 26$ [Ns/m], $\kappa_{\min}^P = 100$ [N/m], $\kappa_{\max}^P = 300$ [N/m], $t_{\max} = 60$ [s], $\mu_i^D = 0.06$ [s], $\Sigma_i^D = 0.02$ [s$^2$], $d_{\max} = 5$, $K = 100$, and $\Delta t = 0.02$ [s] have been determined empirically based on the robot capabilities and the feedback of the performer.

5 The performance

We collaborated with K [36], a promising musician, to demonstrate the capabilities of the robot arm when used as a compliant tangible music interface. Together with the artist, we created a custom live performance setup, connecting to the interface all the instruments usually played by K during his concerts. After an acclimatization period with the robot and its novel music control paradigms, the artist composed a brand new track especially meant to exploit the arm as an expressive haptic music device, and as an interactive and choreographic element in live performance (Figure 4).

Figure 4. A shot of the performer while controlling a music parameter with the robot. On the right side of the screen, a bar displays the current force feedback and stiffness parameter of the robot.

The live stage setup can be divided into three parts. The first part concerns the robot interface, and includes the robot arm, the central workstation (equipped with an external audio interface), the stereoscopic projection system and a MONOME 40h [37] used as generic input device. The second part consists of K's live performance equipment, which includes an Access [38] Virus TI synth, an iPad and a laptop equipped with an external MIDI interface (Figure 5). Through MIDI connections, K's laptop keeps synchronized with our central workstation, operating as a slave device. Two Ableton [39] Live sets have been created, running respectively as master and slave; they share the same structure, but differ in the output MIDI controls, which have been created according to the connected devices (i.e., the Virus for K's laptop, the robot interface and the MONOME for the central workstation). The third part of the setup is a NaturalPoint [40] OptiTrack multi-camera infrared tracking system, connected to the central workstation, and detecting the 3D position of passive reflective markers. These data can be analyzed in XVR and forwarded via UDP to remotely control the robot's arm and fingers. This feature has been extended with music mappings, as explained later in this section.

Figure 5. The musician's instruments have been connected to the robot interface without introducing structural and functional modifications to his common live performance setup.

The robot is used as a haptic interface to create low frequency oscillators and automations, and as a remotely operated music controller, using MIDI signals to switch from one configuration to the other. In the opening part of the performance, the artist creates a minimalist atmosphere by playing a theme on the synthesizer. As the arrangement gradually evolves, the performer keeps playing the keyboard with the right hand only and moves the left hand in front of the robot. An imitation game is now played, in which the robot synchronously reproduces the movement of the user, whose left hand is tracked by the OptiTrack system through the use of reflective markers, one on the thumb and one on the middle finger. During this mirror-like duet, the human and the robotic arm each control a sound parameter, according to their position in space. The lower they move in space, the louder and the more complex the sound becomes. The distance between the two markers attached to the fingers of the user's hand commands the position of the fingers of the hand that is mounted on the robotic arm. When the performer closes his hand, being imitated by the robot, he triggers full bass lines and drums. After this introduction phase, the mirror metaphor fades out, the robot arm is oriented towards the screen and is used as the tangible music interface described in the previous sections.

Although the artist alternates playing diverse instruments, the rest of the performance is focused on the cyclic refinement of parameter modulations, both on software devices and on K's synthesizer. The involved parameters vary from effect features (e.g., delay dry/wet, hi-cut filter cutoff) to waveforms for sound synthesis (e.g., frequency modulation). The control parameters obtained from the analysis of the trajectories are converted into OSC values when addressed to software devices running on the master Live set; here the LiveAPI/LiveOSC package provides the correct routing of the messages. When the robotic interface controls the external synthesizer, the system sends standard MIDI CC messages. During the interaction, a dynamic bar shows the intensity of the force that the performer is perceiving (with a maximum of 18 [N]) and the stiffness which characterizes the robot's dynamic behavior during the ongoing loop (from 100 [N/m] to 300 [N/m]).

A visual interface has been developed to intuitively use the MONOME to control the robot's behavior. On each column of the button grid, the status of a trajectory slot is summarized; starting as blank, each slot can be activated by pressing the first button of its column. The various combinations of illuminated buttons guide the performer throughout the setting of the initial radius of the trajectory, the recording of the robot's movement, and the managing of virtual trajectories, allowing him to easily recognize which slot is currently active, which slots contain virtual trajectories, and which are still empty.

6 Discussion

6.1 The artist's feedback

Since musical interfaces are designed to be used by musicians, we paid much attention to the reactions and to the comments made by the artist during all the different parts of the interface development and music creation processes.

K actively participated in the empirical determination of the robot control parameters responsible for the haptic feedback produced during trajectory creation (see Section 4). His help allowed us to define a configuration in which the robot produces an intelligible feedback for the user. Obviously, this human-instrument feeling is governed by subjective perceptions and qualitative preferences, and may thus need to be adapted with respect to the artist and to the music style being played. This may result in alternative choices regarding the interface's musical mapping, feedback, and robot control parameters, and this is all part of the artistic creative process.

The artist made positive comments about the integration of the interface within his usual setup. Although the control capabilities of the robot covered almost all the stage devices, he noticed the absence of structural and functional modifications in the basic usage of his instruments. In other words, the connection between the on-stage musical equipment and the robotic interface was perceived as completely transparent, allowing K to keep a traditional approach to his instruments. At the same time, the whole interface embraced K's equipment by adding novel usage paradigms to his setup and expanding his musical horizons. According to his feedback, this resulted in more self-confidence while on stage, and enhanced the expressiveness of the performance and the level of experimentation.

The possibility to perform on stage with a semi-autonomous device strongly fascinated the artist. K tried to show the evolving relationship established with the robot, first demonstrating the skill to the robot and then letting the robot continue the music on its own. According to the comments collected after the live performance, the artist felt that the robot had a strong expressive function that actively influenced his movement and changed the taste of the performance. It was neither a mere interface, nor a completely autonomous band mate, but a developing stage entity which characterized the music and the choreography of the performance.

6.2 Future works

In our setup, the robot behavior can be modulated by the value of three associated robot control parameters, namely inertia, stiffness, and damping. The robot motion controller used in this study and described in Section 4 exploits this concept in a simple way by just keeping the natural Cartesian inertia of the robot, a stiffness monotonically increasing with time and a damping dependent upon the stiffness. We believe that the emulation of such a simple dynamic system applied to a basic music task (i.e., low frequency oscillator shaping) is a good starting point to develop more complex experimentations. The use of compliant robot manipulators as bidirectional tangible musical interfaces is a new and largely unexplored field of research, and the successful design and implementation of a simple but operational platform for live performances encourage us to pursue further research in this direction.

We intend to use more sophisticated motion controllers in future studies to broaden the number of available degrees of freedom that can be used for the shaping of the robot motion and interaction force feedback. Several audio features will in turn be associated with each of these parameters, driving the robot.

In a practical scenario, stiffness, damping, and inertia can be used to weight the relative contributions to the force of, respectively, the amplitude, the first and the second time derivative of the desired modification applied to the music parameters, as reflected by the robot's positional error.

Moreover, a different shaping mechanism can be adopted in accordance with the given music parameter being processed (e.g., two different sets of control parameters for two given audio features), thus resulting in different haptic interactions. In particular, audio effects can be set into configurations that intensely alter the original signal. Precise shaping mechanisms could help in changing these parameters in real time, avoiding uncontrolled or unwanted sound output thanks to the dynamic haptic feedback.
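A minimal sketch of such an impedance-style shaping, with two hypothetical per-feature parameter sets (the feature names and all numeric values are illustrative assumptions):

```python
import numpy as np

def interaction_force(err, err_dot, err_ddot, stiffness, damping, inertia):
    """Impedance-style force: each term weights one contribution of the desired
    modification (its amplitude, and its first and second time derivatives)."""
    return stiffness * err + damping * err_dot + inertia * err_ddot

# Hypothetical per-feature haptic profiles: a delicate filter sweep vs. a heavier FM amount
profiles = {
    "filter_cutoff": dict(stiffness=120.0, damping=15.0, inertia=0.5),
    "fm_amount":     dict(stiffness=280.0, damping=30.0, inertia=2.0),
}
err, err_dot, err_ddot = 0.04, 0.10, 0.0    # the same gesture applied to both parameters
for name, gains in profiles.items():
    print(name, np.round(interaction_force(err, err_dot, err_ddot, **gains), 2))
```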

Apart from the modulation of the gain parameters, the mechanical capabilities and the design of the robot deeply influence the possibilities of the proposed system. Nowadays, active compliance control is supported by an increasing number of commercially available robots (e.g., the Barrett WAM arm, the Mekabot upper-torso humanoid or the Kuka/DLR LWR), each characterized by shapes and mechanical features specifically designed to accomplish diverse tasks, from manipulation to whole body movement in space. These new capabilities could inspire novel paradigms of human-robot interaction applied to music content processing, contributing to the evolution of research on haptic music and, more generally, on new interfaces for musical expression. Consequently, possible extensions of our study include the use of different robots as collaborative tools shared by several artists playing from different locations, with the robots sequentially moving and behaving according to the contribution of the different performers. The use of these robots as platforms to test metaphors for music creation could also give birth to unconventional musical interfaces, half robots and half instruments, directly inspired by robotic experimentation in music research.

7 Conclusions

Throughout this article, we investigated the use of a robotic arm as a bidirectional tangible interface for musical expression. By actively modifying the compliance control, the interface permits the creation of a haptic feedback that strongly connects the gestural input to the music output. We exploited these capabilities to design an interaction paradigm suitable for the creation of low frequency oscillators for recursive modulations of music parameters. The user can grasp the robotic arm to define cyclic trajectories that are learnt and automatically executed by the robot; the trend of each trajectory is locally converted into standard music control signals, and can be routed to all the connected hardware and software devices. The interface also provides the user with a visual feedback, consisting of a stereoscopic representation of the created trajectories.

We collaborated with an electronic musician to design and implement the algorithms concerning robot and music control, and to organize a live performance showcasing the robotic interface capabilities within a live stage setup. The interface was used to control different devices, merging audio and visual contents in a human-robot interaction choreography. The show was documented, and this article is accompanied by the audio/video recordings of the performance. This material has been made available online [41].

References

  1. Bussy P: Kraftwerk: Man, Machine and Music. SAF Publishing, London; 2004.

  2. Petersen K, Solis J, Takanishi A: Musical-based interaction system for the Waseda Flutist Robot. Autonomous Robots 2010, 28(4):471-488. 10.1007/s10514-010-9180-5

  3. Kapur A, Darling M: A Pedagogical Paradigm for Musical Robotics. In Proceedings of the 2010 conference on New Interfaces for Musical Expression. Sydney, Australia; 2010:162-165.

  4. Singer E, Larke K, Bianciardi D: LEMUR GuitarBot: MIDI robotic string instrument. In Proceedings of the 2003 Conference on New Interfaces for Musical Expression. Montreal, QC; 2003:188-191.

  5. Gaskill SP, Went SRG: Safety issues in modern applications of robots. Reliab Eng Syst Safe 1996, 53(3):301-307. 10.1016/S0951-8320(96)00053-1

  6. Laffranchi M, Tsagarakis NG, Caldwell DG: Safe human robot interaction via energy regulation control. In Proc IEEE/RSJ Intl Conf on Intelligent Robots and Systems (IROS). St. Louis, MO; 2009:35-41.

  7. Albu-Schaeffer A, Haddadin S, Ott C, Stemmer A, Wimboeck T, Hirzinger G: The DLR lightweight robot--design and control concepts in human environments. Ind Robot 2007, 34(5):376-385. 10.1108/01439910710774386

  8. Bicchi A, Tonietti G: Fast and soft arm tactics: dealing with the safety-performance tradeoff in robots arm design and control. IEEE Robot Autom Mag 2004, 11(2):22-33. 10.1109/MRA.2004.1310939

  9. Calinon S, D'halluin F, Sauser EL, Caldwell DG, Billard AG: Learning and reproduction of gestures by imitation: an approach based on Hidden Markov Model and Gaussian Mixture Regression. IEEE Robot Autom Mag 2010, 17(2):44-54.

  10. Lee D, Ott C: Incremental Motion Primitive Learning by Physical Coaching Using Impedance Control. In Proc IEEE/RSJ Intl Conf on Intelligent Robots and Systems (IROS). Taipei, Taiwan; 2010:4133-4140.

  11. Ishii H, Ullmer B: Tangible bits: towards seamless interfaces between people, bits and atoms. In Proceedings of CHI 97 Conference on Human Factors in Computing systems. Atlanta, Georgia; 1997:234-241.

  12. Jordà S: Sonigraphical instruments: from FMOL to the reacTable*. In Proceedings of the 2003 Conference on New Interfaces for Musical Expression. Dublin, Ireland; 2003:70-76.

  13. Jordà S, Geiger G, Alonso M, Kaltenbrunner M: The reacTable: exploring the synergy between live music performance and tabletop tangible interfaces. In Proceedings of the first international conference on Tangible and Embedded Interaction. Baton Rouge, LA; 2007:139-146.

  14. Vertegaal R, Ungvary T: Towards a musician's cockpit: transducers, feedback and musical function. In Proceedings of the International Computer Music Conference. Hong Kong, China; 1996:308-311.

  15. Cadoz C, Lisowski L, Florens J: A modular feedback keyboard design. Comput Music J 1990, 14: 47-51. 10.2307/3679711

  16. Castagne N, Cadoz C, Florens J, Luciani A: Haptics in computer music: a paradigm shift. In Proceedings of EuroHaptics. Munich, Germany; 2004:422-425.

  17. Gillespie B: The Touchback Keyboard. In Proceedings of the International Computer Music Conference. San Jose, CA; 1992:447-448.

  18. Verplank B, Gurevich M, Mathews M: THE PLANK: designing a simple haptic controller. In Proceedings of the 2002 Conference on New Interfaces for Musical Expression. Dublin, Ireland; 2002:33-36.

  19. Krefeld V: The Hand in the Web: an interview with Michel Waisvisz. Comput Music J 1990, 2: 28-33.

  20. Gunther E, Davenport G, O'Modhrain S: Cutaneous grooves: composing for the sense of touch. In Proceedings of the 2003 Conference on New Interfaces for Musical Expression. Dublin, Ireland; 2003:01-06.

  21. Verplank B, Sapp C, Mathews M: A course on controllers. In Proceedings of the ACM CHI 2001 Workshop on New Interfaces for Musical Expression. Singapore, Singapore; 2001:01-04.

  22. Solis J, Marcheschi S, Frisoli A, Avizzano CA, Bergamasco M: Reactive robot system using a haptic interface: an active interaction to transfer skills from the robot to unskilled persons. Adv Robot 2007, 21(3):267-291. 10.1163/156855307780131992

  23. Billard A, Calinon S, Dillmann R, Schaal S: Robot programming by demonstration. In Siciliano B, Khatib O (eds), Handbook of Robotics. Springer, Secaucus, NJ, USA; 2008:1371-1394.

  24. Schaal S, Mohajerian P, Ijspeert AJ: Dynamics systems vs. optimal control--a unifying view. Progress Brain Res 2007, 165: 425-445.

  25. Ijspeert AJ, Nakanishi J, Schaal S: Movement imitation with nonlinear dynamical systems in humanoid robots. In Proc IEEE Intl Conf on Robotics and Automation (ICRA). Washington, D.C; 2002:1398-1403.

  26. Calinon S, Guenter F, Billard A: On learning, representing and generalizing a task in a humanoid robot. IEEE Trans Syst Man Cybern Part B 2007, 37(2):286-298.

  27. Calinon S, Evrard P, Gribovskaya E, Billard A, Kheddar A: Learning collaborative manipulation tasks by demonstration using a haptic interface. In Proc Intl Conf on Advanced Robotics (ICAR). Munich, Germany; 2009:01-06.

  28. Pistillo A, Calinon S, Caldwell DG: Bilateral physical interaction with a robot manipulator through a weighted combination of flow fields. In Proc IEEE/RSJ Intl Conf on Intelligent Robots and Systems (IROS). San Francisco, CA; 2011:3047-3052.

  29. Mizumoto T, Otsuka T, Nakadai K, Takahashi T, Komatani K, Ogata T, Okuno H: Human-robot ensemble between robot thereminist and human percussionist using coupled oscillator model. In Proc IEEE/RSJ Intl Conf on Intelligent Robots and Systems (IROS). Taipei, Taiwan; 2010:1957-1963.

  30. Suzuki K, Hashimoto S: Robotic interface for embodied interaction via dance and musical performance. Proc IEEE 2004, 92: 656-671. 10.1109/JPROC.2004.825886

  31. Zappi V, Mazzanti D, Brogni A, Caldwell D: Design and evaluation of a hybrid reality performance. In Proceedings of the 2011 conference on New Interfaces for Musical Expression. Oslo, Norway; 2011:355-360.

  32. VRMedia--Virtual Reality on the Web [http://vrmedia.it/]

  33. Featherstone R, Orin DE: Dynamics. In Siciliano B, Khatib O (eds), Handbook of Robotics. Springer, Secaucus, NJ, USA; 2008:35-65.

  34. Yu SZ: Hidden semi-Markov models. Artif Intell 2010, 174: 215-243. 10.1016/j.artint.2009.11.011

  35. Yu SZ, Kobayashi H: Practical implementation of an efficient forward-backward algorithm for an explicit-duration hidden Markov model. IEEE Trans Signal Process 2006, 54(5):1947-1951.

  36. Soundcloud K [http://soundcloud.com/veryelectromusic]

  37. Monome [http://www.monome.org]

  38. Access Music [http://www.access-music.de]

  39. Ableton [http://www.ableton.com]

  40. NaturalPoint - Optical Tracking Systems [http://www.naturalpoint.com/]

  41. Programming-by-Demonstration: BlueRegen Performance Videos [http://www.programming-by-demonstration.org/BlueRegen/]

Acknowledgements

This study would not have been possible without Valerio Solari, the talented musician who performed using our robotic interface under the pseudonym "K". Our acknowledgments are mainly addressed to him for the effort and the interest shown during the making of the project. Additionally, we would like to thank Valerio Guglielmini and Massimiliano Valente for all the video material they made available to document the performance.

Author information

Corresponding author

Correspondence to Victor Zappi.

Additional information

Competing interests

The authors declare that they have no competing interests.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

About this article

Cite this article

Zappi, V., Pistillo, A., Calinon, S. et al. Music expression with a robot manipulator used as a bidirectional tangible interface. J AUDIO SPEECH MUSIC PROC. 2012, 2 (2012). https://doi.org/10.1186/1687-4722-2012-2
