Article

Multimodal Embodiment Research of Oral Music Traditions: Electromyography in Oud Performance and Education Research of Persian Art Music

by
Stella Paschalidou
Department of Music Technology and Acoustics, Hellenic Mediterranean University, 74133 Rethymno, Greece
Multimodal Technol. Interact. 2024, 8(5), 37; https://doi.org/10.3390/mti8050037
Submission received: 6 April 2024 / Revised: 24 April 2024 / Accepted: 26 April 2024 / Published: 7 May 2024
(This article belongs to the Special Issue Multimodal Interaction in Education)

Abstract:
With the recent advent of research focusing on the body’s significance in music, the integration of physiological sensors into empirical methodologies for music has also gained momentum. Given the recognition of covert muscular activity as a strong indicator of musical intentionality and the previously ascertained link between physical effort and various musical aspects, the use of electromyography (EMG)—signals representing muscle activity—has also seen a noticeable surge. While EMG technologies appear to hold good promise for sensing, capturing, and interpreting the dynamic properties of movement in music, which are considered innately linked to artistic expressive power, they also come with certain challenges, misconceptions, and predispositions. The paper critically examines the use of muscle force values from EMG sensors as indicators of physical effort and musical activity, focusing in particular on the intuitively expected link to sound levels. To this end, it draws upon empirical work, namely practical insights from a case study of music performance (Persian instrumental music) in the context of a music class. The findings indicate that muscle force can be explained by a small set of (six) statistically significant acoustic and movement features, the latter captured by a state-of-the-art (full-body inertial) motion capture system. However, no straightforward link to sound levels is evident.

1. Introduction

While the understanding that music performance and perception are inherently embodied has existed for a long time, it gained explicit recognition only relatively recently, with Leman coining the term ‘embodied music cognition’ [1]. Since then, embodied music cognition has constituted a sub-field of (systematic) musicology that emphasizes the importance of the human body in music and studies its role in relation to all music-related activities. Within the music embodiment endeavour, touch has been acknowledged for its fundamental role in our sensory experience of sights and sounds [2], with hearing even being acknowledged as a specialised form of touch [3]. The impact of haptic cues provided by musical instruments has been extensively discussed [4]. However, the high significance of touch even in relatively static, isometric gestures, and its profound connection to music, deserves more attention.
Fundamental in this discussion [5] has been the amount of physical effort exerted in performing on an instrument, in singing, or even in the gestural instructions of music education, which has been regarded as a key factor associated with aspects of expressivity and musical intention [5,6,7]. Performers ‘need to suffer a bit’ in order to bring musical tension into the piece, and audiences need to visually perceive physical effort in order to recognise particularly intense passages played by the musician [8]. In fact, Bennett et al. [9] consider bodily effort as ‘the impetus of musical expression’ in the design of electronic musical instruments. Yet, systematic experimental approaches to this subject, particularly ones employing quantitative methods, have been scarce.
Effort often appears closely associated with the sensation of forces, where the link between gestures and sound extends beyond mere spatial geometry (the melographic representation in 3D Euclidean space [10]). The exploration of force-related concepts in music has previously been approached from an engineering [11], a music theory [12], and an empirical ethnomusicological [5,13] perspective. Yet, equating effort to force proves challenging due to its complex and perceptual nature. For this reason, qualitative methods [14] and indirect quantitative methods [5] have proven essential in better understanding the perceptual and subjective nature of effort and its relationship to musical aspects in both music performance and education. However, being able to directly measure effort through physiological data is an attractive concept that is still under consideration (ibid.).
EMG technologies appear to hold good promise for sensing, capturing, interpreting, and translating the dynamic properties of movement in music, which are considered innately linked to artistic expressive power [15]. The underlying premise is a straightforward relationship between EMG, muscle force, and effort, a relationship that is also deeply intertwined with musical aspects in our cognition. However, long-term studies attest that this assumption should not be taken for granted. Additionally, although electromyography (EMG) data have been widely used in music practice to sense muscle activity as an analogous representation (intensification vs. abatement) of phenomena in the acoustic realm, this is not such a straightforward task. The complexity arises both from the intricate and perceptual nature of effort and from various challenges in using EMG. Therefore, maintaining a critical perspective is imperative.
Beyond highlighting the possibilities of EMG, the current paper also aims to outline the pertinent challenges, both conceptual and technological, associated with this endeavour. Specifically, it raises concerns and challenges associated with the use of electromyography (EMG) data and sensors in the context of effort-related concepts in music research for both music education and performance. Setting the stage for this discussion, the paper draws upon a case study involving the use of EMG in multimodal music research in semi-controlled conditions during fieldwork for both the education and performance of instrumental Persian art music, specifically on the oud, a Middle Eastern short-necked, pear-shaped, fretless 11-string instrument.
The paper is structured as follows. In the first section, the goals of this study are contextualised within a review of the relevant literature on embodied music cognition, effort, and electromyography. The materials and methods are then outlined, before the results are presented, followed by a final discussion of the opportunities of using EMG in music and the challenges that merit consideration within the context of each specific music genre.

2. Background

2.1. Embodied (Music) Cognition

From a phenomenological point of view, embodiment underscores a holistic understanding of human existence and understanding oneself, others, and the world through the physical body [16]. In cognitive science, it highlights the importance of concrete sensorimotor patterns shaping a cognitive system’s engagement with the world [17]. In any case, both perspectives depart from the Cartesian mind–body separation and recognize that cognitive processes extend beyond the brain, emphasising the intertwined nature of perception, behaviour, action and cognition in our experiences [18,19]. Consequently, the emphasis is placed on the significance of an individual’s physical body and its interaction with the environment in cognitive abilities [20]. More recent perspectives, known as ‘4E Cognitive Science’ (with 4E standing for the four key principles: embodied, enactive, extended, and embedded), have redefined cognition as not limited to the brain alone but involving the integrated agency of the brain–body (physical and social) environment [21].
Within the music embodiment endeavour, the importance of both the sonic domain and the human body in our musical experiences has been acknowledged [1], viewing the body as the mediator between the mind (the music experience) and matter (sound energy). Although Schiavio and Menin [22] have argued against the ‘mediator’ idea of the human body, considering it a fallacy that unintentionally maintains the Cartesian body–mind dichotomy, this view has led to a deeper comprehension of the body’s fundamental role in musical understanding, sense making, expression, communication, and creativity. It is now widely accepted that our understanding of the relationships between our body, environmental objects, and sounds is crucial for our perception of any sound [23].
A deeper connection between movement and sound exists that extends beyond mere mechanical links to encompass fundamental cognitive patterns [24]. The ‘4E cognition’ approach in music now sees cognition as distributed across the entire body and its surroundings [25] as an embodied, enacted, extended, and embedded (the 4E) concept. Numerous studies have been conducted in the field of music embodiment in recent years, be it in performance or education (such as [5,13,26,27,28,29,30,31,32,33,34,35,36,37,38], to name only a few), some of which have primarily focused on the technical aspects of capturing multimodal motion-related data [39]. However, the concept of bodily effort in music has not yet been scrutinised adequately.

2.2. Effort

The term ‘effort’ commonly signifies the exertion needed to achieve a goal in everyday language. In dance, choreography, and kinesiology, effort serves as a subjective measure of expression, describing movement qualities with regard to inner intention. It encompasses the forces influencing movement that reflect the active or passive attitude toward physical conditions and exposes the mover’s manners, energy level, impulse, and intentionality [40,41,42], featuring distinct tension/exertion and release/relaxation phases [43].
In music, effort is defined as ‘the element of energy and desire, of attraction and repulsion in the movement of music’ [44]. It reflects the musical tension in a piece, manifested as patterns of intensification and abatement, evoking emotional responses [8,35,45,46], and is crucial for both performers and audiences [47]. Performers must invest effort to highlight tension, and audiences need to perceive this effort to recognize particularly intense musical passages played by the musician [48]. The disturbance of the ‘efforted-input paradigm’ has been identified by [49] as the cause of the shortfall in expressiveness observed in digital musical instruments (DMIs): departing from the constraints imposed by mechanical couplings and without the need for energy exchange between actions and sound, as in the sound-producing actions of the real world, the designed artificial mappings in DMIs [50] create room for interactions that can result in an ‘unnatural’ sense of mismatch. An instance of this occurs when minimal physical effort is employed to control explosive sounds. In an attempt to enhance physicality and render the control of artificial sounds more ‘natural’ or ‘physically plausible’ [51], the concept of effort has been interestingly accommodated in DMIs by [52].
Research by Paschalidou [5] in Hindustani vocal music has revealed connections between effort and various movement features. Notably, these associations are most pronounced in the context of pitch-related information within melodic glides, which tend to invite increased physical power by the performer. In the realm of Xenakian virtuosity, Franco-Greek musicologist Solomos [53] argues that the emphasis lies on the ‘pure consumption of physical energy’—an expression not strictly adhering to exact sciences but one that should be considered as an effort to capture the dynamic aspects of performance—prioritising this factor over the mere count of ‘wrong’ notes. Varga [54] further accentuates the significance of ‘sheer physical pressure and the transcendence of performers’ limits’ as key elements contributing to virtuosity.
A widely accepted definition of effort across different scientific fields, such as physiology, kinesiology, biology, neuroscience, and psychology, has not been established yet. The reason lies in its subjective nature and the complexity of goals (intentionality), which include both physical and cognitive aspects [55,56]. These challenges contribute to the scarcity of analytical works on effort, particularly in the field of music, supporting the prevailing preference for qualitative methodological approaches. In music, this translates to effort being viewed as an active or passive response to physical conditions during intentional musical tasks, whereby the evaluation of effort—given its perceptual and multimodal nature—relies on the subjective experience of the person involved or of an observer. Nevertheless, the notion of offering technological tools and methods to directly quantify, represent, visualize, and interpret effort in music is an intriguing concept—albeit not new [57]—worth exploring.

2.3. Quantifying Effort

The level of challenge in developing computational approaches for concepts described often interchangeably as ‘force’, ‘power of an action’, ‘gestural intensity’, or ‘weight’ (not necessarily implying gravitational force) in movement is reflected in the scant publications of the past, which focused mostly on overt kinematic features—in other words, the visible gestural impact of a person’s (in our case, musical) intention—such as velocity, acceleration (considered proportional to force in mechanics), and kinetic energy [58,59,60,61]. However, the power of an action extends beyond joint displacement; it may encompass ‘covert’ muscular activity, occurring even without observable movement, as seen in isometric muscle contractions (e.g., when pushing against a wall or prolonging the pressing of a piano key—a concept effectively utilised in capturing aftertouch values on MIDI keyboards) [62]. Ratings of perceived exertion (RPE) scales have previously been developed in exercise science [63]. However, similar studies have not been conducted in music performance and perception.

2.4. Physiological Data

Several approaches for quantifying effort rely on physiologically measurable quantities, presumed to be indicative of effort in music. These include muscular tension, breathing rate, biochemical energy expended in the form of heat (calories), sweating, pupillary dilation, and brain activity [64,65,66,67,68,69,70]. Nevertheless, these absolute measures may not account for an individual’s ability to perform a task or accurately capture the subjective perception of effort. As a result, they have not gained widespread acceptance. For example, while the absolute force measurement might be low, factors such as reduced muscle power or fatigue could lead to a calorie expenditure surpassing the resting burn rate, as assessed through metabolic equivalents.
This has led to a notable shift in recent years towards investigating physiological EMG data. While movement serves as the visible expression of intention, muscular activity is the covert process generating the actual movement. This process is initiated by neural activity in the motor cortex, transmitting information about movement to the spinal cord [71]. As a result, EMG data seem to offer a more accurate representation of intention, both in the musical and gestural context, compared to observable kinematic measures [6,72]. Therefore, in the realm of music performance, EMG data stand out as robust candidates for conveying effort-related information.

2.5. EMG

Electromyography (EMG) is a technique used to measure the electrical potentials produced by the nervous system during voluntary or involuntary muscle contraction. EMG sensors comprise electrodes that come in two main types: intramuscular EMG (iEMG), which involves inserting tiny needles through the skin into the muscle, and surface EMG (sEMG), which uses non-invasive electrodes on the skin [73]. iEMG is often more clinically significant, but sEMG, despite being limited to superficial muscles, is non-invasive and thus preferable in music and often also in medical research.
It should be noted, however, that the objective assessment of quantifiable physiological measures of the muscular force exerted during a specific action is a challenging task due to individuals’ differing capacities for achieving the same physically demanding goal. EMG amplitude is therefore most often expressed as a percentage of the maximum muscular force achieved by a single subject in order to alleviate discrepancies among individuals in their capacity for producing force. Hence, in clinical studies, a subject is most often asked to perform a maximum voluntary contraction (MVC) [74] under different conditions that are representative of the actual task that will be requested from the participant.
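As a minimal illustration of this normalization step, the following Python sketch expresses an EMG amplitude envelope as a percentage of the peak amplitude reached in a subject’s MVC trial (function and variable names are my own illustrative assumptions, not part of any specific toolchain):

```python
import numpy as np

def normalize_to_mvc(emg_envelope, mvc_envelope):
    """Express an EMG amplitude envelope as a percentage of the maximum
    voluntary contraction (MVC), making values comparable across subjects.

    emg_envelope: rectified/smoothed EMG from the task (1-D array).
    mvc_envelope: rectified/smoothed EMG from the MVC trial (1-D array).
    """
    mvc_peak = np.max(mvc_envelope)  # single-subject reference maximum
    return 100.0 * np.asarray(emg_envelope, dtype=float) / mvc_peak

# Toy example: task activity peaking at half the MVC amplitude
task = np.array([0.0, 0.2, 0.5, 0.3])
mvc = np.array([0.1, 1.0, 0.8])
percent_mvc = normalize_to_mvc(task, mvc)  # 0, 20, 50, 30 (%MVC)
```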
While movement is the overt manifestation of an intention, muscular activity is the covert process that produces the actual movement, which in turn is instigated by neural activity in the motor cortex conveying information about movement to the spinal cord [71]. While movement necessitates muscular activity, the reverse is not always true; muscular activity may not necessarily result in observable movement. Therefore, EMG sensors can detect potentials with or without the occurrence of body movement. Hence, EMG data appear to better represent intention [72,75]—musical and gestural in our case—than observable kinematic measures [6,72] and form a strong candidate for effort-related information in the context of music performance, as illustrated in the following claim by Tanaka and Donnarumma, prominent representatives of EMG use in artistic practice.
‘Physiological sensing provides direct access to information on the user’s physical effort. Subtle movements or intense static contractions which might not be captured by spatial or inertial sensing are detected since physiological sensors transduce energy (mechanical or electrical) directly from the muscles’.
[76], p. 4
For this reason, a strong turn towards the study of EMG data has been observed over the last few years in the music domain too [77]—the assumption behind this being a simple, unequivocal proportional relationship between EMG, underlying muscular forces, and effort, one of the presumptions that I wish to critically examine in this paper. In music research, EMG technologies have been used extensively, both as sensors to study expression in music-related gestures [57,76,78,79,80,81,82,83,84] and as actuators offering vibrotactile feedback [85,86] to enhance instrument-learning practices [87] and provide additional multimodal feedback for those with auditory impairments [88]. For instance, a pilot study (reported in [89]) explored the relationship between effort-related EMG data and musical tension in Iannis Xenakis’ piano composition ‘Evryali’, aiming to address expressed concerns by [90] regarding virtuosity, performability, physical exertion, and energy consumption, or ‘energetic striving’ [85], in the challenging passages of the work, notorious for the difficulty imposed by the dense and complex graphical notation.
In contemporary artistic practice, EMG technologies have been applied extensively for bringing physicality into the musical expression of new electronic musical instruments and musical interaction, as in [78,89,91,92,93], with interesting examples provided in the BioMuse [57,94] by the Sensorband [95], the XTH Sense by [96], and the augmented piano recital ‘Habitet (Avec) Xenakis’ by [97], in which consumer-grade surface electromyography equipment was used for the work ‘Mists’ by Xenakis, aiming to augment the audience’s musical experience by exposing, magnifying, and visually representing otherwise invisible effort-related processes of the performer’s muscular activity.
It appears that electromyography (EMG) sensors and measures have been extensively used in both research and artistic practice. Nonetheless, it is not automatically implied that they form a suitable candidate for assessing effort, given the intricate nature of effort discussed earlier, encompassing perceptual and multimodal elements. This makes it a challenging task that requires systematic exploration, and the present paper delves into these considerations.

3. Materials and Methods

The critical perspective emerges from the experience acquired through empirical work comprising various setups that utilize sEMG in music research in both controlled laboratory conditions and ecologically valid settings during fieldwork. The subsequent section provides details on the equipment, individual setups, and their corresponding scopes. Since the dataset discussed here was intended for offline analysis, post-capturing processes were necessary. These included post-processing (such as filtering), alignment with other multimodal data streams (multi-camera videos), and segmentation.

3.1. Equipment

The basic configuration comprised at least one of two different sEMG systems, both of which can be seen in Figure 1a.

3.1.1. Device #1

A previously commercially available sEMG device, originally designed for prosthetic applications under the name Myo Armband by Thalmic Labs (later renamed ‘North’ and subsequently acquired by Google and shut down), was used, with one device worn on each of the participant’s forearms. The system comprises 8 sEMG dry sensors (requiring no skin preparation/abrasion), an inertial measurement unit (IMU) consisting of an accelerometer and a gyroscope, and a Bluetooth low-energy transmission module. The sEMG sensors are placed around a stretchable armband, each corresponding to a different muscle group of the forearm: flexor digitorum superficialis, brachioradialis, extensor carpi radialis longus, extensor digitorum, extensor digiti minimi, and extensor carpi ulnaris [98]. All raw data are acquired and transmitted over the Bluetooth protocol, with the 8-channel EMG signal at a 200 Hz sampling rate and the IMU data at a 50 Hz sampling rate. In our setup, all data were transferred over Bluetooth to a computer running a custom-made program based on the ‘Myo for Max’ external by Jules Françoise and Yves Candau [99] in Cycling’74 MaxMsp [100] version 7. As illustrated in Figure 1c, custom-made MaxMsp patches were developed for monitoring, recording, and playing back the following synchronised streams: sEMG data at 200 Hz (a list of 8 streams of 8-bit ADC values); IMU data at 50 Hz (acceleration (x,y,z) and 3-dimensional orientation as quaternions (x,y,z,w)); 1- or 2-channel audio at 44.1 kHz, depending on the setup; and data from additional equipment according to the specific setup, as outlined for each case study.
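Because the EMG (200 Hz) and IMU (50 Hz) streams run at different rates, any joint offline analysis requires resampling them onto a common time base. The following Python sketch illustrates one common approach, linear interpolation of the slower IMU stream onto the EMG timeline; it is a hypothetical illustration under assumed array shapes, not the actual MaxMsp patch logic:

```python
import numpy as np

def align_streams(emg, imu, emg_rate=200.0, imu_rate=50.0):
    """Resample a slower IMU stream onto the faster EMG time base by
    linear interpolation, so both streams share a single timeline.

    emg: (n_emg, channels) array; imu: (n_imu,) or (n_imu, k) array.
    Returns the shared time vector, the EMG array, and the upsampled IMU.
    """
    emg = np.asarray(emg, dtype=float)
    imu = np.asarray(imu, dtype=float)
    if imu.ndim == 1:
        imu = imu[:, None]  # treat a single stream as one column
    t_emg = np.arange(len(emg)) / emg_rate  # EMG timestamps in seconds
    t_imu = np.arange(len(imu)) / imu_rate  # IMU timestamps in seconds
    imu_up = np.column_stack(
        [np.interp(t_emg, t_imu, imu[:, c]) for c in range(imu.shape[1])]
    )
    return t_emg, emg, imu_up
```

Beyond the last IMU sample, `np.interp` holds the final value, which is a reasonable default for short tail segments.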

3.1.2. Device #2

A biosignal capturing kit by BITalino, the (R)evolution Plugged Kit BLE/BT [101], comprising a microcontroller (MCU) block, a Bluetooth dual-mode (BLE/BT) block, and one electromyography muscle sensor per hand, was used. Each sensor receives input from 3 dry electrodes, extracting bipolar differential measurements from two of them and using the third (placed at a rather fixed position on the forearm, usually the elbow) as a reference voltage. Raw EMG data for a single channel are acquired at 1000 Hz and transmitted over Bluetooth to the proprietary OpenSignals (r)evolution software by Plux.

3.2. Empirical Work in Music Performance and Education Research: Persian Art Music (Radif)—Analysis of Oud-Plucking Gestures in Semi-Controlled Research Settings

Alongside empirical (unpublished) work on various instrumental performing gestures carried out in our lab (Image, Movement, and Sound Technologies) under different conditions, instruments, and research scopes, we have utilised the two EMG technologies, aiming to establish comprehensive recording scenarios, to assess the ease of use and reliability of the EMG data streams of the two employed systems, and to conduct systematic research on effort-related electromyography. The basic configuration comprised at least one of the two abovementioned sEMG systems in combination with various types of full-body motion capture systems for the following case studies. Apart from EMG, the recordings included the acquisition of a number of multimodal data streams for capturing full-body 3-dimensional motion data (via full-body marker-based optical or inertial systems, specifically Optitrack [102], Xsens (now Movella) MVN Link [103], Motionshadow [104], or Perception Neuron 32 [105]), audio, and multi-camera video. These experimental setups are used as a backdrop to discuss practical issues arising from the simultaneous use of EMG and motion capture (mocap) systems in the Discussion section.
The primary emphasis of the paper will be on a case study conducted on the performance and teaching of the Iranian oud by Yasamin Shahhosseini, within the context of Persian art music (radif). The radif is a repertoire or collection of melodies in Persian art music, each associated with specific dastgahs or modes, which serve as frameworks for improvisation and composition, defining both the melodic and modal characteristics. As Persian art music is an oral music tradition, i.e., it relies mostly on direct demonstration (by the teacher) and imitation (by the student) rather than on written music notation, embodiment holds a particularly pivotal significance. Although there has been a recent shift towards partially depending on written musical notation [106], knowledge is still predominantly conveyed through direct demonstration and imitation [107], involving not only auditory aspects but also physical movements. Frequently, novice music students encounter difficulties in mastering precise control over sound-producing gestures, requiring extensive training to attain proficiency and virtuosity, particularly in controlling sound volume. A fundamental inquiry revolves around whether generating louder sounds necessitates stronger muscle contractions. Another aspect to explore is the potential correlation between muscle tension (a quantifiable feature based on EMG measurements) and exerted effort (a perceptual feature), alongside investigating whether muscle tension relates to other multimodal (acoustic and movement) characteristics.
To investigate these queries, a participatory study was undertaken, whereby multimodal data were captured in the context of a private music class in a domestic space. Specifically, force-related and movement data were recorded solely for the right hand (string-plucking gestures) of both the instructor (Shahhosseini) and a novice oud player (the author) during an Iranian oud teaching session. The study involved capturing (a) EMG and IMU data by a Myo Armband worn on the right hand, with EMG data running at 200 Hz and IMU data at 50 Hz, (b) full-body motion data by the Xsens Link IMU system at 240 Hz (the T-pose calibration can be seen in Figure 1b), (c) video by three GoPro Hero 7 Black cameras at 60 fps, and (d) stereo audio by a Sound Devices MixPre 10 recorder and two LineAudio CM4 cardioid small-diaphragm condenser microphones at 44.1 kHz/16 bit. EMG data were acquired by the 8 dry sensors for the corresponding forearm muscle groups described earlier (flexor digitorum superficialis, brachioradialis, extensor carpi radialis longus, extensor digitorum, extensor digiti minimi, and extensor carpi ulnaris [98]) without skin preparation/abrasion and without skin impedance measurements.
Estimates of muscle force values were extracted from the raw EMG data using the ‘Myo for Max’ external in MaxMsp [108], which makes use of Bayesian filtering techniques similar to those used in the control of prosthetics [109]. These values were first scaled to a (maximum) reference. Although EMG data are commonly calibrated relying on the MVC, this is not necessarily the most representative value for the maximum contraction achieved during a real music performance. For this reason, force values were scaled based on Shahhosseini’s strongest stroke, which she was requested to perform prior to the recordings. The Xsens system was calibrated based on specified poses that the musician was instructed to follow. The entire setup can be seen in Figure 2, a still from the synchronised videos and motion capture data. Prior to the recording, Shahhosseini provided her written informed consent and maintained the right to discontinue her participation at any time. No compensation was offered for her involvement.
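The scaling step can be illustrated with the following Python sketch, which normalizes force estimates against a single strongest-stroke reference; clipping at the reference maximum and the function name are my own assumptions, not the actual ‘Myo for Max’ processing chain:

```python
import numpy as np

def scale_to_reference(force, reference_force):
    """Scale force estimates relative to the performer's strongest
    recorded stroke (used here in place of a clinical MVC). Values
    exceeding the reference maximum saturate at 1.0."""
    ref_max = np.max(reference_force)
    return np.clip(np.asarray(force, dtype=float) / ref_max, 0.0, 1.0)

# Toy example: a value above the 2.0 reference maximum saturates at 1.0
scaled = scale_to_reference([0.5, 2.0, 2.5], [1.0, 2.0])
```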
The study was divided into three sessions: Initially, the instructor (Shahhosseini) was tasked with demonstrating (session 1) and then guiding (session 2) the student (the author) in executing different types of isolated (either single or double, i.e., down–up strokes) or repetitive (tremolo) plucking gestures with different sound levels (effect) or muscle contraction levels (hypothesised cause) on a single string. In session 3, Shahhosseini was asked to perform on the oud a radif comprising a number of dastgahs (melodic systems) and gushes (multiple correlated parts of a specific dastgah).
Session 1 comprised various scenarios and conditions as outlined below:
Session 1.1 (a and b): isolated strokes, with progressively ascending sound levels (repeated twice because, in the first repetition, the maximum possible sound level was not reached within the planned number of strokes); 24 strokes;
Session 1.2: isolated double strokes, with progressively ascending sound levels; 15 strokes;
Session 1.3: continuously repeated strokes (tremolo), with progressively ascending sound levels;
Session 1.4: isolated single strokes, with progressively ascending sound levels but with stiff hands; 10 strokes;
Session 1.5: continuously repeated strokes (tremolo), with progressively ascending sound levels but with stiff hands;
Session 1.6: isolated strokes, with equal sound levels but progressively ascending stiffness, from looser to stiffer; 5 strokes;
Session 1.7: double strokes, with equal sound levels but progressively ascending stiffness, from looser to stiffer, 5 strokes;
Session 1.8: continuously repeated strokes (tremolo), with constant sound levels and same speed (probably part of session 1.9 but started with same speed);
Session 1.9: continuously repeated strokes (tremolo), with constant sound levels but varying repetition speed, from slower to faster;
Session 1.10: continuously repeated strokes (tremolo), with constant sound levels and speed.

4. Analysis

The analysis comprises two sections, with two different objectives:
(4.1.) The first involves conducting a linear regression analysis for inferring muscle force levels—deduced from the raw EMG-captured data—from various movement and acoustic features in order to (a) examine the null hypothesis that muscle force is unrelated to sound levels (objective measure), calculated using the root mean square energy (RMS), and (b) devise formalised descriptions and examine whether muscle force values can be explained through a small set of statistically significant acoustic and movement features extracted from the raw multimodal captured data.
(4.2.) The second involves a Pearson’s product-moment correlation analysis to examine the strength and direction of the (possibly linear) association between muscle force values (a measured feature) and effort values (a perceptual feature), the latter annotated by Shahhosseini herself while watching and listening back to her own video recording.
Case (4.1.) refers to short excerpts with a duration of a few seconds, and therefore, it represents an exploration of force–gesture–sound associations at the meso level (duration between 0.5 and 5 s), while case (4.2.) refers to long phrases and larger, musically meaningful sections of the performance (annotations and exported values from the data), representing in practice an exploration of these associations at the macro level [24].
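The two analyses can be sketched in Python as follows, using ordinary least squares for (4.1.) and the Pearson product-moment coefficient for (4.2.); the variable names and synthetic toy data are illustrative assumptions, not the study’s actual features:

```python
import numpy as np

def fit_linear_model(features, force):
    """Ordinary least squares: explain force values from acoustic/movement
    features (design matrix with an intercept column). Returns the
    coefficient vector and R^2."""
    X = np.column_stack([np.ones(len(features)), features])
    beta, *_ = np.linalg.lstsq(X, force, rcond=None)
    pred = X @ beta
    ss_res = np.sum((force - pred) ** 2)
    ss_tot = np.sum((force - np.mean(force)) ** 2)
    return beta, 1.0 - ss_res / ss_tot

def pearson_r(x, y):
    """Pearson product-moment correlation, e.g. between measured muscle
    force and annotated perceptual effort."""
    return float(np.corrcoef(x, y)[0, 1])

# Toy data: force driven by two features plus a little noise
rng = np.random.default_rng(0)
feats = rng.normal(size=(100, 2))
force = 2.0 + 1.5 * feats[:, 0] - 0.5 * feats[:, 1] + 0.01 * rng.normal(size=100)
beta, r2 = fit_linear_model(feats, force)  # beta ~ [2.0, 1.5, -0.5], r2 ~ 1
```

In practice, a statistics package reporting per-coefficient p-values (to identify the statistically significant features) would replace the bare least-squares fit.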

4.1. Linear Regression for the Relationship between Muscle Force and (a) Sound Levels (Examination of Null Hypothesis), (b) Various Acoustic and Movement Features (Development of Formalised Descriptions): Examination at the Meso Level

Numerous issues may arise when attempting to apply linear (regression) models (LMs) to time-varying data. These challenges include data distributions deviating from normality, limited datasets hindering the separation of training and testing data, issues pertaining to temporal dependencies in the data [110], and the inability of linear models to capture non-linear elements, as highlighted by the authors of [111]. Indeed, the histogram of the maximum force values for Shahhosseini in Figure 3 illustrates deviations from normality. Nevertheless, linear regression offers a distinct advantage by effectively portraying complex real-world phenomena in a concise and easily understandable manner, facilitating verbal descriptions of relationships in simple terms.
(a)
LM response variables—force-related feature extraction
The linear models were fit to various types of force-related information, all representing the mean values of all eight Myo Armband sensors, which include raw and smoothed (filtered for noise) EMG and force values: raw EMG data; smoothed raw EMG data; force data; and smoothed force data. The smoothed values were calculated within MaxMsp by a simple linear y(n) = x(n) + x(n − 1) low-pass filter.
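The two-point smoother described above can be sketched as follows (an illustrative Python version; the study applied the filter within MaxMsp, and a unity-gain variant would divide the sum by two):

```python
import numpy as np

def two_point_lowpass(x):
    """Simple FIR low-pass y(n) = x(n) + x(n - 1), as described above.

    Note that this filter has a DC gain of 2; dividing by 2 would turn
    it into a two-point moving average with unity gain.
    """
    x = np.asarray(x, dtype=float)
    y = np.empty_like(x)
    y[0] = x[0]            # assume x(-1) = 0 for the first sample
    y[1:] = x[1:] + x[:-1]
    return y
```

Summing adjacent samples in this way attenuates high-frequency noise at the cost of a half-sample delay.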
(b)
LM explanatory variables—acoustic and movement feature extraction
All data first underwent segmentation based on manually annotated timestamps. In the case of isolated strokes, each stroke was segmented from 10 ms before the peak acceleration to 10 ms before the onset of the subsequent stroke. For continuously repeated strokes (tremolo), the entire recording was analysed. The raw motion capture (mocap) data from the Xsens system were exported into c3d format and underwent the following pre-processing steps:
1. We filled some limited gaps observed in the data that resulted from occasional interruptions in the Wi-Fi connection between the computer and the Xsens Link system by employing linear interpolation between adjacent time points.
2. We employed smoothing techniques to reduce high-frequency noise in the raw mocap data. This was achieved using a low-pass Savitzky–Golay finite impulse response (FIR) filter with a window of 90 ms.
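The two pre-processing steps above can be sketched as follows (an illustrative Python version using SciPy; the polynomial order and the handling of the sampling rate are assumptions, since the study performed these steps in its own toolchain):

```python
import numpy as np
from scipy.signal import savgol_filter

def preprocess_mocap(pos, fs):
    """Fill gaps (NaNs) by linear interpolation between adjacent time
    points, then smooth with a low-pass Savitzky-Golay FIR filter whose
    window approximates 90 ms at the mocap sampling rate fs (Hz)."""
    pos = np.asarray(pos, dtype=float)
    idx = np.arange(len(pos))
    gaps = np.isnan(pos)
    if gaps.any():
        # step 1: linear interpolation across dropped frames
        pos[gaps] = np.interp(idx[gaps], idx[~gaps], pos[~gaps])
    # step 2: odd window length closest to 90 ms, at least 5 samples
    win = max(5, int(round(0.090 * fs)) | 1)
    return savgol_filter(pos, window_length=win, polyorder=3)
```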
A large number of time-varying kinematic and acoustic features were extracted from the mocap and audio recordings, respectively, using Matlab 2015 with Mocap Toolbox v. 1.5/MIR Toolbox v. 1.6.1. For each of these features, a number of variations were computed. These variations encompassed both raw and filtered versions, alongside the computation of features specifically for the onset section of the gesture. Filtered kinematic features were processed using a Savitzky–Golay FIR filter with 90 msec and 130 msec windows for velocity and acceleration, respectively, while filtered acoustic features were passed through a Savitzky–Golay FIR filter with a 0.10 msec window. The onset versions entailed computing features for 15% of the event duration for events with a duration > 1.7 s and 45% for events with a duration ≤ 1.7 s. This approach was deliberately chosen to capture the actual stroke section rather than the decay and release of the recorded note. Following this, representative global statistical measures (such as mean, standard deviation (std), minimum (min), and maximum (max)) were computed from the time-varying movement and acoustic features and were normalised prior to being utilised in the models. Pitch was not included in the extracted acoustic features, as Shahhosseini was instructed to play on a single string only.
Due to the countless potential combinations of explanatory features and the quest to identify an optimal feature set for inferring force values, a diverse array of potentially pertinent features was initially plotted for tremolo gestures that Shahhosseini was instructed to perform with ascending sound levels (sessions 1.3 and 1.5, the second with constantly contracted muscles), constant sound levels and ascending speed (session 1.9), or constant sound levels and constant speed (sessions 1.8 and 1.10). This visual exploration aimed to discern features relevant to varying vs. constant force values. For example, Figure A1 in Appendix A depicts diverse time-varying features, with force and other variables exhibiting a significant increment with the increasing sound levels that Shahhosseini was instructed to produce through repeated tremolo strokes (session 1.3), whereas Figure A2 in Appendix A showcases the same features under the conditions of constant sound levels and ascending plucking speeds from slower to faster (session 1.9), illustrating constant force values and other variables. The illustrations include the sound decay part too. Starting from this initial, visually identified set of features, the optimal models (determined using R version 3.2.3/RStudio version 0.99.491) were identified through a trial-and-error process aimed at achieving a balanced trade-off between model accuracy, simplicity (measured by the inclusion of a small number of independent variables), and ease of interpretation and feature extraction. For this, some features that did not appear to be statistically significant in the force inferences were removed, and alternative features, deemed more pertinent to the study's specific context, were added, with the aim of enhancing the explained variance in the estimated responses.
The selection of the best models was based on the goodness of fit, which was assessed using the adjusted R-squared (R2adj) coefficient of determination for linear models in inferring the force. The strength and direction of the correlation between each individual feature and the response variable were evaluated based on the absolute value and sign of each individual coefficient. The probability (p) value served as an indication of the likelihood of observing an association between the feature and response purely by chance in the absence of any real association. A significance level (α) of 0.05 was used for comparing the p-value to assess the null hypothesis, as recommended by [112]. Since the primary focus of this paper was on inference rather than prediction, and the dataset—derived from uninstructed real performances rather than controlled experiments—was limited in size, the entire dataset was utilised for model fitting without partitioning separate training data for prediction purposes.
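The model-assessment criteria above (coefficient signs, p-values compared against α = 0.05, and the adjusted R² goodness of fit) can be illustrated with a minimal ordinary-least-squares sketch. This is illustrative Python; the study itself used R/RStudio, so the function and variable names here are assumptions:

```python
import numpy as np
from scipy import stats

def ols_fit(X, y):
    """Fit y = b0 + X b by least squares and return the coefficients,
    their two-sided p-values, and the adjusted R-squared.
    X is an (n, k) matrix of explanatory features without an intercept."""
    X = np.column_stack([np.ones(len(X)), np.asarray(X, dtype=float)])
    y = np.asarray(y, dtype=float)
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    rss = resid @ resid                    # residual sum of squares
    tss = ((y - y.mean()) ** 2).sum()      # total sum of squares
    r2_adj = 1 - (rss / (n - p)) / (tss / (n - 1))
    se = np.sqrt(rss / (n - p) * np.diag(np.linalg.inv(X.T @ X)))
    pvals = 2 * stats.t.sf(np.abs(beta / se), df=n - p)
    return beta, pvals, r2_adj
```

Features whose p-value exceeds the significance level would be candidates for removal in the trial-and-error loop described above.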

4.2. Pearson’s Product Moment Correlation Analysis for Force–Effort Associations: Examination at the Macro Level

A second-person perspective [1] was taken for session 3 by asking the performer to watch and listen back to the video recording of her own radif performance and letting her annotate the levels of effort she perceived herself to be committing for larger phrases and sections of the performance (macro level). Since there is no established scale for rating perceived exertion in music research (which relies on a combination of movement and sound for its evaluation), no specific RPE scale from the literature was utilised. Rather, the performer was simply instructed to annotate her own effort levels on a scale from 0 to 10 (10 being the highest). For this, she inspected her own performance and annotated larger sections that were musically coherent, representing the macro level of music analysis discussed earlier. A list of manual timestamps and effort level annotations was produced, on which the analysis relied. For each of these manually annotated sections, the data were first segmented, and then the global descriptive statistics of various force value alternatives (raw/smoothed, mean/max) were extracted (using Matlab 2015 with Mocap Toolbox v. 1.5/MIR Toolbox v. 1.6.1) and analysed (in R version 3.2.3/RStudio version 0.99.491) for their correlation with the manual effort annotations. As the performer both watched and listened back to her performance, it should be noted that effort levels were presumably assessed as a compound concept, consisting of both bodily and acoustic aspects. A Pearson's product moment correlation analysis for force–effort associations was conducted in R/RStudio.

5. Results

5.1. Results for Linear Regression for the Examination of the Relationship between Muscle Force and (a) Sound Levels (Null Hypothesis), and (b) Various Acoustic and Movement Features (Development of Formalised Descriptions): Examination at the Meso Level

The original set of features, encompassing various iterations of raw, filtered, and onset variations alongside their descriptive statistics, yielded a linear regression model with a high goodness of fit, comprising thirteen statistically significant features. Nonetheless, a subset of the explanatory features displayed significant collinearity, prompting the removal from the model of those with pairwise correlation coefficients exceeding 0.8. This resulted in the linear regression model of Table 1:
The results suggest that a relatively large part of the variance (more than half) in the maximum values of force can be explained by a small set of (six) statistically significant acoustic and movement features extracted from the raw data, leading to the rejection of the null hypothesis that force is unrelated to the musical context and to movement qualities. Apart from the minimum acceleration value, all other explanatory features refer to the initial onset part of the gesture. Specifically, higher maximum smoothed force values are associated with strokes producing a larger deviation of spectral brightness at the onset [f.1], starting from lower minimum values [f.2] and reaching higher maximum values [f.3], which are achieved by plucking gestures travelling over smaller vertical distances from the initial rest position during the onset [f.4], with larger mean values for vertical velocity during the onset [f.5] and larger minimum smoothed acceleration values over the entire duration of the gesture [f.6]. The relationships between pairs of explanatory features can be seen in Figure 4.
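The collinearity pruning described above (dropping explanatory features whose absolute pairwise correlation exceeds 0.8) can be sketched as follows. This is an illustrative greedy Python version; the study performed the step in R, and the order-dependent strategy here is an assumption:

```python
import numpy as np

def prune_collinear(X, names, thresh=0.8):
    """Greedily keep features in order, dropping any whose absolute
    pairwise correlation with an already-kept feature exceeds thresh
    (0.8 in the study). X is an (n, k) feature matrix."""
    corr = np.abs(np.corrcoef(np.asarray(X, dtype=float), rowvar=False))
    keep = []
    for j in range(corr.shape[0]):
        if all(corr[j, k] <= thresh for k in keep):
            keep.append(j)
    return [names[j] for j in keep]
```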
However, contrary to our initial hypothesis and expectations, as well as the behaviour illustrated in Figure A1 for the continuously repeated plucking gestures (tremolo) of session 1.3, sound levels (measured by the RMS of acoustic energy) were not significantly related to force for isolated strokes and hence did not manifest as anticipated. One possible explanation is that isolated actions (as opposed to those continuously interconnected through tremolo strokes) may not facilitate variations in sound-producing muscle force levels corresponding to changes in sound levels. This could be attributed to the hand starting from a resting position touching the string and only moving downwards with the support of gravity, without the preparatory action of retracting to a higher vertical position, an action which would be required when performing a subsequent plucking gesture and would, as a consequence, produce variations in sound levels. Therefore, although the captured audio exhibits progressively increasing values in sessions 1a and 1b, the muscle force values may not accurately reflect what would occur in real-performance conditions involving co-articulated gestures and musical phrases [113].
Finally, it should be noted that the diagnostic plots of regression models need to be produced and inspected for the developed LM. Figure 5 illustrates the following: (a) residuals vs. fitted samples indicate pronounced patterns in the residuals, revealing non-linear elements in the relationship between the maximum force and explanatory features; (b) quantile-to-quantile plots suggest a deviation of residuals from a standard normal distribution (which is one of the assumptions of least-squares linear regression); (c) the scale–location plot illustrates identifiable homoscedasticity; and (d) the residuals vs. leverage plot does not indicate any extensive outliers. The deviations and partial violations observed in the diagnostic plots of both the data and the residuals of the linear regression model suggest that the model may not be fully capturing the underlying patterns in the data, and therefore, caution should be exercised when interpreting the findings in terms of their reliability and generalisability.
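The quantities behind these four diagnostic plots can be computed directly, as in the following Python sketch (purely illustrative; R produces the plots automatically via plot() on a fitted lm object):

```python
import numpy as np

def lm_diagnostics(X, y):
    """Return the ingredients of the standard diagnostic plots:
    fitted values and residuals (residuals vs. fitted, Q-Q plot),
    sqrt(|standardised residuals|) (scale-location plot), and
    leverages from the hat matrix (residuals vs. leverage plot).
    X must already include an intercept column."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    fitted = X @ beta
    resid = y - fitted
    lev = np.diag(X @ np.linalg.inv(X.T @ X) @ X.T)  # hat-matrix diagonal
    n, p = X.shape
    s2 = (resid @ resid) / (n - p)
    std_resid = resid / np.sqrt(s2 * (1.0 - lev))
    return fitted, resid, np.sqrt(np.abs(std_resid)), lev
```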
The boxplots in Figure 6 for all individual features, encompassing force and explanatory variables, offer a comprehensive view of the deviation in data distribution from normality for each feature.

5.2. Pearson’s Product Moment Correlation Analysis for Force–Effort Associations: Examination at the Macro Level

The results of the Pearson correlation analysis computed to assess the relationship between the annotated effort and measured force (mean force smoothed values extracted from the EMG data) indicate a weak, positive correlation between the two variables, with r = 0.1760773, N = 42. However, the relationship does not appear to be significant (p = 0.2529). The scatterplot in Figure 7 summarises the results, which need to be interpreted with caution due to the high probability value.
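A minimal sketch of this macro-level analysis (segment the force signal by the annotated timestamps, summarise each section, then correlate the summaries with the 0–10 effort ratings) might look as follows in Python. The study used Matlab and R for these steps, so the function names and the (start, end) section format are assumptions:

```python
import numpy as np
from scipy import stats

def force_effort_correlation(force, t, sections, effort):
    """Summarise the force signal by its mean within each annotated
    (start, end) section, then compute Pearson's r and its p-value
    against the corresponding effort annotations."""
    force = np.asarray(force, dtype=float)
    t = np.asarray(t, dtype=float)
    means = [force[(t >= a) & (t < b)].mean() for a, b in sections]
    return stats.pearsonr(means, effort)
```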

6. Discussion

This paper aimed to highlight both the possibilities and challenges in the utilisation of electromyography technologies in research on effort-related concepts in the performance and teaching of oral music traditions. To set the stage for this endeavour, the paper used a specific case study, namely, research in performing and teaching Iranian art music (radif) on the oud. The analysis comprised two sections: one for examining the hypothesis that muscle force is linked to produced sound levels and for devising formalised descriptions for inferring effort from a number of multimodal (acoustic and movement) features, and the other for examining the strength and direction of the association between measured muscle force and annotated effort values.
In the first part of the analysis, the null hypothesis that muscle force is unrelated to sound levels could not be rejected for isolated strokes, suggesting no straightforward link between the two. Additionally, the results of the LM suggest that a good part of the variance in muscle force levels can be explained through a small set of (six) statistically significant acoustic and movement features extracted from the raw data, leading to the rejection of the null hypothesis that muscle force is unrelated to the musical context and movement qualities.
Despite the deviations and partial violations in the data and the residuals of the LM, and the caution with which the findings of this study for session 1 should accordingly be treated, its development was a useful first step in examining the relationship between measured muscle force and various extracted acoustic and movement features. Interestingly, although a strong association is evident between (maximum) muscle force and sound levels (measured as the RMS of acoustic energy) for repeated strokes (tremolo) of ascending sound levels that Shahhosseini was instructed to perform, this association was not revealed in the linear regression model that was fit to the data of individual strokes for session 1. However, it should be noted that the analysis of individual strokes for session 1 represents only a limited aspect of reality. It isolates these strokes from their practical functionality within larger melodic phrases and from the co-articulation with neighbouring strokes. This means that these strokes are stripped of preparatory phases, and only the core part of the action is examined.
A first attempt to examine the real performance conditions of longer phrases and sections of a live performance was made with a Pearson product moment correlation analysis between measured force values and annotated effort values in the second part of the paper. However, it appears that the annotations refer to a higher level of abstraction than what is measured by the mean values of extracted force data, thus rendering the results inadequate for scrutinizing the null hypothesis.
Based on the insights gained from the present case study and the practical hurdles encountered throughout the empirical investigation, the subsequent section outlines all practical and conceptual challenges.

6.1. Practical Challenges

6.1.1. Challenge #1: Multimodal Data Stream Synchronisation

Readily available hardware solutions for multimodal data synchronisation (be it flash systems, time referencing, or time code synchronisation) proved inadequate, given both the limited compatibility of a wide range of data-capturing hardware with such solutions and the additional complexity they would introduce in the field, particularly in the realm of music research conducted in ecologically valid settings out of the lab. Synchronisation therefore primarily involved two key aspects: ensuring the continuous alignment of different data streams over time to prevent gradual divergence and effectively generating distinct triggering marker points (typically hand claps at the beginning and end) for the precise trimming of individual data streams, including EMG, so that they could be synchronised in post-processing.
The first aspect necessitates averting issues such as unstable sampling rates or dropped individual frames, which commonly lead to time length discrepancies between the recordings of various data types. In theory, stable sampling rates are assured for research-grade and high-end commercial equipment; in practice, however, this may not be the case, as has been observed for various types of consumer video equipment often used in the field (including the GoPro camera recordings of this study) and for motion data previously captured with the Perception Neuron 32 inertial system, leading to a progressive time drift between multiple data streams with no easy solution. Other issues are intermittent malfunctions, including occasional frame drops and, more significantly, sporadic spontaneous shutdowns of GoPro cameras. Although these can be rectified within a controlled laboratory environment by restarting the entire recording session, addressing them becomes impractical in ecologically valid settings, resulting either in the collection of unreliable data or in labour-intensive post-production tasks.
For the second case, at the outset of each teaching session (which lasted over several hours), the performer was asked to produce three distinct handclaps, with each clap directed towards one of the three cameras, to be used for synchronising the systems and calibrating the EMG data from the Myos to maximum muscle contraction values. To enable the manual alignment of independent data streams, a common practice involves capturing a comprehensive video of the entire scene to serve as a reference for the visual confirmation of triggering points and the troubleshooting of synchronisation issues among motion capture data streams (e.g., optical and inertial). However, while effective for overt, observable motion, this approach is less suitable for capturing covert data like EMG. Despite instructing participants to produce distinct trigger points with maximum muscle contraction through powerful handclaps at the beginning and end, the resulting profile may not be exclusively tied to handclaps and could potentially occur throughout the entire recording of the actual performance, leading to ambiguity in trimming signalling points, as illustrated in Figure 8. These graphs display captured EMG data for each of the two hands of both the entire recording (right) and a small initial portion (left), illustrating the ambiguity in identifying peaks that correspond to intentional handclaps. In contrast to controlled laboratory experiments with short recordings and well-defined sound generation instructions or audio stimuli where these points are easier to identify, research in ecologically valid settings, as exemplified here, relies on extended, continuous recordings without necessarily having a precise overview, especially in participatory research scenarios.
Pre-synchronised multimodal data streams may provide effective timestamping and EMG trimming capabilities, as demonstrated by the simultaneous capture of EMG and acceleration values of the Myo Armbands, with peak acceleration values serving as highly reliable markers for impulsive gestures, such as handclaps, as illustrated in Figure 9. However, not all systems comprise built-in acceleration sensors, necessitating additional sensors and complicating the setup, as with Plux.
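The acceleration-based trimming described above can be sketched by picking the most prominent impulsive peaks in the norm of the acceleration signal. This is illustrative Python using SciPy; the 200 ms minimum spacing between claps is an assumption:

```python
import numpy as np
from scipy.signal import find_peaks

def find_clap_markers(acc_norm, fs, n_claps=3):
    """Locate the n_claps most prominent peaks in the acceleration norm,
    e.g. the three handclaps performed at the start of a session, to be
    used as trimming markers for the other data streams."""
    acc_norm = np.asarray(acc_norm, dtype=float)
    # assume claps are at least 200 ms apart
    peaks, props = find_peaks(acc_norm,
                              distance=max(1, int(0.2 * fs)),
                              prominence=0.0)
    strongest = np.argsort(props["prominences"])[::-1][:n_claps]
    return np.sort(peaks[strongest])
```

Because impulsive gestures produce acceleration peaks that dwarf ordinary playing movement, ranking by prominence avoids the ambiguity that plagues EMG-only trigger detection.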
In any case, such complex setups necessitate a large team of researchers and technicians in order to avoid the unnecessary curation and post-processing of data stemming from equipment that malfunctioned during recordings in the field.

6.1.2. Challenge #2: Precision in Trigger-Based EMG Motion Data Alignment

Although the underlying principle behind trimming EMG data is that muscle contraction (hence force) reaches its peak at the moment of hand contact during the hand clap (calculated from the mean value of all eight EMG streams for the Myo Armband, as can be seen in Figure 10), the specific timing of EMG peaking remains unclear, introducing ambiguity in the precise synchronisation of motion, sound, and EMG signals. While an alternative approach is possible through the simultaneous capture of acceleration and EMG data from the Myo Armband, which allows precise trimming based on the acceleration values, this is not necessarily the case for other EMG technologies that only capture muscle contraction.
The highest possible sampling rates should be selected for all used equipment to minimize time deviations in the data synchronization process.

6.1.3. Challenge #3: Methodological Issues in Calibrating Biometric Data

Numerous approaches have been developed over time for the calibration process based on MVC ([114,115,116,117,118,119], to name only a few), but a definitive method remains elusive. Moreover, there is no singular way to predefine the most representative gesture and MVC value to be used for calibrating EMG data during an actual music performance. In recent studies on instrumental music performance gestures, participants have been instructed to either engage in a static isometric muscle contraction or play in what they believed to be their most forceful manner. For instance, in a recent (unpublished) study on drum gestures, participants were instructed to either perform an initial isometric muscle contraction, with or without holding the drumstick, or to strike the drumstick in their most powerful way, both through the air and on a cymbal. Besides the complexity in deciding between these four calibration options, the subjective and perceptual dimension of power resulted in a discrepancy between the MVC established during the initial calibration phase by playing in the most forceful manner and the maximum force recorded during the actual performance. Therefore, more work needs to be carried out on MVC-based calibrations in music performance and education, including the development of an appropriate, music-specific protocol.
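Whatever protocol is eventually adopted, the calibration step itself reduces to expressing the performance EMG envelope relative to the maximum value recorded during the calibration gesture. A minimal sketch (illustrative Python; the function name and input format are assumptions):

```python
import numpy as np

def normalise_to_mvc(emg_env, calibration_env):
    """Express an EMG envelope as a percentage of the maximum voluntary
    contraction (MVC), taken as the peak of the calibration recording.
    Values above 100% flag the discrepancy discussed above, where the
    actual performance exceeds the calibrated 'maximum'."""
    mvc = float(np.max(np.asarray(calibration_env, dtype=float)))
    return 100.0 * np.asarray(emg_env, dtype=float) / mvc
```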

6.2. Misconceptions and Predispositions

The subsequent discussion delineates the misconceptions and predispositions encountered in the process of deriving findings from the case studies. These issues are widely conceptually embedded in the utilisation of EMG data and commonly present in music research, whether for performance or education.

6.2.1. Misconception #1: Straightforward Relationship between EMG and Muscular Force

Due to the challenging and invasive nature of directly measuring muscle force, researchers typically resort to electromyography as a surrogate indicator for muscle activity, as outlined in [120]. The underlying assumption in utilising electromyography technologies is a straightforward and directly proportional relationship between the generated EMG data and the underlying muscular forces. However, this might not be as straightforward as initially thought [121], due to the fact that EMG reflects the electrical rather than the mechanical event of a contraction [122]. While some may advocate for a linear relationship between force and EMG, especially for small muscles [123] and static isometric contractions [124], others have suggested that this assumption is not necessarily valid and that the relationship is far more complex [121,125,126,127,128,129,130,131]. Since an important concern in extracting EMG as a representative measure of muscular force is the choice of a suitable muscle [132], the findings indicate that interpreting force based on electromyography (EMG) should be approached with caution, especially when considering a diverse range of activities.
For instance, it has been shown that while the relationship between EMG and force may exhibit linearity at some force magnitudes, for other value ranges, it is more likely to be nonlinear [121,127,133]. Additionally, the variability in EMG signals is influenced by several factors, such as the following: electrode geometry; electrode placement; physiological and structural properties of muscle fibres; morphological characteristics (type, shape, size, length, velocity, and distance from the skin); impedance at the skin–electrode interface; subcutaneous fat [121,122,133,134,135]; fatigue; individual subjective capacity in achieving physically demanding goals; and even signal processing techniques (e.g., filtering) applied to the EMG data [136]. Despite the acknowledgment of these complexities, there persists an ongoing assumption of a direct association between EMG and underlying muscular forces. This assumption is evident in various applications that infer muscular force from the monitored muscular activation levels through EMG data amplitude [137]. While this may be acceptable for non-medical applications in which sophisticated biomechanical analyses of high accuracy or methodological justifications are not critical, it contrasts with the selection of high-end medical-grade equipment in the context of musical practice when establishing conceptual links to muscular force, as seen in studies like [85], or occasionally by [138].
Furthermore, the magnitude of surface electromyography (sEMG) data is characterised as the time-varying standard deviation of EMG [137] and is often calculated using methods such as the root mean square (RMS) envelope, rectification (including squaring or full-wave rectification) and normalisation, or other non-linear transformations, followed by smoothing through low-pass linear filtering, as demonstrated in [139] and illustrated in Figure 10. Alternative approaches have also been devised to mitigate the undesirable smoothing of intentional rapid changes in the signal, thereby enhancing responsiveness to physical agility. One such approach involves recursive on-line Bayesian algorithms [135], such as the pipo.bayesfilter external object for MaxMsp by [140], which was employed in our studies to track the EMG envelope from the Myo Armbands in the MaxMsp environment. However, it is important to acknowledge that this probabilistic approach introduces a non-transparent and non-linear layer. Most importantly, it restricts direct human intervention and may have an effect, both practical and aesthetic, on the use of EMG data, especially for real-time applications, such as driving synthetic sounds in audio interaction, new electronic musical instruments, etc.
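The conventional amplitude-estimation pipeline described above (rectification, a moving RMS, then low-pass smoothing) can be sketched as follows; the window length and cut-off frequency are illustrative assumptions, and the Bayesian alternative mentioned above is not reproduced here:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def emg_envelope(emg, fs, rms_win_s=0.050, lp_cutoff_hz=5.0):
    """Estimate the EMG amplitude envelope: remove the DC offset,
    full-wave rectify, take a moving RMS over rms_win_s seconds, and
    smooth with a zero-phase 4th-order Butterworth low-pass filter."""
    emg = np.asarray(emg, dtype=float)
    rect = np.abs(emg - emg.mean())            # full-wave rectification
    win = max(1, int(rms_win_s * fs))
    kernel = np.ones(win) / win
    rms = np.sqrt(np.convolve(rect ** 2, kernel, mode="same"))
    b, a = butter(4, lp_cutoff_hz / (fs / 2))  # normalised cut-off
    return filtfilt(b, a, rms)                 # zero-phase smoothing
```

The zero-phase filtfilt stage avoids the delay of causal filtering, but, as noted above, any smoothing trades responsiveness to rapid intentional changes for a cleaner envelope.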

6.2.2. Misconception #2: Straightforward Relationship between EMG/Muscular Force and Effort

The nature of the relationship between effort (often referred to under the more technical term rate of perceived exertion, RPE) and EMG (as a surrogate measure related to the amount of muscular force exerted during a task) has long been a subject of controversy. The proponents of a rather simple force–effort relationship attest that beyond the linear relation between the force produced and the electromyographic measure of muscle activity demonstrated by [141] during isometric contractions, electromyographic measures are also related to psychological experiences of fatigue, heaviness, and ratings of perceived effort [142,143,144,145,146,147,148]. They recognise muscular activity—specifically the ratio of muscular activity to lifting acceleration [149,150,151,152]—during tasks of dynamic touch [149] as a fundamental factor in perceiving the heaviness of handheld objects, a concept akin to physical/muscular effort. Intuitively, objects feel heavier primarily because they are lifted with greater force than lighter-feeling objects. Drawing on this, psychophysiological research has also revealed that the perception of heaviness increases with electromyographic measures of muscle activity [144].
However, for those who question the notion of a simple relationship between EMG and effort, such as [153,154], advocating that this relationship might be more complex, the main objection rests upon the perceptual and subjective character of effort [155] and its compound nature, consisting of both physical and mental aspects, which makes it difficult to quantify. In other words, according to this strand, effort, being a psychophysical measure, is considered inherently subjective, raising concerns about the validity of its assessment compared to direct measures like force measurement and electromyography (EMG). Although the precision of psychophysical measures has been enhanced in recent years through task-specific experiences [156], such as the calibration of perceived sensations by [157], such methods can only be applied in designed experiments under controlled laboratory conditions but not in real music performance and education settings, where participants are not instructed to follow or produce specific distinct stimuli.

6.2.3. Misconception #3: Straightforward Relationship between EMG/Muscular Force/Effort and Sound

There is a prevailing inclination to associate more powerful gestures with higher levels of musical tension or acoustic intensity. It stands to reason that there is an inherent causal gesture–sound relationship embedded in our comprehension that may be ascribed to energetic couplings between powerful actions and the resulting sonic output in sound-producing gestures, whether instrumental or not. It is currently widely acknowledged that there exists a deeper relationship between movement and sound, one that goes beyond mere mechanical couplings, extending to more fundamental and pervasive cognitive schemata [24]. Interestingly, according to [158], a particular amount of muscular effort is necessarily required to perceive heaviness; in other words, a minimum threshold of muscle activation is required for the perception of weight. Drawing on this notion, it would not be a serious leap to also argue that a minimum amount of muscular activation is also required by musicians to haptically feel sound, even in cases of sound-accompanying gestures, such as MIIOs, which explains why musicians are often observed exerting effort in their hands during MIIOs while singing.
However, this does not necessarily imply that more physically powerful movements will lead to perceptually more musically tense or perceptually louder outputs. Similarly, visually perceivable powerful movements may not exclusively correlate with increased tension in sound-accompanying gestures, such as in singing. From personal communication with a number of musicians, it appears that they do not necessarily associate loud sounds with tense muscles. Instead, there seems to be widespread agreement among musicians that generating loud sounds involves swift movements coupled with a heightened sense of control but with minimal muscle tension. These are still unresolved matters that warrant further systematic research.

7. Conclusions

The current paper has raised concerns regarding practical challenges as well as misconceptions and predispositions embedded in the common use of electromyography. Practical insights were drawn from a particular case study in the oud performance and education of Persian art music under semi-controlled research conditions, and they were situated within the relevant literature. On the one hand, the paper has highlighted the unique potential of EMG data as a surrogate of music-related physical effort, a concept that has been acknowledged as critical in music endeavours. Simultaneously, recognising the intricate and subjective nature of effort as a concept related to measured force, and considering the controversies surrounding EMG–force relationships, this paper has also highlighted concerns regarding the simplistic and at times naive application of EMG technology as a metric for power and effort in music research. Concerning the discussed practical challenges encountered during the collection of multimodal data in the field, such as the rigorous demands for data synchronization and post-processing, ongoing technological advancements, like the increase in sampling rates and the evolution of AI synchronisation techniques, are anticipated to offer significant support and reduce the time needed for post-processing [159].
It is essential to recognise the project’s limitations: the LM has been applied to isolated strokes, stripped of their embedded functionality within a real performance, and the Pearson correlation analysis has been applied to a long section of the performance, where the measured force values and the annotated effort values prove inadequate. In addition, given that not all types of gestures under study are necessarily directly responsible for sound production, alternative technologies, such as electroencephalography signals [160], may prove more suitable for this type of work. More systematic work is therefore needed to scrutinise effort-related concepts, suitable technologies and measures, and the relationships between them. While the concepts of force and effort have received thorough examination in sports science, their significance in music performance studies warrants equal attention.
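One known pitfall of applying Pearson correlation to long, slowly varying performance sections is that serial correlation can inflate apparent relationships; correlating first differences is one common remedy (cf. Schubert [110]). A minimal sketch with hypothetical force and annotated-effort series sharing a slow drift (all data below are synthetic and illustrative):

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n = 500
drift = np.cumsum(rng.normal(0, 1, n))   # shared slow drift in both series
force = drift + rng.normal(0, 0.5, n)    # hypothetical measured force
effort = drift + rng.normal(0, 0.5, n)   # hypothetical annotated effort

# Raw correlation is dominated by the shared drift; differencing removes
# the trend and yields a more honest estimate of moment-to-moment covariation.
r_raw, _ = pearsonr(force, effort)
r_diff, _ = pearsonr(np.diff(force), np.diff(effort))
print(f"raw r = {r_raw:.2f}, differenced r = {r_diff:.2f}")
```

The drop from the raw to the differenced coefficient illustrates why a large r over a long section should not be taken at face value.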

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and the protocol was approved by the Research Ethics Committee of the Hellenic Mediterranean University (protocol code 10242, date of approval 21 March 2024).

Informed Consent Statement

Written informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

I want to express my gratitude to Yasamin Shahhosseini for her participation.

Conflicts of Interest

The author declares no conflicts of interest.

Appendix A

Figure A1. Yasamin Shahhosseini (plucking gestures): session 1.3. (a) Maximum smoothed force and (b) raw filtered EMG, along with visually pertinent features plotted over time: (c) sound levels (acoustic energy RMS) in blue and filtered values in orange, (d) logarithmic spectral brightness in blue and filtered values in orange, (e) vertical (z) distance from rest position, (f) vertical (z) smoothed velocity, (g) norm of raw acceleration data, (h) norm of smoothed absolute acceleration values. Time is displayed on different scales (msec, sec, or number of frames, depending on the frame rate of each respective capture technology); however, all panels refer to the same duration (~23 s).
Figure A2. Yasamin Shahhosseini (plucking gestures): session 1.9. (a) Maximum smoothed force and (b) raw filtered EMG, along with visually pertinent features plotted over time: (c) sound levels (acoustic energy RMS) in blue and filtered values in orange, (d) logarithmic spectral brightness in blue and filtered values in orange, (e) vertical (z) distance from rest position, (f) vertical (z) smoothed velocity, (g) norm of raw acceleration data, (h) norm of smoothed absolute acceleration values. Time is displayed on different scales (msec, sec, or number of frames, depending on the frame rate of each respective capture technology); however, all panels refer to the same duration (~25 s).
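The visually pertinent features listed in the appendix captions can be approximated from raw audio and accelerometer streams. The sketch below, run on synthetic demo signals, treats spectral brightness as the spectral centroid; this is an assumption for illustration, and the study’s exact feature definitions, frame sizes, and smoothing may differ.

```python
import numpy as np

def rms_level(audio, frame_len=1024, hop=512):
    """Short-time RMS sound level, as plotted in panel (c)."""
    n = 1 + (len(audio) - frame_len) // hop
    return np.array([np.sqrt(np.mean(audio[i * hop:i * hop + frame_len] ** 2))
                     for i in range(n)])

def spectral_brightness(audio, fs, frame_len=1024, hop=512):
    """Spectral centroid per frame; the figures plot its logarithm (panel d)."""
    freqs = np.fft.rfftfreq(frame_len, 1 / fs)
    n = 1 + (len(audio) - frame_len) // hop
    out = np.empty(n)
    for i in range(n):
        mag = np.abs(np.fft.rfft(audio[i * hop:i * hop + frame_len]))
        out[i] = np.sum(freqs * mag) / (np.sum(mag) + 1e-12)
    return out

def accel_norm(acc_xyz):
    """Euclidean norm of 3-axis accelerometer samples (panels g, h)."""
    return np.linalg.norm(acc_xyz, axis=1)

# Hypothetical demo data: a bin-aligned sine tone and a constant acceleration.
fs = 8000
t = np.arange(0, 1, 1 / fs)
audio = np.sin(2 * np.pi * 437.5 * t)  # 437.5 Hz aligns with an FFT bin
acc = np.column_stack([np.ones_like(t), np.zeros_like(t), np.zeros_like(t)])

levels = rms_level(audio)
brightness = spectral_brightness(audio, fs)
norms = accel_norm(acc)
```

For a pure sine tone, the RMS level sits at 1/√2 of the peak amplitude and the centroid sits at the tone’s frequency, which provides a quick sanity check for the feature pipeline.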

References

  1. Leman, M. Embodied Music Cognition and Mediation Technology; The MIT Press: Cambridge, MA, USA, 2007; ISBN 978-0-262-25655-1. [Google Scholar]
  2. Ratcliffe, M. Perception, Exploration, and the Primacy of Touch. In The Oxford Handbook of 4E Cognition; Newen, A., De Bruin, L., Gallagher, S., Eds.; Oxford University Press: Oxford, UK, 2018; pp. 280–300. ISBN 978-0-19-873541-0. [Google Scholar]
  3. Glennie, E. Evelyn Glennie: Teach the World to Listen. Available online: https://www.evelyn.co.uk/hearing-essay/ (accessed on 1 April 2024).
  4. Papetti, S.; Saitis, C. Musical Haptics: Introduction. In Musical Haptics; Papetti, S., Saitis, C., Eds.; Springer Series on Touch and Haptic Systems; Springer International Publishing: Cham, Switzerland, 2018; pp. 1–7. ISBN 978-3-319-58316-7. [Google Scholar]
  5. Paschalidou, S. Effort Inference and Prediction by Acoustic and Movement Descriptors in Interactions with Imaginary Objects during Dhrupad Vocal Improvisation. Wearable Technol. 2022, 3, e14. [Google Scholar] [CrossRef]
  6. Tanaka, A. Intention, Effort, and Restraint: The EMG in Musical Performance. Leonardo 2015, 48, 298–299. [Google Scholar] [CrossRef]
  7. Tanaka, A.; Altavilla, A.; Spowage, N. Gestural Musical Affordances. In Proceedings of the 9th Sound and Music Computing Conference, Copenhagen, Denmark, 11–14 July 2012; pp. 318–325. [Google Scholar]
  8. Krefeld, V.; Waisvisz, M. The Hand in the Web: An Interview with Michel Waisvisz. Comput. Music J. 1990, 14, 28. [Google Scholar] [CrossRef]
  9. Bennett, P.; Ward, N.; O’Modhrain, S.; Rebelo, P. DAMPER: A Platform for Effortful Interface Development. In Proceedings of the 7th International Conference on New Interfaces for Musical Expression, New York, NY, USA, 6–10 June 2007; Association for Computing Machinery: New York, NY, USA, 2007; pp. 273–276. [Google Scholar]
  10. Rahaim, M. Musicking Bodies: Gesture and Voice in Hindustani Music; Music/culture; Wesleyan University Press: Middletown, CT, USA, 2012; ISBN 978-0-8195-7325-4. [Google Scholar]
  11. De Poli, G.; Mion, L.; Rodà, A. Toward an Action Based Metaphor for Gestural Interaction with Musical Contents. J. New Music Res. 2009, 38, 295–307. [Google Scholar] [CrossRef]
  12. Johnson, M.L.; Larson, S. “Something in the Way She Moves”—Metaphors of Musical Motion. Metaphor Symbol 2003, 18, 63–84. [Google Scholar] [CrossRef]
  13. Pearson, L.; Pouw, W. Gesture-Vocal Coupling in Karnatak Music Performance: A Neuro-Bodily Distributed Aesthetic Entanglement. Ann. N. Y. Acad. Sci. 2022, 1515, 219–236. [Google Scholar] [CrossRef]
  14. Paschalidou, S. Effort in Gestural Interactions with Imaginary Objects in Hindustani Dhrupad Vocal Music. Ph.D. Thesis, Durham University, Durham, UK, 2017. [Google Scholar]
  15. Luciani, A.; Florens, J.-L.; Couroussé, D.; Cadoz, C. Ergotic Sounds: A New Way to Improve Playability, Believability and Presence of Digital Musical Instruments. In Proceedings of the ENACTIVE/07, Grenoble, France, 22 November 2007. [Google Scholar]
  16. Merleau-Ponty, M.; Landes, D.A.; Carman, T.; Lefort, C.; Merleau-Ponty, M. Phenomenology of Perception, 1st ed.; Routledge: London, UK; New York, NY, USA, 2014; ISBN 978-0-415-83433-9. [Google Scholar]
  17. Shapiro, L.A. Embodied Cognition, 2nd ed.; New Problems of Philosophy; Routledge/Taylor & Francis Group: London, UK; New York, NY, USA, 2019; ISBN 978-1-138-74699-2. [Google Scholar]
  18. Noë, A. Action in Perception, 1st ed.; Representation and Mind; MIT Press: Cambridge, MA, USA, 2006; ISBN 978-0-262-64063-3. [Google Scholar]
  19. Varela, F.J.; Thompson, E.; Rosch, E. The Embodied Mind: Cognitive Science and Human Experience; MIT Press: Cambridge, MA, USA, 1993; ISBN 978-0-262-72021-2. [Google Scholar]
  20. Gibson, J.J. The Ecological Approach to Visual Perception; Erlbaum: Hillsdale, NJ, USA, 1986; ISBN 978-0-89859-958-9. [Google Scholar]
  21. Gallagher, S. Embodied and Enactive Approaches to Cognition; Cambridge Elements in Philosophy of Mind; Cambridge University Press: Cambridge, UK; New York, NY, USA, 2023; ISBN 978-1-00-920980-9. [Google Scholar]
  22. Schiavio, A.; Menin, D. Embodied Music Cognition and Mediation Technology: A Critical Review. Psychol. Music 2013, 41, 804–814. [Google Scholar] [CrossRef]
  23. Zbikowski, L.M. Conceptualizing Music: Cognitive Structure, Theory, and Analysis; AMS Studies in Music; Oxford University Press: Oxford, UK; New York, NY, USA, 2002; ISBN 978-0-19-514023-1. [Google Scholar]
  24. Godøy, R.I. Gestural-Sonorous Objects: Embodied Extensions of Schaeffer’s Conceptual Apparatus. Org. Sound 2006, 11, 149–157. [Google Scholar] [CrossRef]
  25. van der Schyff, D.; Schiavio, A.; Walton, A.; Velardo, V.; Chemero, A. Musical Creativity and the Embodied Mind: Exploring the Possibilities of 4E Cognition and Dynamical Systems Theory. Music Sci. 2018, 1, 205920431879231. [Google Scholar] [CrossRef]
  26. Clayton, M.; Rao, P.; Shikarpur, N.; Roychowdhury, S.; Li, J. Raga Classification from Vocal Performances Using Multimodal Analysis. In Proceedings of the 23rd International Society for Music Information Retrieval Conference, Bengaluru, India, 16 December 2022. [Google Scholar]
  27. Naveda, L.; Nunes-Silva, M. Breaking down the Musician’s Minds: How Small Changes in the Musical Instrument Can Impair Your Musical Performance. J. New Music Res. 2021, 50, 373–391. [Google Scholar] [CrossRef]
  28. Reybrouck, M. Music Listening as Kangaroo Mother Care: From Skin-to-Skin Contact to Being Touched by the Music. Acoustics 2024, 6, 35–64. [Google Scholar] [CrossRef]
  29. Reybrouck, M. Musical Sense-Making: Enactment, Experience and Computation, 1st ed.; Routledge/Taylor & Francis Group: London, UK, 2021; ISBN 978-0-429-27401-5. [Google Scholar]
  30. Bremmer, M.; Nijs, L. The Role of the Body in Instrumental and Vocal Music Pedagogy: A Dynamical Systems Theory Perspective on the Music Teacher’s Bodily Engagement in Teaching and Learning. Front. Educ. 2020, 5, 79. [Google Scholar] [CrossRef]
  31. Leman, M.; Maes, P.-J.; Nijs, L.; Van Dyck, E. What Is Embodied Music Cognition? In Springer Handbook of Systematic Musicology; Bader, R., Ed.; Springer Handbooks; Springer: Berlin/Heidelberg, Germany, 2018; pp. 747–760. ISBN 978-3-662-55002-1. [Google Scholar]
  32. Lesaffre, M. Investigating Embodied Music Cognition for Health and Well-Being. In Springer Handbook of Systematic Musicology; Bader, R., Ed.; Springer Handbooks; Springer: Berlin/Heidelberg, Germany, 2018; pp. 779–791. ISBN 978-3-662-55002-1. [Google Scholar]
  33. Van Der Schyff, D.; Schiavio, A. Evolutionary Musicology Meets Embodied Cognition: Biocultural Coevolution and the Enactive Origins of Human Musicality. Front. Neurosci. 2017, 11, 519. [Google Scholar] [CrossRef] [PubMed]
  34. Pearson, L. Gesture in Karnatak Music: Pedagogy and Musical Structure in South India. Ph.D. Thesis, Durham University, Durham, UK, 2016. [Google Scholar]
  35. Cox, A. Music and Embodied Cognition: Listening, Moving, Feeling, and Thinking; Indiana University Press: Bloomington, IN, USA, 2016; ISBN 978-0-253-02167-0. [Google Scholar]
  36. Schiavio, A.; Høffding, S. Playing Together without Communicating? A Pre-Reflective and Enactive Account of Joint Musical Performance. Music. Sci. 2015, 19, 366–388. [Google Scholar] [CrossRef]
  37. Clayton, M.; Leante, L. Embodiment in Music Performance. In Experience and Meaning in Music Performance; Clayton, M., Dueck, B., Leante, L., Eds.; Oxford University Press: Oxford, UK, 2013; pp. 188–207. ISBN 978-0-19-981132-8. [Google Scholar]
  38. Johnson, M.L. The Body in the Mind: The Bodily Basis of Meaning, Imagination, and Reason; University of Chicago Press: Chicago, IL, USA, 1992; ISBN 978-0-226-40318-2. [Google Scholar]
  39. Thompson, M. The Application of Motion Capture to Embodied Music Cognition Research. Ph.D. Thesis, University of Jyväskylä, Jyväskylä, Finland, 2012. [Google Scholar]
  40. Maletic, V. Body, Space, Expression: The Development of Rudolf Laban’s Movement and Dance Concepts; Approaches to Semiotics; Mouton de Gruyter: Berlin, Germany; New York, NY, USA; Amsterdam, The Netherlands, 1987; ISBN 978-3-11-010780-7. [Google Scholar]
  41. Bartenieff, I.; Lewis, D. Body Movement: Coping with the Environment; Gordon and Breach Science Publishers: New York, NY, USA, 1980; ISBN 978-0-677-05500-8. [Google Scholar]
  42. Von Laban, R.; Lawrence, F.C. Effort: Economy in Body Movement, 2nd ed.; Plays: Boston, MA, USA, 1974; ISBN 978-0-8238-0160-2. [Google Scholar]
  43. Camurri, A.; Lagerlöf, I.; Volpe, G. Recognizing Emotion from Dance Movement: Comparison of Spectator Recognition and Automated Techniques. Int. J. Hum.-Comput. Stud. 2003, 59, 213–225. [Google Scholar] [CrossRef]
  44. Ryan, J. Some Remarks on Musical Instrument Design at STEIM. Contemp. Music Rev. 1991, 6, 3–17. [Google Scholar] [CrossRef]
  45. Kurth, E. Grundlagen des Linearen Kontrapunkts; Bachs Melodische Polyphonie; M. Hesse: Berlin, Germany, 1922. [Google Scholar]
  46. Lerdahl, F.; Krumhansl, C.L. Modeling Tonal Tension. Music Percept. 2007, 24, 329–366. [Google Scholar] [CrossRef]
  47. Olsen, K.N.; Dean, R.T. Does Perceived Exertion Influence Perceived Affect in Response to Music? Investigating the “FEELA” Hypothesis. Psychomusicol. Music Mind Brain 2016, 26, 257–269. [Google Scholar] [CrossRef]
  48. Vertegaal, R.; Ungvary, T.; Kieslinger, M. Towards a Musician’s Cockpit: Transducers, Feedback and Musical Function; ICMA: Ann Arbor, MI, USA, 1996. [Google Scholar]
  49. d’Escriván, J. To Sing the Body Electric: Instruments and Effort in the Performance of Electronic Music. Contemp. Music Rev. 2006, 25, 183–191. [Google Scholar] [CrossRef]
  50. Garnett, G.E.; Goudeseune, C. Performance Factors in Control of High-Dimensional Spaces. In Proceedings of the 25th International Computer Music Conference, ICMC 1999, Beijing, China, 22–28 October 1999; Michigan Publishing: Beijing, China, 1999. [Google Scholar]
  51. Castagné, N.; Cadoz, C. A Goals-Based Review of Physical Modelling. In Proceedings of the International Computer Music Conference—ICMC 2005, Barcelona, Spain, 4–10 September 2005; pp. 343–346. [Google Scholar]
  52. Ward, N. Effortful Interaction: A New Paradigm for the Design of Digital Musical Instruments. Ph.D. Thesis, Queen’s University Belfast, Belfast, UK, 2013. [Google Scholar]
  53. Solomos, M. Iannis Xenakis; P.O. Editions: Mercuès, France, 1996. [Google Scholar]
  54. Varga, B.A. Conversations with Iannis Xenakis; Faber and Faber: London, UK, 1996; ISBN 978-0-571-17959-6. [Google Scholar]
  55. Steele, J. What Is (Perception of) Effort? Objective and Subjective Effort during Attempted Task Performance. PsyArXiv 2020. [Google Scholar] [CrossRef]
  56. Dewey, J. The Psychology of Effort. Philos. Rev. 1897, 6, 43. [Google Scholar] [CrossRef]
  57. Tanaka, A.; Knapp, B. Multimodal Interaction In Music Using The Electromyogram And Relative Position Sensing. In Proceedings of the New Interfaces for Musical Expression, NIME 2002, Dublin, Ireland, 1 June 2002; Zenodo: Dublin, Ireland, 2002. [Google Scholar]
  58. Piana, S.; Mancini, M.; Camurri, A.; Varni, G.; Volpe, G. Automated Analysis of Non-Verbal Expressive Gesture. In Human Aspects in Ambient Intelligence; Bosse, T., Cook, D.J., Neerincx, M., Sadri, F., Eds.; Atlantis Ambient and Pervasive Intelligence; Atlantis Press: Paris, France, 2013; Volume 8, pp. 41–54. ISBN 978-94-6239-017-1. [Google Scholar]
  59. Castellano, G.; Mancini, M. Analysis of Emotional Gestures for the Generation of Expressive Copying Behaviour in an Embodied Agent. In Gesture-Based Human-Computer Interaction and Simulation; Sales Dias, M., Gibet, S., Wanderley, M.M., Bastos, R., Eds.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2009; Volume 5085, pp. 193–198. ISBN 978-3-540-92864-5. [Google Scholar]
  60. Mazzarino, B.; Peinado, M.; Boulic, R.; Volpe, G.; Wanderley, M.M. Improving the Believability of Virtual Characters Using Qualitative Gesture Analysis. In Gesture-Based Human-Computer Interaction and Simulation, Proceedings of the 7th International Gesture Workshop, GW 2007, Lisbon, Portugal, 23–25 May 2007; Sales Dias, M., Gibet, S., Wanderley, M.M., Bastos, R., Eds.; Springer: Berlin/Heidelberg, Germany, 2009; pp. 48–56. [Google Scholar]
  61. Chi, D.M. A Motion Control Scheme for Animating Expressive Arm Movements. Ph.D. Thesis, University of Pennsylvania, Philadelphia, PA, USA, 1999. [Google Scholar]
  62. MIDI Association. Official MIDI Specifications. Available online: https://www.midi.org/specifications/midi-2-0-specifications (accessed on 1 April 2024).
  63. Pageaux, B. Perception of Effort in Exercise Science: Definition, Measurement and Perspectives. Eur. J. Sport Sci. 2016, 16, 885–894. [Google Scholar] [CrossRef] [PubMed]
  64. Jagiello, R.; Pomper, U.; Yoneya, M.; Zhao, S.; Chait, M. Rapid Brain Responses to Familiar vs. Unfamiliar Music—An EEG and Pupillometry Study. Sci. Rep. 2019, 9, 15570. [Google Scholar] [CrossRef] [PubMed]
  65. O’Shea, H.; Moran, A. Are Fast Complex Movements Unimaginable? Pupillometric Studies of Motor Imagery in Expert Piano Playing. J. Mot. Behav. 2019, 51, 371–384. [Google Scholar] [CrossRef] [PubMed]
  66. Ward, N.; Ortiz, M.; Bernardo, F.; Tanaka, A. Designing and Measuring Gesture Using Laban Movement Analysis and Electromyogram. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct, Heidelberg, Germany, 12 September 2016; ACM: Heidelberg, Germany, 2016; pp. 995–1000. [Google Scholar]
  67. Gibet, S. Sensorimotor Control of Sound-Producing Gestures, Musical Gestures—Sound, Movement, and Meaning. In Musical Gestures: Sound, Movement, and Meaning; Routledge: Milton Park, UK, 2010. [Google Scholar]
  68. Marcora, S. Perception of Effort during Exercise Is Independent of Afferent Feedback from Skeletal Muscles, Heart, and Lungs. J. Appl. Physiol. 2009, 106, 2060–2062. [Google Scholar] [CrossRef]
  69. Enoka, R.M.; Stuart, D.G. Neurobiology of Muscle Fatigue. J. Appl. Physiol. 1992, 72, 1631–1648. [Google Scholar] [CrossRef] [PubMed]
  70. Kahneman, D. Attention and Effort; Prentice-Hall: Englewood Cliffs, NJ, USA, 1973. [Google Scholar]
  71. Hashimoto, Y.; Ushiba, J.; Kimura, A.; Liu, M.; Tomita, Y. Correlation between EEG–EMG Coherence during Isometric Contraction and Its Imaginary Execution. Acta Neurobiol. Exp. 2010, 70, 76–85. [Google Scholar] [CrossRef]
  72. Bi, L.; Feleke, G.A.; Guan, C. A Review on EMG-Based Motor Intention Prediction of Continuous Human Upper Limb Motion for Human-Robot Collaboration. Biomed. Signal Process. Control 2019, 51, 113–127. [Google Scholar] [CrossRef]
  73. Cavalcanti Garcia, M.A.; Vieira, T.M.M. Surface Electromyography: Why, When and How to Use It. Rev. Andal. Med. Deporte 2011, 4, 17–28. [Google Scholar]
  74. Jung, J.-K.; Im, Y.-G. Can the Subject Reliably Reproduce Maximum Voluntary Contraction of Temporalis and Masseter Muscles in Surface EMG? J. Craniomandib. Sleep Pract. 2022, 1–10. [Google Scholar] [CrossRef]
  75. Hofmann, D.; Jiang, N.; Vujaklija, I.; Farina, D. Bayesian Filtering of Surface EMG for Accurate Simultaneous and Proportional Prosthetic Control. IEEE Trans. Neural Syst. Rehabil. Eng. 2016, 24, 1333–1341. [Google Scholar] [CrossRef] [PubMed]
  76. Donnarumma, M.; Tanaka, A. Principles, Challenges and Future Directions of Physiological Computing for the Physical Performance of Digital Musical Instruments. In Proceedings of the 9th Conference on Interdisciplinary Musicology—CIM14, Berlin, Germany, 4–6 December 2014. [Google Scholar]
  77. Tanaka, A.; Fierro, D.; Klang, M.; Whitmarsh, S. The EAVI ExG Muscle/Brain Hybrid Physiological Sensing. In Proceedings of the New Instruments for Musical Expression, Mexico City, Mexico, 31 May–3 June 2023. [Google Scholar]
  78. Françoise, J.; Fdili Alaoui, S.; Candau, Y. CO/DA: Live-Coding Movement-Sound Interactions for Dance Improvisation. In Proceedings of the CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 29 April 2022; ACM: New Orleans, LA, USA, 2022; pp. 1–13. [Google Scholar]
  79. Zbyszynski, M.; Tanaka, A.; Visi, F. Interactive Machine Learning: Strategies for Live Performance Using Electromyography. In Open Source Biomedical Engineering; Springer: Berlin/Heidelberg, Germany, 2020. [Google Scholar]
  80. Dalmazzo, D.; Tassani, S.; Ramírez, R. A Machine Learning Approach to Violin Bow Technique Classification: A Comparison Between IMU and MOCAP Systems. In Proceedings of the 5th International Workshop on Sensor-based Activity Recognition and Interaction, Berlin, Germany, 20 September 2018; ACM: Berlin, Germany, 2018; pp. 1–8. [Google Scholar]
  81. Caramiaux, B.; Donnarumma, M.; Tanaka, A. Understanding Gesture Expressivity through Muscle Sensing. ACM Trans. Comput.-Hum. Interact. 2015, 21, 1–26. [Google Scholar] [CrossRef]
  82. Chong, H.J.; Kim, S.J.; Lee, E.K.; Yoo, G.E. Analysis of Surface EMG Activation in Hand Percussion Playing Depending on the Grasping Type and the Tempo. J. Exerc. Rehabil. 2015, 11, 228–235. [Google Scholar] [CrossRef] [PubMed]
  83. Visentin, P.; Shan, G. Applications of EMG Pertaining to Music Performance—A Review. In Arts Biomechanics; Nova Science Publishers: Hauppauge, NY, USA, 2011; Volume 1, ISBN 2156-5724. [Google Scholar]
  84. Tsubouchi, Y.; Suzuki, K. BioTones: A Wearable Device for EMG Auditory Biofeedback. In Proceedings of the 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology, Buenos Aires, Argentina, 31 August–4 September 2010; IEEE: Buenos Aires, Argentina, 2010; pp. 6543–6546. [Google Scholar]
  85. Verdugo, F.; Ceglia, A.; Frisson, C.; Burton, A.; Begon, M.; Gibet, S.; Wanderley, M.M. Feeling the Effort of Classical Musicians—A Pipeline from Electromyography to Smartphone Vibration for Live Music Performance. In Proceedings of the NIME 2022, Auckland, New Zealand, 28 June 2022; The University of Auckland: Auckland, New Zealand, 2022. [Google Scholar]
  86. Kjelland, J.M. Application of Electromyography and Electromyographic Biofeedback in Music Performance Research: A Review of the Literature since 1985. Med. Probl. Perform. Artist. 2000, 15, 115–118. [Google Scholar] [CrossRef]
  87. Morasky, R.L.; Creech, R.; Clarke, G. Using Biofeedback to Reduce Left Arm Extensor EMG of String Players during Musical Performance. Biofeedback Self-Regul. 1981, 6, 565–572. [Google Scholar] [CrossRef] [PubMed]
  88. Tateno, S.; Liu, H.; Ou, J. Development of Sign Language Motion Recognition System for Hearing-Impaired People Using Electromyography Signal. Sensors 2020, 20, 5807. [Google Scholar] [CrossRef] [PubMed]
  89. Antoniadis, P.; Paschalidou, S.; Duval, A.; Jégo, J.-F.; Bevilacqua, F. Rendering Embodied Experience into Multimodal Data: Concepts, Tools and Applications for Xenakis’ Piano Performance. In Proceedings of the Xenakis 22: Centenary International Symposium, Athens & Nafplio, Greece, 24–29 May 2022. [Google Scholar]
  90. Antoniadis, P. Physicality as a Performer-Specific Perspectival Point to I. Xenakis’s Piano Work: Case Study Mists. In Proceedings of the Xenakis International Symposium, Southbank Centre, London, UK, 1–3 April 2011. [Google Scholar]
  91. Tanaka, A.; Ortiz, M. Gestural Musical Performance with Physiological Sensors, Focusing on the Electromyogram. In The Routledge Companion to Embodied Music Interaction; Lesaffre, M., Maes, P.-J., Leman, M., Eds.; Routledge: New York, NY, USA; London, UK, 2017; pp. 420–428. ISBN 978-1-315-62136-4. [Google Scholar]
  92. Tanaka, A. The Use of Electromyogram Signals (EMG) in Musical Performance: A Personal Survey of Two Decades of Practice by Atau Tanaka. eContact! Biotechnological Performance Practice/Pratiques de Performance Biotechnologique 2014, 14, 10. [Google Scholar]
  93. Donnarumma, M.; Caramiaux, B.; Tanaka, A. Muscular Interactions Combining EMG and MMG Sensing for Musical Practice; KAIST: Daejeon, Republic of Korea, 2013. [Google Scholar]
  94. Tanaka, A. Musical Technical Issues in Using Interactive Instrument Technology with Application to the BioMuse; International Computer Music Association: San Francisco, CA, USA, 1993. [Google Scholar]
  95. Bongers, B. Sensorband: An Interview with Sensorband. Comput. Music J. 1998, 22, 13. [Google Scholar] [CrossRef]
  96. Donnarumma, M. XTH SENSE: A Study of Muscle Sounds for an Experimental Paradigm of Musical Performance. In Proceedings of the International Computer Music Conference, Huddersfield, UK, 31 July–5 August 2011. [Google Scholar]
  97. Antoniadis, P.; Jego, J.-F.; Duval, A.; Paschalidou, S.; Bevilacqua, F.; Solomos, M. Augmented Recital: Habiter (Avec) Xenakis. Available online: https://www.jfcad.com/habiter-avec-xenakis/ (accessed on 1 April 2024).
  98. Raurale, S.A.; McAllister, J.; Del Rincon, J.M. Real-Time Embedded EMG Signal Analysis for Wrist-Hand Pose Identification. IEEE Trans. Signal Process. 2020, 68, 2713–2723. [Google Scholar] [CrossRef]
  99. François, J. Myo for Max. Available online: https://github.com/JulesFrancoise/myo-for-max (accessed on 1 April 2024).
  100. Cycling ’74 Max. Available online: https://cycling74.com/products/max (accessed on 1 April 2024).
  101. BITalino (r)Evolution Plugged Kit BLE/BT. Available online: https://www.pluxbiosignals.com/collections/biosignals-for-education/products/bitalino-revolution-plugged-kit-ble-bt (accessed on 1 April 2024).
  102. Optitrack. Available online: https://optitrack.com/ (accessed on 1 April 2024).
  103. Xsens Link. Available online: https://www.movella.com/products/motion-capture/xsens-mvn-link (accessed on 1 April 2024).
  104. Shadow. Available online: https://www.motionshadow.com/ (accessed on 1 April 2024).
  105. Noitom Perception Neuron Series. Available online: https://www.noitom.com/perception-neuron-series (accessed on 1 April 2024).
  106. Shafiei, S. Analysis of Vocal Ornamentation in Iranian Classical Music. In Proceedings of the 16th Sound and Music Computing Conference, Málaga, Spain, 28–31 May 2019; pp. 437–441. [Google Scholar]
  107. Nikzat, B.; Caro Repetto, R. KDC: An Open Corpus for Computational Research of Dastgahi Music. In Proceedings of the 23rd International Society for Music Information Retrieval Conference (ISMIR), Bengaluru, India, 4–8 December 2022; pp. 321–328. [Google Scholar]
  108. Candau, Y.; Françoise, J.; Alaoui, S.F.; Schiphorst, T. Cultivating Kinaesthetic Awareness through Interaction: Perspectives from Somatic Practices and Embodied Cognition. In Proceedings of the 4th International Conference on Movement Computing, London, UK, 28 June 2017; ACM: London, UK, 2017; pp. 1–8. [Google Scholar]
  109. Hofmann, D. Myoelectric Signal Processing for Prosthesis Control. Ph.D. Thesis, University of Göttingen, Göttingen, Germany, 2013. [Google Scholar]
  110. Schubert, E. Correlation Analysis of Continuous Emotional Response to Music: Correcting for the Effects of Serial Correlation. Music. Sci. 2001, 5, 213–236. [Google Scholar] [CrossRef]
  111. Stergiou, N.; Decker, L.M. Human Movement Variability, Nonlinear Dynamics, and Pathology: Is There a Connection? Hum. Mov. Sci. 2011, 30, 869–888. [Google Scholar] [CrossRef] [PubMed]
  112. Moore, D.S.; McCabe, G.P. Introduction to the Practice of Statistics, 3rd ed.; W.H. Freeman: New York, NY, USA, 1999; ISBN 978-0-7167-3502-1. [Google Scholar]
  113. Godøy, R.I. Understanding Coarticulation in Musical Experience. In Sound, Music, and Motion; Aramaki, M., Derrien, O., Kronland-Martinet, R., Ystad, S., Eds.; Lecture Notes in Computer Science; Springer International Publishing: Cham, Switzerland, 2014; Volume 8905, pp. 535–547. ISBN 978-3-319-12975-4. [Google Scholar]
  114. Akinnola, O.O.; Vardakastani, V.; Kedgley, A.E. Identifying Tasks to Elicit Maximum Voluntary Contraction in the Muscles of the Forearm. J. Electromyogr. Kinesiol. 2020, 55, 102463. [Google Scholar] [CrossRef] [PubMed]
  115. Kukla, M.; Wieczorek, B.; Warguła, Ł. Development of Methods for Performing the Maximum Voluntary Contraction (MVC) Test. MATEC Web Conf. 2018, 157, e05015. [Google Scholar] [CrossRef]
  116. Dahlqvist, C.; Nordander, C.; Granqvist, L.; Forsman, M.; Hansson, G.-Å. Comparing Two Methods to Record Maximal Voluntary Contractions and Different Electrode Positions in Recordings of Forearm Extensor Muscle Activity: Refining Risk Assessments for Work-Related Wrist Disorders. WOR 2018, 59, 231–242. [Google Scholar] [CrossRef] [PubMed]
  117. Al-Qaisi, S.; Aghazadeh, F. Electromyography Analysis: Comparison of Maximum Voluntary Contraction Methods for Anterior Deltoid and Trapezius Muscles. Procedia Manuf. 2015, 3, 4578–4583. [Google Scholar] [CrossRef]
  118. Rainoldi, A.; Bullock-Saxton, J.E.; Cavarretta, F.; Hogan, N. Repeatability of Maximal Voluntary Force and of Surface EMG Variables during Voluntary Isometric Contraction of Quadriceps Muscles in Healthy Subjects. J. Electromyogr. Kinesiol. 2001, 11, 425–438. [Google Scholar] [CrossRef]
  119. Farina, D.; Merletti, R.; Rainoldi, A.; Buonocore, M.; Casale, R. Two Methods for the Measurement of Voluntary Contraction Torque in the Biceps Brachii Muscle. Med. Eng. Phys. 1999, 21, 533–540. [Google Scholar] [CrossRef] [PubMed]
  120. Staudenmann, D.; Roeleveld, K.; Stegeman, D.F.; Van Dieën, J.H. Methodological Aspects of SEMG Recordings for Force Estimation—A Tutorial and Review. J. Electromyogr. Kinesiol. 2010, 20, 375–387. [Google Scholar] [CrossRef] [PubMed]
Figure 1. EMG use in music embodiment research of instrumental sound-producing gestures in performance and teaching: (a) in controlled laboratory settings, where EMG is captured by Myo Armband and Bitalino Plux sensors for between-device comparison, combined with a full-body inertial motion capture system (Xsens) and a marker-based optical multi-camera system (Optitrack); (b) in the controlled environment of a domestic space (private lesson and performance on the Iranian oud in radif music of various dastgahs with Yasamin Shahhosseini), where EMG is captured by a Myo Armband on the right (plucking) hand and full-body motion is captured by an Xsens inertial (IMU) system and a markerless multi-camera system (three GoPro Hero 7 Black cameras); (c) a custom-made program in Cycling '74 Max/MSP v.7 for simultaneous monitoring, capture, and playback of EMG and inertial data.
Figure 2. Yasamin Shahhosseini (plucking gestures): Case study on EMG use in music embodiment research of oud sound-producing plucking gestures in performance and teaching under semi-controlled settings. (ac) Perspectives captured by three distinct cameras, (d) a still extracted from Xsens motion capture animation output.
Figure 3. Yasamin Shahhosseini (plucking gestures): Histogram of maximum smoothed force values displaying deviations from a normal distribution.
Figure 4. Yasamin Shahhosseini (plucking gestures): LM for isolated plucking gestures of session 1. Scatterplots of explanatory features in pairs.
Figure 5. Yasamin Shahhosseini (plucking gestures): LM for isolated plucking gestures of session 1. Diagnostic plots: (a) residuals vs. fitted values, (b) normal quantile to quantile, (c) scale–location, and (d) residuals vs. leverage.
Figure 6. Yasamin Shahhosseini (plucking gestures): LM for isolated plucking gestures of session 1. Boxplots for all features.
Figure 7. Yasamin Shahhosseini (plucking gestures): real oud performance conditions of session 3. Scatterplot of extracted mean smoothed force values against annotated effort levels.
Figure 8. Raw EMG Plux data for the left and right hand (bottom and top): (a) left: zoomed in to the start of the recording, where the peaks of the handclaps can be readily identified; (b) right: the entire duration of a recording, where EMG peaks of various impulsive gestures can be identified, rendering the extraction of the handclaps at the beginning and end of the recording challenging.
Figure 9. Three-dimensional acceleration data plotted over time (in frames) for all joints of a full-body inertial motion capture system. Peaks of well-discernible handclaps can be readily identified for both hands at the beginning and end.
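The synchronisation step that Figure 9 illustrates — locating the handclap spikes that are shared across modalities — can be sketched as below. This is a minimal illustration on synthetic data under assumed parameters (frame rate, threshold factor, minimum gap between claps), not the study's actual pipeline.

```python
import numpy as np
from scipy.signal import find_peaks

def clap_frames(acc_xyz, fs=240.0, min_gap_s=0.3, thresh_factor=5.0):
    """Return frame indices of clap-like spikes in one joint's 3-D acceleration.

    Claps appear as short, high-magnitude bursts, so peaks are picked on the
    acceleration norm using a threshold relative to its median level.
    """
    norm = np.linalg.norm(acc_xyz, axis=1)              # magnitude per frame
    peaks, _ = find_peaks(norm,
                          height=thresh_factor * np.median(norm),
                          distance=int(min_gap_s * fs))
    return peaks

# Synthetic example: low-level noise with one clap near start and one near end
rng = np.random.default_rng(0)
acc = rng.normal(0.0, 0.05, size=(2400, 3))
acc[100] += [30.0, 0.0, 0.0]
acc[2300] += [0.0, 30.0, 0.0]
print(clap_frames(acc))  # clap frames 100 and 2300
```

The same detector can then be run per hand and the resulting clap times used as alignment anchors for the EMG, motion-capture, and video streams.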
Figure 10. The eight streams of EMG data of the Myo Armband, left and right hand, calibrated and converted to force values for detecting peak moments and segmenting.
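The processing chain behind Figure 10 — collapsing the eight Myo channels into a single force-like envelope and segmenting at its peaks — can be approximated as follows. The envelope method (moving RMS), window length, sampling rate, and threshold are assumptions for illustration on synthetic data, not the calibration actually used in the study.

```python
import numpy as np
from scipy.signal import find_peaks

def emg_envelope(emg, win=40):
    """Moving-RMS envelope of one raw EMG channel (square, average, root)."""
    kernel = np.ones(win) / win
    return np.sqrt(np.convolve(np.square(emg.astype(float)), kernel, mode="same"))

def segment_peaks(emg_8ch, fs=200.0, min_gap_s=0.5):
    """Sum the per-channel envelopes into one force-like curve and find peaks."""
    env = sum(emg_envelope(ch) for ch in emg_8ch)
    peaks, _ = find_peaks(env,
                          height=3 * np.median(env),   # bursts stand well above rest
                          distance=int(min_gap_s * fs))
    return peaks, env

# Synthetic example: 8 noisy channels with two synchronous activity bursts
rng = np.random.default_rng(1)
emg = rng.normal(0.0, 1.0, size=(8, 1000))
for start in (300, 700):
    emg[:, start:start + 20] += 10 * np.sign(rng.normal(size=(8, 20)))
peaks, env = segment_peaks(emg)
print(peaks)  # one peak per burst
```

The detected peaks then mark candidate plucking moments around which gesture segments can be cut.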
Table 1. Best LM for inferring maximum force values of isolated oud-plucking gestures by Yasamin Shahhosseini. Analysis made on sessions 1.1a, 1.1b, 1.2, 1.4, 1.6, and 1.7.
LM for Inferring Maximum Smoothed Force Values of Isolated Plucking Gestures
LM Feature                                  Coefficient
[f.1] std_sp_brightness_log_filt_onset      −0.6377 **
[f.2] min_sp_brightness_log_filt_onset      −1.0331 ***
[f.3] max_sp_brightness_log_filt_onset       0.5952 *
[f.4] mean_right_dist_z_fromrest_onset      −0.4575 ***
[f.5] mean_v_right_smooth_z_onset            0.3790 *
[f.6] min_a_right_norm_smooth                0.2430 . ¹
R²adj. (p < 0.001)                           0.56

¹ Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1.
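The model in Table 1 is an ordinary least-squares fit of maximum smoothed force on six acoustic and movement predictors, summarised by adjusted R². A minimal sketch of that structure follows, on synthetic stand-in data: the predictor values and noise level are invented, and only the true coefficient magnitudes echo Table 1.

```python
import numpy as np

def fit_lm(X, y):
    """OLS with intercept; returns coefficients and adjusted R^2."""
    n, p = X.shape
    Xd = np.column_stack([np.ones(n), X])           # design matrix with intercept
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    r2 = 1 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
    r2_adj = 1 - (1 - r2) * (n - 1) / (n - p - 1)   # penalise for p predictors
    return beta, r2_adj

# Synthetic stand-in: 6 standardised predictors, coefficients echoing Table 1
rng = np.random.default_rng(2)
X = rng.normal(size=(120, 6))
true_coefs = np.array([-0.64, -1.03, 0.60, -0.46, 0.38, 0.24])
y = X @ true_coefs + rng.normal(0.0, 1.0, size=120)
beta, r2_adj = fit_lm(X, y)
print(beta[1:], r2_adj)  # recovered coefficients and adjusted R^2
```

With standardised predictors, the fitted coefficients are directly comparable in magnitude, which is how the relative weight of the spectral-brightness and movement features in Table 1 can be read.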
Paschalidou, S. Multimodal Embodiment Research of Oral Music Traditions: Electromyography in Oud Performance and Education Research of Persian Art Music. Multimodal Technol. Interact. 2024, 8, 37. https://doi.org/10.3390/mti8050037