CHI EA '24: Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems
DOI: 10.1145/3613905.3651101
Work in Progress

MusicTraces: A collaborative music and paint activity for autistic people

Published: 11 May 2024

Abstract

Painting and music therapy approaches can help to foster social interaction for autistic people. However, the tools sometimes lack flexibility and fail to keep people's attention, and unknowns remain about the effect of combining these approaches. Very few studies have investigated how Multisensory Environments (MSEs) could help to address these issues. This paper presents the design of a full-body music and painting activity called "MusicTraces", which aims to foster collaboration between people with moderate to severe learning disabilities and complex needs, and in particular autism, within an MSE. The co-design process with caregivers and people with neurodevelopmental conditions is detailed, including a workshop, the initial design, remote iterations, and a design critique.

Figure 1: Main design features of the activity MusicTraces. (a.) Front User Interface (UI). 1: Temporary line. 2: Permanent line (with a sound). 3: Permanent line (without sound). 4: Permanent spot (i.e., sound). 5: Animated dot (i.e., cursor). 6: Hand of the red player. (b.) Floor UI. 7: Player position. 8: Music blob (added after design phase 3). 9: Smart brushes or erasers detected as held by the users. 10: Floor area to trigger the background music. 11: Floor spots to control the music evolution. (c.) Two users playing together in the room. 12: Line crossing blending their colors. 13: Smart brush.


1 INTRODUCTION

Autism is a Neurodevelopmental Condition (NDC) which involves social communication and interaction difficulties and sensory issues [1]. Autistic people¹ can display mild learning disabilities and low support needs (e.g., difficulty initiating conversation), or severe learning disabilities and high support needs (e.g., minimal language), sometimes associated with intellectual disability (ID). The latter are underrepresented in current research [35]. Creative practices can help to foster social interaction and self-esteem for people of all abilities. Improvised painting and music practices are common and can benefit people with severe learning disabilities [28]. Yet, the tools sometimes lack the flexibility to adapt to people's sensory issues, and unknowns remain about the potential of combining several art practices [8].

Digital tools can help to tackle these issues, as they are flexible, predictable, and often appealing to autistic people [26, 29]. More specifically, Multisensory Environments (MSEs) are promising for promoting well-being and social interaction [11, 16, 20, 31, 34]. They consist of full-body interactive spaces with multisensory stimuli (e.g., visuals, audio, tactile).

This paper presents the design of a full-body music and painting activity called "MusicTraces", which aims to foster collaboration within an MSE between young adults with moderate to severe learning disabilities and complex needs, and in particular autism. The MSE used, called the Magic Room, consists of a room augmented with floor and front visual projections, speakers, smart objects and lights, and a Microsoft Kinect v2. The technical setup of the Magic Room and past applications conducted in this space are described in previous papers [20, 21, 22]. Below, we report on our co-design process with caregivers and people with neurodevelopmental conditions, which includes a workshop, the initial design, remote iterations, and a design critique.


2 RELATED WORK

Several digitally-augmented multisensory settings with full-body interactive capabilities (e.g., smart objects, lights) have been created with clinical teams, with positive outcomes regarding relational aspects for autistic children [13, 16, 20, 30, 31, 34]. Some projects, like Mediate [31], Lands of Fog [16, 30], or RHYME [13], promote multiplayer free exploration of the space. Others, like the Magic Room [20] or SensoryPaint [34], are task-oriented and focus on educational goals.

Creative approaches within MSEs have previously been designed for people with NDCs [13, 34]. To our knowledge, SensoryPaint is the only project combining music and paint. While it focuses more on sonifying the painting experience, it shows promise for promoting social interaction [34]. RHYME consists of various sub-projects including musical smart objects (e.g., puppets) affording various interactions (e.g., through a microphone), with benefits for social aspects and well-being [13].

Other multisensory projects have been designed for autistic children outside of MSEs, with similar objectives [15, 33]. BendableSound allows children to play music by touching an elastic display, with benefits for attention and motor development [15]. With OSMoSIS, children can play music with their caregiver using their bodies, to support interactional synchrony [33].

Most of the above-mentioned projects consider music as a way to promote relational aspects within health settings, also called health musicking [37, 39]. This concept entails multiple ways of experiencing, rather than a right or wrong way of playing, close to the concept of open work [17, 38]. In this context, the system plays an active role in supporting exploration and the co-creative experience (e.g., with hints) [37]. In particular, the interactive music composition must adapt and evolve based on user actions, as in the Reactable project [24, 43] or RHYME [12].

Finally, to cater to stakeholders' needs, such projects must be co-designed with them [32]. This process entails common design principles, e.g., supporting understanding by structuring the information or by using video modelling techniques [6].


3 PHASE 1 - CO-DESIGN WORKSHOP

The design process of MusicTraces started with a one-hour workshop organized with five caregivers and three people with neurodevelopmental conditions (NDCs) and Intellectual Disability (ID). It aimed to brainstorm ideas about how best to combine music and painting activities within the MSE called the Magic Room [20, 21, 22], to foster collaboration between people with moderate to severe NDCs, such as autism. It also aimed to validate the soundness of our objective. Three investigators were present: an animator who conducted the session, a secretary who sorted the emerging ideas, and an observer who took notes. After the workshop, the emerging ideas were discussed with an external psychologist with experience in autism.

3.1 Participants

Eight participants were recruited (5 males, 3 females): seven from a music association for people with disabilities and one from our network. They included the head of the association (H), two educators (Ed1 and Ed2), two psychologists (P1, P2), and three people with NDC and ID (PwD1, PwD2, PwD3). PwD1 (23-year-old male) has Adams-Oliver syndrome, which induces ID and a lack of autonomy. PwD2 (44-year-old male) and PwD3 (32-year-old female) both have neurodevelopmental conditions inducing ID as well as social and motor issues. The external psychologist (M), who later reviewed the emerging themes, has been working for more than ten years with autistic people at the clinical institute Fraternità e Amicizia (FeA). Participants took part voluntarily, without payment, after signing consent forms.

3.2 Protocol

After being introduced to the investigators, the participants tried some non-creative activities within the Magic Room (e.g., storytelling) for twenty minutes. They then moved to a room cleared of distracting stimuli (e.g., noise), where paint material (i.e., brushes) had been added to foster idea generation. The animator outlined the workshop's rationale, organization, and rules: (1) generate as many ideas as you can, (2) do not judge ideas, and (3) feel free to express unusual ideas. Participants used "concept sheets" to write or draw their ideas, accommodating diverse abilities. When possible, for each idea, they noted its best and worst aspects to obtain more details and promote creativity.

The brainstorming session involved two 30-minute rounds, first without and then with cards. Participants always started by noting ideas individually, so as not to be influenced by others. They then shared their idea(s) with the group, while also being able to suggest new ones. The secretary noted the ideas on a Miro board² projected on the wall and sorted them based on the discussions, inspired by techniques using sticky notes [27]. During the first round, the participants could take inspiration from manipulating the paint material. In the second, they could use 20 cards that we designed, inspired by previous studies [7, 36]. Each card bore one design principle and one example of how to apply it. The ten green cards were design heuristics picked from a preexisting set [25]; they stated general principles such as "Allow user to rearrange". The ten blue cards were design principles related to the activity, inspired by previous studies [36]; for instance, they included "Think about other music possibilities". Examples of cards are shown in Figure 2.

The workshop ended with concluding thoughts. The participants were thanked, asked whether they would agree to participate in future testing, and given gift bags. They could then share a snack with the lab members and try other applications.

Figure 2: Examples of the material used during the co-design workshop. (a.) Example of an activity-related card. (b.) Example of a design heuristic card. (c.) Example of answers on a "concept sheet" from a participant with a neurodevelopmental condition (PwD1).

Sessions were filmed. Two authors (the secretary and the observer) analyzed the data using thematic analysis [10], with deductive techniques for themes already existing in the literature (e.g., understanding) and inductive techniques to create new ones. They then discussed and refined the themes with a third author (the animator) and (M). The themes' identifiers and the number of participants mentioning them are noted Tx and y/8 (where x and y are numbers).

3.3 Findings

Seven themes were built: Full-body multisensory interaction (T1, 5/8), Music visualization (T2, 3/8), Collaboration (T3, 2/8), Support (T4, 2/8), Gamification (T5, 3/8), Understanding (T6, 3/8), and Expressivity (T7, 4/8).

Full-body multisensory interaction is expressed by four caregivers and PwD2, to better include people with motor issues. It consists of drawing with the feet (Ed1), or using free (P1) or specific (P2) full-body gestures. Multisensory stimuli are advised (H, P1, P2, PwD2): using tangibles as controllers (e.g., H: a "stick glowing in the dark"), a microphone (P2), and scents (P1). (M) agreed with this theme, but was against using a specific gesture, so as not to confuse people with ID.

Music visualization is suggested by two caregivers, PwD1, and PwD2. It consists of using a clear visual "grammar" for the music that would be appealing and meaningful. For instance, PwD1 suggested using pentagrams, and P2 emphasized connecting every musical element to a visual (and vice versa). (M) agreed with this theme.

Collaboration is expressed by two caregivers (H, P2), either with turn-taking (H) or simultaneous play (H, P2). Scenarios could be task-oriented, with users competing against each other (e.g., to learn gesture combinations) (P2, H), or open-ended (e.g., drawing how they feel) (H). However, social anxiety could hinder collaboration (H). (M) agreed with this theme.

Support is expressed by two caregivers (H, P2). Given autistic people's difficulties with abstract thinking, the goal is to prompt creativity by having users see or hear music or visuals before or during the experience (e.g., painting over a background picture or song). If users get stuck, the caregiver or the system should support them. (M) agreed with this theme.

Gamification is evoked by two caregivers and PwD2 to promote engagement. It first consists of task-oriented use cases (Ed2, P2), such as discovering a musical drawing (P2) or going on a virtual trip (P2). It also concerns the use of competition (P1, PwD2), with levels and challenges (PwD2). (M) agreed with these possibilities.

Promoting understanding is expressed by two caregivers and PwD2, using smart objects to accommodate people's abilities (H), or clear rules to lead the music evolution (e.g., based on movements, Ed1). (M) agreed with this theme. He also suggested mapping some repetitive movements (e.g., hand flapping) to specific sounds, to give them meaning.

Expressivity is mentioned by four caregivers. The goal is to afford symbolization processes, by enabling users to use various movements (P2, Ed1) or painting parameters (e.g., brush color) based on what they want to convey (P1, H).

Finally, both before and after the workshop, the participants stressed the need to individualize the design based on tastes (e.g., music) and objectives (e.g., relaxation or communication). (M) advised individualizing the level of multisensory stimuli.


4 PHASE 2: INITIAL DESIGN

MusicTraces is a music and painting activity within an MSE called the Magic Room that aims to promote collaboration, inspired by improvised art therapy practices and "health musicking" [37, 40]. As such, it accommodates multiple ways of being and acting, rather than being task-oriented. The system is considered a co-creator which supports collaboration with hints [37]. The design is influenced by the themes from our workshop (noted [Tx], where x is a number) and by previous studies.

4.1 Environment of the activity

This two-user activity is inspired by the contemporary music practice called "soundpainting" [42]. It includes two spaces: the interaction space and the outside space. In the interaction space, delimited with foam carpets [T6], participants can create musicographic objects on a paper-like front user interface (UI) based on their hand position (see Figure 3.c) [T1]. Smart objects (i.e., a brush and an eraser), whose design was inspired by existing accessible controllers [18], are used to draw [T1]. The floor is a control space used to activate the background music [T1, T4]. This design aims to balance stimulation while fostering the engagement of people with severe motor issues (e.g., people who do not move their arms).

The experience relies on multisensory stimuli: audio (from the speakers), visual (lights and projections), tactile (the smart objects), and proprioceptive (when drawing and moving) [T1]. Lights turn red when users are outside the interaction space and green when they enter it, to prompt agency. All sounds correspond to soothing musical instruments chosen based on the literature (i.e., marimba and handpan), so as not to induce over-arousal [4, 14]. All stimuli are kept simple to avoid cognitive overload. Information is structured according to its role, to promote understanding [T6].

The activity is intended for two users, accompanied by their caregiver, who provides prompts (verbal, physical) and monitors the activity with a tablet (see Section 4.5), to promote understanding and collaboration [T3, T1]. Some features aim to foster collaboration, e.g., the users have different colors and instruments, and the colors of their lines blend when they cross [T3].

4.2 The MusicTraces Syntax

Each visual has a musical counterpart and vice versa, inspired by [T2] and the Reactable project [24]. The experience relies on three layers [2]: sound nodes, narrative structures, and composition rules [T6]. Nodes are short musical patterns (e.g., notes), narrative structures are combinations of nodes (e.g., melodies), and rules are ways of creating the narratives.
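
To make these layers concrete, below is a minimal, hypothetical C# sketch of such a data model. The type names and fields are illustrative assumptions, not the actual MusicTraces implementation.

```csharp
// Hypothetical data model for the three layers described above; all names are illustrative.
using System.Collections.Generic;
using System.Numerics;

// A sound node: a short musical pattern anchored at a canvas position, visualised as a paint spot.
public record SoundNode(Vector2 Position, int PitchDegree);

// A narrative structure: an ordered combination of nodes, visualised as an open or
// closed line and played back as a melody.
public class Melody
{
    public List<SoundNode> Nodes { get; } = new();
    public bool IsClosed { get; set; } // closed lines loop until the cursor fades out
}

// A composition rule transforms the current scene in response to user actions,
// e.g. attaching a nearby paint spot to a line (proximity rule).
public interface ICompositionRule
{
    void Apply(List<SoundNode> nodes, List<Melody> melodies);
}
```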

Three design metaphors are used [T2, T5, T6]. Short musical patterns are represented by paint spots. Melodies are displayed as open or closed lines; they can be played back (i.e., as a timeline) when hit by the players, in which case a cursor, represented by an animated dot, travels along them at a fixed speed. This metaphor stems from the Iannix [23] and UPIC [41] projects. Open lines are heard only once when hit, whereas closed lines loop several times before the cursor fades out.
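
As an illustration of the cursor metaphor, the sketch below (plain C#, not the actual Unity implementation) advances a cursor along the drawn polyline at a fixed speed, starting from the point where the player hit it; the handling of closed lines is omitted, and all names are assumptions.

```csharp
// Minimal sketch of a cursor travelling along a melody line at a fixed speed.
using System.Collections.Generic;
using System.Numerics;

public class LineCursor
{
    private readonly List<Vector2> points = new(); // remaining polyline to traverse
    private readonly float speed;                  // units per second, fixed
    private float travelled;

    public LineCursor(IReadOnlyList<Vector2> line, int startIndex, float speed)
    {
        // Traverse from the interaction point to the end of the line
        // (a closed line would additionally wrap around; omitted for brevity).
        for (int i = startIndex; i < line.Count; i++) points.Add(line[i]);
        this.speed = speed;
    }

    // Advance by deltaTime seconds; returns the cursor position, or null when the line is finished.
    public Vector2? Step(float deltaTime)
    {
        travelled += speed * deltaTime;
        float remaining = travelled;
        for (int i = 0; i < points.Count - 1; i++)
        {
            float segment = Vector2.Distance(points[i], points[i + 1]);
            if (remaining <= segment)
                return Vector2.Lerp(points[i], points[i + 1], segment > 0 ? remaining / segment : 0f);
            remaining -= segment;
        }
        return null; // end of an open line reached; the cursor then fades out
    }
}
```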

Three composition rules are used [T6]. Proximity rules combine musicographic elements based on their spatial proximity, as in [24]; e.g., paint spots close to a line are added to its melody. Harmonization rules guide the interactive music composition based on the users' actions: for instance, users step together on floor circles, which appear after some actions have been performed [T3], to change the current chord, computed using Markov chains trained on 37 popular songs³. Rendering rules map a node's position on the vertical and horizontal axes to its pitch (relative to the chord) and its positioning.
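
The harmonization rule could rely on a first-order Markov chain over chord symbols, roughly as sketched below; the chord representation, training interface, and sampling strategy are assumptions made for illustration, not the model actually used in MusicTraces.

```csharp
// Illustrative first-order Markov chain over chord symbols.
using System;
using System.Collections.Generic;
using System.Linq;

public class ChordMarkovChain
{
    // Transition counts learned from chord sequences: current chord -> (next chord -> count).
    private readonly Dictionary<string, Dictionary<string, int>> transitions = new();
    private readonly Random rng = new();

    public void Train(IEnumerable<IList<string>> chordSequences)
    {
        foreach (var seq in chordSequences)
            for (int i = 0; i + 1 < seq.Count; i++)
            {
                if (!transitions.TryGetValue(seq[i], out var next))
                    transitions[seq[i]] = next = new Dictionary<string, int>();
                next[seq[i + 1]] = next.GetValueOrDefault(seq[i + 1]) + 1;
            }
    }

    // Called when both users step on the floor circles: sample the next chord
    // proportionally to how often it followed the current one in the corpus.
    public string NextChord(string current)
    {
        if (!transitions.TryGetValue(current, out var next) || next.Count == 0)
            return current; // unseen chord: keep the current harmony
        int pick = rng.Next(next.Values.Sum());
        foreach (var (chord, count) in next)
        {
            if (pick < count) return chord;
            pick -= count;
        }
        return current; // fallback, not normally reached
    }
}
```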

4.3 Interactions

MusicTraces allows users to interact in three ways: Explore, Create, and Play [T1, T7]. Explore consists in exploring the music space by moving the brush and pressing its button; this creates temporary transparent lines that quickly vanish and triggers sounds based on the brush position. Create is about leaving permanent paint spots or lines (with the brush) or erasing them (with the eraser). Creating a spot requires staying in the same position for more than one second; creating a line requires drawing a spot while drawing a temporary line (so that every melody contains at least one sound). Users can erase objects by hovering over them with the eraser. Play consists of touching the spots or lines to play the related sounds; when a line is played, a cursor travels along it starting from the interaction point.
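
As an example of the Create interaction, the one-second dwell rule for placing a permanent paint spot could be implemented along the following lines; this is a hedged sketch in which the movement tolerance and the per-frame update loop are assumptions.

```csharp
// Sketch of dwell-based spot creation: stay (roughly) still for more than one second
// while pressing the brush button to leave a permanent spot.
using System.Numerics;

public class SpotCreator
{
    private const float DwellSeconds = 1.0f;   // threshold stated in the activity description
    private const float MoveTolerance = 0.05f; // assumed drift allowed while still counting as "same position"

    private Vector2 anchor;
    private float dwellTime;

    // Call once per frame with the tracked hand position; returns true when a spot should be created.
    public bool Update(Vector2 handPosition, bool brushPressed, float deltaTime)
    {
        if (!brushPressed || Vector2.Distance(handPosition, anchor) > MoveTolerance)
        {
            anchor = handPosition; // moved or released: restart the dwell timer here
            dwellTime = 0f;
            return false;
        }
        dwellTime += deltaTime;
        if (dwellTime < DwellSeconds) return false;
        dwellTime = 0f; // reset so holding still longer does not spam spots
        return true;
    }
}
```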

4.4 Hint system

A hint system suggests interactions when it detects idleness, isolation, or repetitive movements, inspired by the Mediate project [31], [T3], and [T4]. For idleness, three hint levels are used: if a user performs no action for more than 20 seconds, the brush vibrates slightly to catch their attention; if they remain idle, it lights up; finally, a hint line appears on the front UI, starting from their hand and drawn in their color. For isolation, if a user continuously draws in the same area, a notification is sent to the tablet, and the caregiver can then choose to display a hint line to redirect their attention elsewhere. For repetitive movements, if a significant similarity (e.g., in shape or length) is detected between the last line drawn and the previous ones, a notification is sent to the caregiver, who can decide whether to display a hint line.
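
A sketch of the three-level idleness escalation is given below. The 20-second threshold comes from the description above, while the per-level timing and the hint-level enumeration are illustrative assumptions.

```csharp
// Sketch of the idleness part of the hint system: after 20 s without actions the brush
// vibrates, then lights up, then a hint line is shown on the front UI.
public enum IdleHint { None = 0, Vibrate = 1, Light = 2, HintLine = 3 }

public class IdlenessMonitor
{
    private const float IdleThreshold = 20f; // seconds without any action
    private float idleTime;
    private int level; // 0 (no hint) .. 3 (hint line shown)

    // Call whenever the user draws, erases, or plays something.
    public void ReportAction()
    {
        idleTime = 0f;
        level = 0;
    }

    // Call once per frame; returns the hint to trigger, if any.
    public IdleHint Update(float deltaTime)
    {
        idleTime += deltaTime;
        if (idleTime < IdleThreshold || level >= 3) return IdleHint.None;
        idleTime = 0f; // assumed: restart the timer between escalation levels
        level++;
        return (IdleHint)level;
    }
}
```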

4.5 Tablet interface

The caregivers use the tablet UI to monitor the activity or stop it if needed [T4, T6]. The left panel gives control over the game mechanics, e.g., pausing the game or removing lines. The middle panel mirrors the four main areas of the front screen, signaling overuse in red or indicating where to trigger hints in green. Four hint shapes are available (e.g., a house shape). The right panel is a notification system, alerting about repetitive behaviors or overused areas. The UI was created during the initial design and refined after phase 4. It is displayed in Figure 3.

Figure 3: Design of the tablet user interface (UI), smart brush, and smart eraser. Elements added after design phase 4 are marked with (D4). (a.) Tablet UI. The left panel contains buttons to stop the game, control the tutorial steps, remove lines and paint spots, activate the background music, activate the music evolution (D4), play all melodies, activate the blobs, and swap players or hands (security). The center panel manages the hints. The right panel displays the notifications; it can be hidden by clicking on the column to its left (D4). (b.) Smart brush used to draw when pushing the rainbow button. (c.) Smart eraser used to erase when held.

4.6 Apparatus

The activity is developed with the Unity engine (version 2021.2.14f1). A custom package gives access to the devices of the Magic Room, by communicating with ad-hoc servers, and to the tablet interface. The web service is developed with Nuxt.js, and the tablet UI is built with Vue.js. The activity runs on a Windows PC. The hardware components include two projectors, a Microsoft Kinect v2, smart lights, a tablet, and two audio speakers. The entire setup is described in detail in previous papers [20, 21, 22].
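
To give a sense of how the Unity activity might push notifications (e.g., repetitive lines detected) to the tablet web service, here is a purely illustrative sketch; the endpoint, payload format, and use of HTTP are assumptions, as the actual Magic Room package and its ad-hoc servers are not documented here.

```csharp
// Hypothetical notifier posting a JSON payload to the tablet service; not the real API.
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class TabletNotifier
{
    private static readonly HttpClient client = new();

    public static async Task NotifyAsync(string kind, string playerId)
    {
        // Assumed endpoint exposed by the Nuxt.js service driving the Vue.js tablet UI.
        var payload = $"{{\"kind\":\"{kind}\",\"player\":\"{playerId}\"}}";
        using var content = new StringContent(payload, Encoding.UTF8, "application/json");
        var response = await client.PostAsync("http://magic-room.local/api/notifications", content);
        response.EnsureSuccessStatusCode(); // surface transport errors during development
    }
}
```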

The four smart objects were built using parts from commercial objects or 3D-printed parts, magnets, an ESP8266 Witty Cloud ESP-12F WiFi module, external batteries, vibration motors, and LED strips. All sounds were created using virtual instruments in the Logic Pro X digital audio workstation. The paint visuals were made using Unity Shader Graph, and the textures with Inkscape.


5 PHASE 3 - REMOTE AGILE PROCESS

To be able to conduct future testing at the clinical institute Fraternità e Amicizia, we continued our design process with a psychologist working there, who had participated in the first phase (M). The goal was to adapt our activity to this context.

5.1 Method

As no similar projects existed to our knowledge, an agile way of working was adopted with (M), through small iterative design cycles [5]. Email exchanges occurred every two weeks, for a total of three iterations. Each time, feedback was requested on new changes or ideas stemming from the findings of our workshop and from the related work. To clarify the changes, videos segmented by feature were sent for the first two iterations. The textual feedback was analyzed using a deductive qualitative analysis process, consisting of analyzing the data according to the activity features [19]. Throughout the three iterations, some features were removed, validated, modified, or added; they are reported below.

5.2 Findings

Two ideas were abandoned. First, using additional body movements to create sounds and visuals was removed, so as not to overstimulate the users. Second, triggering specific audiovisual effects when mimicking the other player was also removed, to avoid confusing the users, since many features were already implemented.

Three elements were validated. During the first iteration, the hint system was deemed very useful to support the users. Then, adapting the line thickness to the user's distance from the screen was confirmed, to make the activity look more appealing and realistic. The smart objects also received approval and made the caregiver enthusiastic about the project.

The hint system was modified to avoid visual clutter. Indeed, (M) suggested that, after a user draws their first line, hint lines should no longer be added automatically, but rather added manually by the caregiver from the tablet.

Three elements were added to promote agency: closed shapes that are automatically filled with color, blobs moving on the floor that make percussive sounds when touched, and lines without nodes. The blobs allow users who are unable to draw with their hands to make sounds by moving in the space. The lines without musical notes are intended for users who only want to create visuals.


6 PHASE 4 - DESIGN CRITIQUE

To further adapt our design to the clinical context of Fraternità e Amicizia (FeA), and to validate its acceptability among a clinical team, a design critique was conducted with four caregivers working there: three educators who had not participated in the initial design process, and the psychologist involved in the third phase (M).

6.1 Method

The three educators (Ed3, Ed4, Ed5) have been conducting weekly creative activities at FeA for 4 years, more than 10 years, and 5 years, respectively. Ed3 (male) and Ed4 (male) run group-based manual activities (e.g., painting, sculpting). Ed5 (female) conducts group-based painting activities with background music and has training in art therapy. Ed3 specifically works with people with severe disabilities. (M)'s profile was presented above.

With the educators, individual 45-minute sessions were organized at FeA's facility, for organizational reasons. The activity features were presented using a video, together with visuals of the smart object designs and of the tablet UI (since it was not yet fully developed). Sessions started with an outline of the activity, followed by general questions based on the video steps and specific questions about the hint system and tablet interface. One week later, (M) tested the activity in our lab: after an outline of the activity, he tested it while making comments and was asked about his colleagues' ideas. Interviews were audio-recorded with the caregivers' agreement. The data were then sorted into themes using thematic analysis [10].

6.2 Findings

All three educators were positive about using MusicTraces at their facility and about participating in the testing. Ed5 said that it closely aligns with her practice and could benefit her activities. Ed3 emphasized the unique combination of music and drawing, noting its potential to catch users' attention and to move them away from repetitive behaviors.

Regarding the UI, all caregivers agreed with the aesthetics of the front screen. Two suggested changing the background color (Ed3 and Ed5), as in Ed5's painting activities. All of them confirmed that it should be possible to change the brush color during the game. All validated the floor UI, regarding the floor blobs and the music evolution. To adapt to the users, (M) suggested making the latter either "interactable" (as planned), "automatic" (without the floor circles), or disabled.

The interactions seemed easy to understand (all), despite some imprecision in the body tracking (M), which we fixed through coding and with the smart objects. The latter seemed to ease the gameplay (all). (M) noted that the colors of the lights were too close to the brush colors and prevented users from clearly seeing the screen. Thus, we used a different color for the brushes (yellow), and made the lights brighter outside of the interaction space and darker within it.

All caregivers validated the use of hints, especially for isolation and repetitiveness. Three aspects were changed. First, hint lines became dashed, to distinguish them from the players' lines. Second, wavy lines were added to guide users' attention toward specific areas, following Ed3's comment. Finally, while all educators appreciated the use of hints on the smart objects (vibrations, lights), vibration became an "on/off" feature, as it could confuse or startle some users (M).

Regarding the tutorial, all caregivers agreed on the different steps, the use of video modelling techniques, and the addition of audio-recorded instructions. Ed3 advised including all steps, contrary to Ed4, who said that "users cannot keep in mind more than three things." Thus, we decided to use the tablet to choose which steps to include. (M) agreed with this change. Finally, Ed3 suggested adding dashed lines that users could follow with their hands.

All caregivers were positive about the tablet features. The notifications were considered helpful (Ed5). However, the UI contained too much information (M), leading us to move some buttons to the left and make it possible to hide the notification panel; indeed, when playing with a user, the caregiver would not need to see this panel (M).

One additional insight was suggested by Ed3, with which (M) agreed. Since "some individuals may stare at the floor," he suggested activating only one screen at first and then complementing it with the second screen (e.g., fading it in). However, this idea was kept for future work, so as not to include too many features and potentially confuse the users.


7 CONCLUSION AND FUTURE WORKS

This paper has introduced the design of "MusicTraces": an open-ended music and paint activity within an MSE, inspired by improvised art therapy practices and intended to foster collaboration between young adults with moderate to severe neurodevelopmental conditions, and more particularly autism. The co-design process was conducted with caregivers and people with NDC, through a design workshop, the initial design, a remote design process, and a design critique. It also took inspiration from existing studies [15, 24].

To our knowledge, MusicTraces makes three main contributions. First, it is the only project that combines music and paint features equally; similar projects leaned more toward paint [34] or music [13]. Second, it is the only full-body music activity designed for two individuals with NDC, rather than for a child and their caregiver, as in OSMoSIS [33]. Third, it is one of the few creative activities intended for people with severe conditions, like Mediate [31]. Yet, the project is currently bound to the Magic Room, which raises portability issues. Future plans therefore involve porting it to Virtual or Augmented Reality headsets. Including other features is also considered to support agency, e.g., microphone control as in RHYME [3, 13].

The next research steps include acceptability and usability testing with people with disabilities, followed by an empirical study with around ten people with moderate to severe NDC at Fraternità e Amicizia. A within-group experimental design will be used, in which our activity will be compared with a group-based painting activity using background music.


ACKNOWLEDGMENTS

The authors would like to thank all the participants who took part in this research, and in particular the psychologist M. Mores. Without them, designing this application would not have been possible. The authors would also like to thank the other people who contributed to the development of this project: N. Sasannia, H.D. Foley, E. Özgünay, Z.Ş. Timar, S.S. Vega, and D. Collado.

This research was carried out within MUSA – Multilayered Urban Sustainability Action – project, funded by the European Union – NextGenerationEU, under the National Recovery and Resilience Plan (NRRP) Mission 4 Component 2 Investment Line 1.5: Strengthening of research structures and creation of R&D “innovation ecosystems”, set up of “territorial leaders in R&D”. G. Caslini’s PhD grant is supported by TIM S.p.A., Innovation Department.

Footnotes

  1. This paper adopts autism stakeholders' language preferences [9], e.g., identity-first language (e.g., autistic people) and no offensive terms (e.g., "disorder").

  2. Miro application: https://miro.com/fr/

  3. The dataset can be found at this link: https://www.kaggle.com/datasets/taylorflandro/lyrics-and-chords-from-ultimateguitar?select=pop_lyrics_df.csv.

Supplemental Material

Talk Video: 3613905.3651101-talk-video.mp4 (15.3 MB)

References

  1. American Psychiatric Association. 2013. Diagnostic and Statistical Manual of Mental Disorders (fifth edition). American Psychiatric Association, Arlington, VA. https://doi.org/10.1176/appi.books.9780890425596
  2. Anders-Petter Andersson and Birgitta Cappelen. 2008. Same but Different – Composing for Interactivity. In Proceedings of AudioMostly 2008. ACM, Piteå, Sweden, 80–85.
  3. Anders-Petter Andersson and Birgitta Cappelen. 2014. Vocal and tangible interaction in RHYME. In Music, Health, Technology and Design, Karette Stensæth (Ed.). Number 8 in Series from the Centre for Music and Health. Norwegian Academy of Music, Oslo, 21–38.
  4. Valentin Bauer, Ali Adjorlu, Linnea Bjerregaard Pedersen, Tifanie Bouchara, and Stefania Serafin. 2023. Music Therapy in Virtual Reality for Autistic Children with Severe Learning Disabilities. In 29th ACM Symposium on Virtual Reality Software and Technology. ACM, Christchurch, New Zealand, 1–9. https://doi.org/10.1145/3611659.3615713
  5. Kent Beck, Mike Beedle, Arie van Bennekum, Alistair Cockburn, Ward Cunningham, Martin Fowler, James Grenning, Jim Highsmith, Andrew Hunt, Ron Jeffries, Jon Kern, Brian Marick, Robert C. Martin, Steve Mellor, Ken Schwaber, Jeff Sutherland, and Dave Thomas. 2001. Manifesto for Agile Software Development. http://www.agilemanifesto.org/
  6. Scott Bellini and Jennifer Akullian. 2007. A Meta-Analysis of Video Modeling and Video Self-Modeling Interventions for Children and Adolescents with Autism Spectrum Disorders. Exceptional Children 73, 3 (April 2007), 264–287. https://doi.org/10.1177/001440290707300301
  7. Laura Benton, Asimina Vasalou, Rilla Khaled, Hilary Johnson, and Daniel Gooch. 2014. Diversity for design: a framework for involving neurodiverse children in the technology design process. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, Toronto, Ontario, Canada, 3747–3756. https://doi.org/10.1145/2556288.2557244
  8. Allison Bernier, Karen Ratcliff, Claudia Hilton, Patricia Fingerhut, and Chi-Ying Li. 2022. Art Interventions for Children With Autism Spectrum Disorder: A Scoping Review. The American Journal of Occupational Therapy 76, 5 (Sept. 2022), 7605205030. https://doi.org/10.5014/ajot.2022.049320
  9. Kristen Bottema-Beutel, Steven K. Kapp, Jessica Nina Lester, Noah J. Sasson, and Brittany N. Hand. 2021. Avoiding Ableist Language: Suggestions for Autism Researchers. Autism in Adulthood 3, 1 (March 2021), 18–29. https://doi.org/10.1089/aut.2020.0014
  10. Virginia Braun and Victoria Clarke. 2006. Using thematic analysis in psychology. Qualitative Research in Psychology 3, 2 (Jan. 2006), 77–101. https://doi.org/10.1191/1478088706qp063oa
  11. Scott Andrew Brown, David Silvera-Tawil, Petra Gemeinboeck, and John McGhee. 2016. The case for conversation: a design research framework for participatory feedback from autistic children. In Proceedings of the 28th Australian Conference on Computer-Human Interaction (OzCHI '16). ACM Press, Launceston, Tasmania, Australia, 605–613. https://doi.org/10.5014/ajot.2010.09077
  12. Birgitta Cappelen and Anders-Petter Andersson. 2016. Embodied and Distributed Parallel DJing. In Universal Design 2016: Learning from the Past, Designing for the Future, T. Walsh, D. Swallow, L. Sandoval, A. Lewis, C. Power, H. Petrie, and J. Darzentas (Eds.). Studies in Health Technology and Informatics, Vol. 229. IOS Press BV, Netherlands, 528–539.
  13. Birgitta Cappelen and Anders-Petter Andersson. 2016. Health Improving Multi-Sensorial and Musical Environments. In Proceedings of the Audio Mostly 2016. ACM, Norrköping, Sweden, 178–185. https://doi.org/10.1145/2986416.2986427
  14. Franceli L. Cibrian, Jose Mercado, Lizbeth Escobedo, and Monica Tentori. 2018. A Step towards Identifying the Sound Preferences of Children with Autism. In Proceedings of the 12th EAI International Conference on Pervasive Computing Technologies for Healthcare. ACM, New York, NY, USA, 158–167. https://doi.org/10.1145/3240925.3240958
  15. Franceli L. Cibrian, Oscar Peña, Deysi Ortega, and Monica Tentori. 2017. BendableSound: An elastic multisensory surface using touch-based interactions to assist children with severe autism during music therapy. International Journal of Human-Computer Studies 107 (Nov. 2017), 22–37. https://doi.org/10.1016/j.ijhcs.2017.05.003
  16. Ciera Crowell, Batuhan Sayis, Juan Pedro Benitez, and Narcis Pares. 2020. Mixed Reality, Full-Body Interactive Experience to Encourage Social Initiation for Autism: Comparison with a Control Nondigital Intervention. Cyberpsychology, Behavior, and Social Networking 23, 1 (Jan. 2020), 5–9.
  17. Umberto Eco. 1989. The Open Work. Hutchinson Radius, London.
  18. Katie Ellis and Kai-Ti Kao. 2019. Who Gets to Play? Disability, Open Literacy, Gaming. Cultural Science Journal 11, 1 (Dec. 2019), 111–125. https://doi.org/10.5334/csci.128
  19. Satu Elo and Helvi Kyngäs. 2008. The qualitative content analysis process. Journal of Advanced Nursing 62, 1 (April 2008), 107–115. https://doi.org/10.1111/j.1365-2648.2007.04569.x
  20. Franca Garzotto, Eleonora Beccaluva, Mattia Gianotti, and Fabiano Riccardi. 2020. Interactive Multisensory Environments for Primary School Children. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. ACM, Honolulu, HI, USA, 1–12. https://doi.org/10.1145/3313831.3376343
  21. Franca Garzotto and Mirko Gelsomini. 2018. Magic Room: A Smart Space for Children with Neurodevelopmental Disorder. IEEE Pervasive Computing 17, 1 (Jan. 2018), 38–48. https://doi.org/10.1109/MPRV.2018.011591060
  22. Mirko Gelsomini, Giulia Cosentino, Micol Spitale, Mattia Gianotti, Davide Fisicaro, Giulia Leonardi, Fabiano Riccardi, Agnese Piselli, Eleonora Beccaluva, Barbara Bonadies, Lucia Di Terlizzi, Martino Zinzone, Shanti Alberti, Christelle Rebourg, Marina Carulli, Franca Garzotto, Venanzio Arquilla, Mario Bisson, Barbara Del Curto, and Monica Bordegoni. 2019. Magika, a Multisensory Environment for Play, Education and Inclusion. In Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems. ACM, Glasgow, Scotland, UK, 1–6.
  23. G. Jacquemin, T. Coduys, and M. Ranc. 2012. IANNIX 0.8. In Proceedings of the Journées d'Informatique Musicale (JIM 2012). Mons, France, 10 pages.
  24. Sergi Jordà, Günter Geiger, Marcos Alonso, and Martin Kaltenbrunner. 2007. The reacTable: exploring the synergy between live music performance and tabletop tangible interfaces. In Proceedings of the 1st International Conference on Tangible and Embedded Interaction. ACM, Baton Rouge, Louisiana, 139–146. https://doi.org/10.1145/1226969.1226998
  25. J. Kramer, S.R. Daly, S. Yilmaz, and C.M. Seifert. 2014. A case-study analysis of Design Heuristics in an upper-level cross-disciplinary design course. In Proceedings of the Annual Conference of the American Society for Engineering Education (ASEE). Indianapolis, IN, 16 pages.
  26. Margaret Holmes Laurie, Petra Warreyn, Blanca Villamía Uriarte, Charlotte Boonen, and Sue Fletcher-Watson. 2019. An International Survey of Parental Attitudes to Technology Use by Their Autistic Children at Home. Journal of Autism and Developmental Disorders 49, 4 (April 2019), 1517–1530. https://doi.org/10.1007/s10803-018-3798-0
  27. Wendy E. Mackay. 2020. Chapter 10 - Designing with sticky notes. In Sticky Creativity, Bo T. Christensen, Kim Halskov, and Clemens N. Klokmose (Eds.). Academic Press, Cambridge, MA, USA, 231–256. https://doi.org/10.1016/B978-0-12-816566-9.00010-0
  28. Hanna Mayer-Benarous, Xavier Benarous, François Vonthron, and David Cohen. 2021. Music Therapy for Children With Autistic Spectrum Disorder and/or Other Neurodevelopmental Disorders: A Systematic Review. Frontiers in Psychiatry 12 (April 2021), 643234. https://doi.org/10.3389/fpsyt.2021.643234
  29. Micah O. Mazurek, Christopher R. Engelhardt, and Kelsey E. Clark. 2015. Video games from the perspective of adults with autism spectrum disorder. Computers in Human Behavior 51 (Oct. 2015), 122–130. https://doi.org/10.1016/j.chb.2015.04.062
  30. Joan Mora-Guiard, Ciera Crowell, Narcis Pares, and Pamela Heaton. 2017. Sparking social initiation behaviors in children with Autism through full-body Interaction. International Journal of Child-Computer Interaction 11 (Jan. 2017), 62–71. https://doi.org/10.1016/j.ijcci.2016.10.006
  31. N. Pares, P. Masri, G. van Wolferen, and C. Creed. 2005. Achieving Dialogue with Children with Severe Autism in an Adaptive Multisensory Interaction: The "MEDIATE" Project. IEEE Transactions on Visualization and Computer Graphics 11, 6 (Nov. 2005), 734–743.
  32. Sarah Parsons, Nicola Yuill, Judith Good, and Mark Brosnan. 2020. 'Whose agenda? Who knows best? Whose voice?' Co-creating a technology research roadmap with autism stakeholders. Disability & Society 35, 2 (2020), 201–234.
  33. Grazia Ragone, Kate Howland, and Emeline Brulé. 2022. Evaluating Interactional Synchrony in Full-Body Interaction with Autistic Children. In Interaction Design and Children. ACM, Braga, Portugal, 1–12. https://doi.org/10.1145/3501712.3529729
  34. Kathryn E. Ringland, Rodrigo Zalapa, Megan Neal, Lizbeth Escobedo, Monica Tentori, and Gillian R. Hayes. 2014. SensoryPaint: a multimodal sensory intervention for children with neurodevelopmental disorders. In Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp '14 Adjunct). ACM Press, Seattle, WA, USA, 873–884. https://doi.org/10.1145/2632048.2632065
  35. Ginny Russell, William Mandy, Daisy Elliott, Rhianna White, Tom Pittwood, and Tamsin Ford. 2019. Selection bias on intellectual ability in autism research: a cross-sectional review and meta-analysis. Molecular Autism 10, 1 (Dec. 2019), 9. https://doi.org/10.1186/s13229-019-0260-x
  36. Laura Scheepmaker, Kay Kender, Christopher Frauenberger, and Geraldine Fitzpatrick. 2021. Leaving the Field: Designing a Socio-Material Toolkit for Teachers to Continue to Design Technology with Children. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. ACM, Yokohama, Japan, 1–14. https://doi.org/10.1145/3411764.3445462
  37. Karette Stensæth. 2013. "Musical co-creation"? Exploring health-promoting potentials on the use of musical and interactive tangibles for families with children with disabilities. International Journal of Qualitative Studies on Health and Well-being 8, 1 (Jan. 2013), 20704. https://doi.org/10.3402/qhw.v8i0.20704
  38. Karette Stensæth and Ingelill B. Eide. 2017. Umberto Eco's notions of the 'open work' and the 'field of possibilities': New perspectives on music therapy and co-creation? British Journal of Music Therapy 31, 2 (Nov. 2017), 86–96. https://doi.org/10.1177/1359457516678622
  39. Brynjulf Stige. 2006. On a Notion of Participation in Music Therapy. Nordic Journal of Music Therapy 15, 2 (Jan. 2006), 121–138. https://doi.org/10.1080/08098130609478159
  40. Brynjulf Stige. 2012. Health Musicking: A Perspective on Music and Health as Action and Performance. In Music, Health, and Wellbeing, Raymond MacDonald, Gunter Kreutz, and Laura Mitchell (Eds.). Oxford University Press, Oxford, UK, 184–195. https://doi.org/10.1093/acprof:oso/9780199586974.003.0014
  41. Jean-Baptiste Thiebaut, Patrick G. T. Healey, and Nick Bryan-Kinns. 2008. Drawing Electroacoustic Music. In Proceedings of the 2008 International Computer Music Conference (ICMC). Michigan Publishing, Belfast, Ireland, 8 pages.
  42. Walter Thompson. 2006. Soundpainting: The Art of Live Composition. Workbook I. Walter Thompson, New York, NY. OCLC: 65221853.
  43. Lilia Villafuerte, Sergi Jordà, and Milena S. Markova. 2012. Acquisition of Social Abilities Through Musical Tangible User Interface: Children with Autism Spectrum Condition and the Reactable. In CHI '12 Extended Abstracts on Human Factors in Computing Systems. ACM, Austin, TX, USA, 745–760. https://doi.org/10.1145/2212776.2212847
