
ArmDeformation: Inducing the Sensation of Arm Deformation in Virtual Reality Using Skin-Stretching

Published: 11 May 2024

Abstract

With the development of virtual reality (VR) technology, research is actively exploring how multisensory feedback can create the illusion that virtual avatars are perceived as an extension of the body in VR. In line with this research direction, we introduce ArmDeformation, a wearable device employing skin-stretching to enhance virtual forearm ownership during the arm deformation illusion. We conducted five user studies with 98 participants. Using a developed tabletop device, we confirmed the optimal number of actuators and identified the skin-stretching design that most effectively increases the user's body ownership. Additionally, we explored the maximum visual threshold for forearm bending and the minimum detectable bending direction angle when using skin-stretching in VR. Finally, our study demonstrates that using ArmDeformation in VR applications enhances user realism and enjoyment compared to relying on visual feedback alone.


1 INTRODUCTION


Figure 1: ArmDeformation is a wearable haptic device that utilizes the skin-stretching technique to create the perceptual illusion of arm deformation: elongating, shortening, and bending the forearm within a certain range. Here, we show the device achieving forearm elongation (left) and bending (right).

The advancement of VR technology has facilitated the ability of users to assume the perspective of avatars, experiencing VR from a first-person viewpoint [75, 80]. These avatars are designed to reflect the appearance and movements of the user's physical body in real time [79]. When the visual representation of an avatar corresponds to the user's physical body within a defined range of accuracy, the brain perceives all information related to the body as originating from a singular source referred to as "my body" [45]. As a result of this phenomenon, the brain interprets the virtual avatar as the source of bodily sensations, resulting in the embodiment of the avatar in VR.

While users can generally adapt quickly to avatars in VR when the avatar's shape is similar to their real body, it remains challenging for users to realistically experience avatars that assume shapes impossible in reality, for example, when experiencing the superpower of Rubberman [1] in VR. To overcome this challenge, earlier research attempting to create the body deformation illusion has mainly utilized two key strategies. The first uses visual feedback, whereby the avatar's shape in VR is manipulated [50, 62, 87]. The second uses haptic feedback techniques, such as skin-stretching [93]. Combining these two methods is also common, with researchers leveraging both strategies to amplify user immersion and enrich the overall experience [13, 45]. However, previous body deformation research has focused on manipulating perceived body part length, such as the arm [46, 50, 66, 93], or changing the overall body size [5, 10, 64, 75, 87].

Forearm elongation benefits from skin-stretching [93] to make the surrealistic extension feel realistic, but it maintains the direction of the forearm, in line with the user's intuition. Forearm bending, by contrast, generates the equivalent of a new joint along the forearm that modifies its direction and greatly extends the locations the forearm can reach using smaller physical motion. Forearm bending in VR can thus overcome a natural limitation: humans can only rotate their forearms within a limited range [70]. Besides its use for games and entertainment, arm deformation has the potential to increase the productivity of a VR worker, allowing users to reach distant objects without having to move physically [7, 92]. On the other hand, adding new degrees of freedom requires haptic feedback to increase embodiment [52, 72] and to enable feedback for new controls and arm shape sensing, even outside the user's visual focus.

To tackle these challenges and expand the domain, we introduce ArmDeformation, a novel wearable haptic device designed to create the perceptual illusion of arm deformation using skin-stretching. ArmDeformation uses two actuators to drive dot-shaped silicone pads of 30 mm diameter that generate the skin-stretching. A DC motor rotates the skin-stretching mechanism to a set position with a rotational accuracy of 1°, and two further actuators attach the pads to the skin. Therefore, our device can effectively enhance the comfort and immersion of the user in experiencing the body deformation illusion through skin-stretching.

This paper presents the following contributions: (1) A perception study to determine how many actuators are needed to create the illusion of forearm elongating and shortening; from it, we determined the optimal number of actuators required to induce these illusions. (2) A second perception study to determine the most effective skin-stretching design for creating an illusion of forearm bending. Through these two perception studies, we also confirmed that skin-stretching feedback increases body ownership more effectively than visual feedback alone. (3) A third perception study to determine the maximum comfortable visual bending angle with the skin-stretching design selected in our second perception study. (4) The design of the ArmDeformation prototype and its technical evaluation. (5) A rotational Just Noticeable Difference (JND) study using ArmDeformation that explored the minimum perceptible angle of forearm bending direction. (6) Four possible applications illustrating use cases for our system. Additionally, we conducted a final user study using two of these applications to examine the level of realism and enjoyment experienced by users when interacting with our system.


2 RELATED WORK

Our research centers on the impact of virtual avatar arm deformation on the user’s sense of body ownership in VR. We aim to delve deeper into whether leveraging visual feedback  [13, 87] and haptic feedback through skin-stretching techniques [93] can boost the user’s perceived body ownership over the avatar’s arm deformation.

2.1 Body Ownership

Body ownership is a well-studied topic in the fields of psychology  [16] and neuroscience  [69, 85]. Previous research has shown that the sense of body ownership arises from integrating multisensory information, including visual, haptic, and proprioceptive cues  [27, 45].

In Human–Computer Interaction (HCI) research, body ownership has been studied in VR and Augmented Reality (AR) applications  [13, 30]. One commonly used technique for inducing body ownership illusions is the Rubber Hand Illusion  [15, 78], in which participants see their hidden hand being stroked while simultaneously viewing a rubber hand being stroked. This creates the illusion that the rubber hand is part of their own body. Sungchul Jung et al.  [42] discovered that using customized virtual hands improved participants’ ability to estimate the size of virtual objects and increased participants’ sense of virtual body ownership and spatial presence compared to using generic virtual hands. Irene Senna et al.  [74] gradually replaced the sound of a small hammer striking skin with the sound of a hammer striking a block of marble while continually gently striking participants’ hands with the hammer to create a false perception of the material characteristics of the hand.

Research on haptic feedback systems has also explored ways to enhance the sense of body ownership in VR. Valentin Schwind et al.  [73] conducted a VR user study to investigate how the brain combines visual and haptic signals in response to different hand appearances using a cue-conflict paradigm. According to Andrea Webb Luangrath et al.  [56], the active aspect of product contact, which results in a notion of physical ownership of the virtual hand, influences consumer psychological ownership and product valuation. The "slime hand illusion"  [49] is a method where participants watch an experimenter manipulate slime in a mirror while having their hidden hand handled similarly. This causes a nonproprioceptive ownership distortion and gives the participants the impression that their finger or hand is being stretched or deformed like the visible slime.

In summary, research has shown that the illusion of body ownership can be induced in reality and VR through coordinated sensory feedback. Yet there is little literature concerning the specific illusion of forearm bending and its influence on users' perception of body ownership in VR. In our paper, we align with existing research that explores body ownership during the body deformation illusion, and we also explore how varying degrees of forearm bending impact the sense of body ownership.

2.2 Body Illusion

To enhance users’ immersion with deformed avatars in VR and reality, previous research gives users visual feedback or haptic feedback that allows them to better adapt to the avatar  [13, 45, 46, 87, 93].

Previous perceptual psychology research suggests that when proprioception and vestibular sensation are not aligned, vision tends to dominate them [14, 38]. Therefore, visual feedback can make the user adapt to the avatar. Gulliver's virtual travels [75] examines the effects of body size manipulation on user experience. The results show that reducing the avatar's body size resulted in a noticeable decrease in participants' body image perception, while no significant change was observed when enlarging the body. Mar Gonzalez-Franco et al.'s [30] experiments show that users unconsciously follow their avatars as long as their virtual body does not overlap with their physical body. Beyond Human [51] proved that taking on nonhuman morphologies (including extra body parts and access to their respective superhuman skills) would lead to high levels of gameplay enjoyment. Andrea Stevenson Won et al. [92] demonstrated that people can quickly learn to use a new avatar and succeed in that avatar, even if the avatar model is completely different from the user's own body.

Although the user can adapt to the mismatch between the avatar body and the physical body, providing some haptic feedback can enhance the immersion of the user avatar. Christopher C. Berger et al.  [13] recreated a physical illusion in VR through which the perceived length of a person’s nose and arm was extended. Participants tapped themselves on the tip of the avatar’s nose, as seen from a first-person perspective, and the avatar’s nose slowly grew longer with each tap. Konstantina Kilteni et al.  [46] explored the extent of user ownership over a virtual arm when extended beyond the length of their real arm. The findings indicate that participants felt a sense of ownership over the virtual arm when it was up to three times longer than their real arm, but this feeling weakened when the virtual arm was elongated up to four times the length of their real arm. Majed Samad et al.  [72] reproduced the rubber hand illusion and demonstrated that the experiment could be achieved without the sense of touch, but that synchronized stroking would enhance it.

Earlier research focused solely on the elongation of specific body parts [13, 46, 93] or on the overall enlargement or reduction of the entire body [75, 87]; where forearm bending has been considered, only practical applications were explored [7], and research on body ownership after forearm bending is missing. In contrast, our work explores the effect of bending the forearm on body ownership. And while earlier research favored the psychological study of the body deformation illusion, our work combines this with practical applications, and we present several possible application scenarios.

2.3 Skin-Stretching Feedback

Haptic feedback is now commonly used in both reality and VR, and much research has used it to enhance user immersion [48, 77] or to provide additional guidance information [35, 84]. Common ways to deliver haptic feedback include skin-stretching [12, 22, 36, 59, 65, 89], vibration [11, 25, 43], and force [18, 21, 91]. Our paper mainly uses skin-stretching to create the illusion of arm deformation.

There are two main applications of haptic feedback from skin-stretching: one is to enhance the user's sense of reality [77, 89, 93], and the other is to convey information [17, 19, 20]. An example of skin-stretching is Gum-gum shooting [93], which uses a stretching simulation device attached to the forearm. The device relies on the rotation of two motors to lengthen and restore the skin of the forearm, which can make the user feel that the forearm is longer. Masque [88] applied skin-stretching technology to the face, stretching facial skin to simulate the bumps and inertia of riding a motorcycle. Skin Stretch Stylus [68] affects the user's perception of surface stiffness by stretching the skin: when the device is pulled down, the intensity of the skin-stretching corresponds to the normal force applied to the virtual surface, simulating different degrees of stiffness of the virtual object the device is in contact with.

Nathaniel A. Caswell et al. [17] developed a device that applies directional cues to the forearm without occupying the fingers. Francesco Chinello et al. [20] designed a lightweight hand strap with four actuators, each driving a cylindrical end effector that rotates to perform skin-stretching, and used the skin-stretching information for navigation. In addition to directional information, skin-stretching can convey geometric shapes or characters. Skin Drag Displays [39] produce two different stimulus types instead of a vibrotactile array, enabling the user to better recognize tactile shapes through skin-stretching. tactoRing [41] is a tactile display that allows for more accurate cue recognition by dragging a small tactile device over the skin around the finger, stimulating multiple skin areas.

Unlike previous work that only stretched the skin in a fixed area of the forearm in a specific direction, we innovatively designed a device that allows the skin on any side of the forearm to undergo skin-stretching in either of two directions. Concurrently, we integrate this skin-stretching with the illusion of arm deformation. Our approach differs from research like Gum-gum shooting  [93], as we conducted perception studies before designing our wearable device. This exploration helped us determine the optimal number of skin-stretching actuators and the designs of skin-stretching that could most effectively induce the illusion of arm deformation.


3 PERCEPTION STUDY 1: LENGTH ADJUSTMENT

Previous research introduced various ways to deliver haptic feedback to the forearm, such as Electrical Muscle Stimulation (EMS) [54, 55], vibration [33, 57], and skin-stretching [77, 93]. We decided to use skin-stretching, which has shown its potential in Gum-gum shooting [93]. However, that work only suggested the possibility and did not prove it through studies. Therefore, we redesigned the pad from the band design of the previous research to dot-shaped pads that can move freely back and forth, and examined whether our skin-stretching technique is effective by comparing it with visual-only conditions.

Another unexplored dimension in previous research [46, 93] is the ideal number of skin-stretching actuators that most effectively creates the illusion of arm deformation, including elongating and shortening. So, we also sought to determine the optimal number of actuators to inform our final wearable hardware design.

Consequently, we conducted a perceptual study to ascertain the number of actuators best suited to induce the arm deformation illusion (elongating and shortening) through skin-stretching. The experimental design incorporated one visual-only condition and three visual with haptic conditions, with the haptic stimuli produced by one, two, or three actuators on each side of the forearm, respectively.

3.1 ArmDeformationBox

This study used a unique ArmDeformationBox firmly fixed to a table. The device is a layered structure comprising three skin-stretching layers. The layers are arranged at intervals of 60 mm. Each layer adopts a square shape (290mm × 290mm) with a 150 mm diameter circular hole at the center, allowing the forearm to pass through. Two actuators (MG996R) are positioned on each skin-stretching layer’s upper and lower sections. These actuators can be adjusted along the track to fit the thickness of each user’s forearm. Each actuator propels a silicone pad with a 30mm diameter, which can stretch the user’s skin up to a maximum distance of 20mm (10mm inward and 10mm outward)  [88]. To ensure forward-and-backward movement of the silicone pad for skin-stretching feedback in our design, we designed the gap between each layer to be 60 mm. We also took into account the user’s forearm length, which ranges from 252 to 320mm for males and 225 to 298mm for females  [34, 94]. Based on these factors, we determined that the maximum number of layers should be three, consisting of six actuators. The elastic cord connecting the two actuators can be easily adjusted to vary the pressure on the user’s forearms, ensuring user comfort and effective skin-stretching.


Figure 2: Dimensions of ArmDeformationBox and skin-stretching layer from the front and side views, respectively.

3.2 Procedure


Figure 3: a: the user’s virtual forearm elongates to twice its original length; b: the user’s virtual forearm at its original length; c: the user’s virtual forearm is shortened to half its original length; Right: participants wore an HTC VIVE Pro Eye, sat down with forearms secured within the ArmDeformationBox and felt the skin-stretching on their dominant hand. Then, participants used a VIVE controller with their non-dominant hand to choose their answers.

We began by having the user pass their dominant hand through the ArmDeformationBox and hold it steady. Visual distractions emanating from the ArmDeformationBox were suppressed using an HTC VIVE Pro Eye head-mounted display (HMD) to optimize focus on the experimental stimuli. The experimenter then brought the actuators into tight contact with the skin using the elastic cords. In VR, a virtual version of the participant’s forearm appeared. In this experiment, participants experienced an illusion of their forearm either elongating or shortening, as shown in Fig. 3. Crucially, they were instructed to maintain a consistent forearm posture during skin-stretching. The sensory feedback was varied: participants received either visual with haptic feedback or visual-only feedback. The haptic feedback also varied, with one, two, or three actuators activated per side. Notably, the experimental design was within subjects, meaning every participant experienced five iterations across these four conditions. The silicone pad moves from the center to -10 mm (inward) or +10 mm (outward), which is mapped visually to half (shortening) or twice (elongating) the original forearm length, respectively.
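To make this mapping concrete, below is a minimal sketch. The paper fixes only the endpoints (-10 mm maps to half length, +10 mm to double length); the exponential interpolation in between and the function name are our assumptions.

```python
# Minimal sketch (our assumption): map silicone-pad displacement to the visual
# forearm-length scale. Only the endpoints are fixed by the study design:
# -10 mm -> 0.5x, 0 mm -> 1x, +10 mm -> 2x. An exponential interpolation
# satisfies all three points; the actual in-between mapping is not specified.

def pad_to_length_scale(displacement_mm: float, max_disp_mm: float = 10.0) -> float:
    """Map a pad displacement in [-10, 10] mm to a length scale in [0.5, 2.0]."""
    return 2.0 ** (displacement_mm / max_disp_mm)

assert abs(pad_to_length_scale(-10.0) - 0.5) < 1e-9  # shortening to half
assert abs(pad_to_length_scale(0.0) - 1.0) < 1e-9    # original length
assert abs(pad_to_length_scale(10.0) - 2.0) < 1e-9   # elongating to double
```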

The experiment subjected participants to four conditions presented randomly. Each condition differs in the number of actuators used to render skin-stretching along the hand:

Visual stimuli only: No actuators were activated.

Two actuators: Actuators 3 and 4 were activated (See Fig. 2).

Four actuators: Actuators 1, 2, 5, and 6 were activated.

Six actuators: All actuators (1 to 6) were activated.

All haptic conditions start at the home position and gradually move to maximum skin-stretching distance within an equivalent time frame. At the same time, the visual forearm’s deformation mirrored the intensity of the skin-stretching. After the participants answered the question, the silicone pad returned to its home position.

The directionality of the actuators’ movement was aligned with the illusion. In simulations of forearm elongation, the actuators pushed the silicone pads outward. Conversely, in the illusion of forearm shortening, the movement was inward.

A post-stretch assessment phase followed each trial, during which participants needed to respond to a series of questions. To probe the effects of the body deformation illusion on the user experience in VR, we employed the embodiment questionnaire created by Peck et al.  [63] (see 3.4). This questionnaire consists of seven Likert scales, where responses span from -3 (strong disagreement) to 3 (strong agreement). We selected this questionnaire due to its user-friendliness and adaptability. Prior research has evidenced that subjective embodiment measures, garnered through questionnaires, align with objective metrics, such as electroencephalograms  [32]. Embodiment intersects with other key factors like presence  [61], perception  [8, 29], and ultimately behavior  [28]. Therefore, it can be a useful metric that also helps discuss the effects of the body deformation illusion on other critical aspects of the VR experience. Adopting a comprehensive embodiment questionnaire was our first step in revealing the effects of the body deformation illusion on body embodiment in VR  [31, 63, 76, 83].

Participants used a VIVE controller to answer seven questions by moving a slider bar to select a score between -3 and 3, as shown in Fig.  3. Participants were engaged in this structured experiment for roughly 40 minutes and allowed to take a 5-minute break at any point to prevent potential fatigue.

3.3 Participants

In the forearm elongating illusion experiment, we recruited 18 participants (8 self-identified females, 10 self-identified males) aged 20-27 (M = 24.2, SD = 2.2), with forearm length (including the hand) ranging from 400 to 450 mm (M = 427.2, SD = 18.7), all right-handed. Among them, 13 had previous VR experience, and 3 were familiar with visuo-haptic illusions. For the forearm shortening experiment, we recruited 18 participants (8 self-identified females, 10 self-identified males) aged 22-27 (M = 24.9, SD = 1.8), with forearm length (including the hand) spanning 400 to 480 mm (M = 431.7, SD = 20.3), all right-handed. Of these participants, 15 had prior VR experience, and 4 were acquainted with visuo-haptic illusions. All participants were compensated with a locally equivalent amount of 10 USD for their involvement. The study was approved by our Institutional Review Board.

3.4 Results and Findings


Figure 4: Embodiment rating in the illusion of forearm elongating (left) and shortening (right)

To probe the effects of the body deformation illusion on the user experience in VR, we employed the embodiment questionnaire created by Peck et al.  [63], consisting of seven Likert scales ranging from -3 (strong disagreement) to 3 (strong agreement):

Q1: I felt as if the virtual body were my body.

Q2: It felt like the virtual body I saw belonged to someone else.

Q3: I felt my body was where I saw the virtual body.

Q4: I felt like I could control the virtual body as if it were my own.

Q5: I felt out of my body.

Q6: The virtual body began to resemble my body.

Q7: As the forearm’s length changed, I felt the instinct to move my hand.

As underscored in prior research, the measured scores typify participants undergoing a profound embodiment sensation [31, 63, 83]. The degree to which participants felt the virtual body was their own, known as the embodiment score, is calculated as:

\[ \text{Embodiment Score} = \frac{2\cdot\frac{Q_1-Q_2}{2} + 2\,Q_4 + 2\cdot\frac{Q_3-Q_5}{2} + Q_6 + Q_7}{8} \]

where the weights of the categories are based on their perceived importance in the overall embodiment experience [44, 64].
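As a concrete illustration, the score can be computed directly from the seven responses; the following minimal sketch transcribes the equation above (the function name and the example values are ours).

```python
# Sketch: embodiment score from the seven Likert responses (each in [-3, 3]),
# following the weighted equation above.

def embodiment_score(q1, q2, q3, q4, q5, q6, q7):
    return (((q1 - q2) / 2) * 2 + q4 * 2 + ((q3 - q5) / 2) * 2 + q6 + q7) / 8

# Example: a participant who strongly embodies the virtual arm.
print(embodiment_score(3, -3, 3, 3, -3, 2, 2))  # -> 2.75
```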

By calculating the above equation, we can get the following results. In the case of the forearm elongating, visual-only (M = -0.715, SD = 0.702), two actuators (M = 0.675, SD = 0.771), four actuators (M = 0.974, SD = 0.868), and six actuators (M = 0.997, SD = 0.827); in the case of forearm shortening, visual-only (M = -0.499, SD = 0.990), two actuators (M = 0.521, SD = 0.670), four actuators (M = 0.613, SD = 0.696), and six actuators (M = 0.535, SD = 0.665).

We used the Friedman test with post-hoc Wilcoxon signed-rank tests (see Fig. 4), which show significant differences between the visual-only and visual-with-haptics conditions (p < 0.05); however, there were no significant differences between the haptic groups (p > 0.05), except between two actuators and six actuators in the forearm elongating illusion (p = 0.011).
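For reference, the sketch below shows how such an analysis can be run with SciPy. The per-participant scores are synthetic placeholders, not the study data, and any correction for multiple comparisons is left to the analyst.

```python
# Sketch of the analysis pipeline: omnibus Friedman test across the four
# within-subject conditions, followed by pairwise Wilcoxon signed-rank tests.
from itertools import combinations

import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon

rng = np.random.default_rng(0)  # synthetic placeholder scores, NOT study data
conditions = {
    "visual_only": rng.normal(-0.7, 0.8, 18),
    "two_actuators": rng.normal(0.7, 0.8, 18),
    "four_actuators": rng.normal(1.0, 0.9, 18),
    "six_actuators": rng.normal(1.0, 0.8, 18),
}

stat, p = friedmanchisquare(*conditions.values())
print(f"Friedman: chi2 = {stat:.2f}, p = {p:.4f}")

if p < 0.05:  # omnibus effect found: run post-hoc pairwise comparisons
    for a, b in combinations(conditions, 2):
        _, p_pair = wilcoxon(conditions[a], conditions[b])
        print(f"{a} vs {b}: p = {p_pair:.4f}")
```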

First, the embodiment sensation was statistically significantly higher in all cases with skin-stretching feedback compared to visual stimuli only. These results confirm that skin-stretching feedback can significantly improve body ownership when forearm deformation is applied in VR. This aligns with previous research showing that body ownership improves when haptic feedback is provided [52, 72]. Second, in most cases, there was no significant difference between the haptic conditions. We therefore decided to use two actuators for our final wearable prototype, also considering the cost-effectiveness and weight of the device: the fewer actuators we use, the lighter and more comfortable the device will be. This decision balances the effectiveness of the illusion, the practicality of the device, and user comfort.


4 PERCEPTION STUDY 2: BENDING MOTION

In Perception Study 1, we determined that two actuators are enough to induce forearm elongating and shortening illusions. This study will use this knowledge to identify the optimal skin-stretching design to induce the forearm bending illusion.

4.1 Mathematical Model for Visual Forearm Bending

We define a bend of the forearm by the angle between the arm axis near the elbow and the axis direction near the wrist (see Fig. 5). The new curved axis is a circular arc tangent to the two directions at both ends of the bent forearm. The forearm model, as rendered in the Unity game engine we use, is deformed by calculating a new position for each mesh vertex. The calculation of the geometric deformation is described in Appendix A.
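Appendix A gives the authors' exact calculation (not reproduced here); the sketch below shows one standard constant-curvature formulation consistent with the description above, assuming the bend lies in a single plane. The coordinate conventions and function are our own.

```python
import numpy as np

# Sketch (our formulation): bend a mesh vertex along a circular arc that is
# tangent to the straight axis at the elbow and turns by `theta` at the wrist.
def bend_vertex(v, length, theta):
    """v = (x, y, z): x is the axial coordinate from the elbow in [0, length],
    y the offset in the bending plane, z the offset perpendicular to it.
    theta: total bend angle in radians (positive bends toward +y)."""
    x, y, z = v
    if abs(theta) < 1e-9:
        return np.array([x, y, z])            # no bend: identity
    R = length / theta                        # radius of the constant-curvature arc
    phi = theta * x / length                  # turning angle accumulated at x
    cx, cy = R * np.sin(phi), R * (1.0 - np.cos(phi))  # point on the bent axis
    nx, ny = -np.sin(phi), np.cos(phi)        # in-plane normal at that point
    return np.array([cx + y * nx, cy + y * ny, z])

# Example: wrist-end vertex of a 300 mm forearm bent 45 degrees upward.
print(bend_vertex((300.0, 0.0, 0.0), length=300.0, theta=np.radians(45.0)))
```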

4.2 Procedure


Figure 5: Setup for bending motion study. The four figures at the top represent forearms bending upwards, while the four at the bottom represent forearms bending downwards. The six figures on the right correspond to the different skin-stretching designs.

Based on Perception Study 1, we concluded that using two actuators can effectively generate skin-stretching. So, we set our hardware to render the skin-stretching on the top and bottom of the forearm. This study aimed to find the optimal skin-stretching design for forearm bending.

For this study, we utilized the same ArmDeformationBox as in Perception Study 1 with a single skin-stretching layer: two actuators on the upper and lower sides of the forearm. Another difference from Perception Study 1 is that as we render forearm bending, the actuators will render opposing directions of skin-stretching.

In a pilot study with four participants (three self-identified males and one self-identified female, aged 22-25, SD = 1.25), following the same procedure later used in Perception Study 3 and inspired by [4, 58], we determined that participants maintain embodiment while their forearms are rendered bending between an upward bend of 56.3° and a downward bend of 45.2°. We therefore set the bending range to 45° upward and 45° downward. In Perception Study 3, we conducted a detailed visual embodied bending threshold study with the skin-stretching design found in this study.

Similar to Perception Study 1, we began by having the user pass their dominant hand through the ArmDeformationBox and hold it steady. The participant’s visual display shows their forearm gradually bending, while the ArmDeformationBox actuators gradually stretch the skin on the top and bottom of the forearm. For the forearm bending upward, the top actuator drives its silicone pad inward and the bottom actuator drives its silicone pad outward to simulate the changes in skin length due to the bending. When the forearm bends downward, the actuator directions flip accordingly.
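A minimal sketch of this sign convention follows; the clamping and the proportional mapping from the 45° maximum bend to the 10 mm maximum pad travel are our assumptions, based on the synchronization described in this section.

```python
# Sketch (our reading): map a visual bend angle to the two pad displacements.
# Positive values mean outward travel; positive angles mean upward bends.
MAX_BEND_DEG, MAX_TRAVEL_MM = 45.0, 10.0

def pad_displacements(bend_deg: float):
    """Return (top_mm, bottom_mm) pad travel for a given bend angle."""
    travel = MAX_TRAVEL_MM * max(-1.0, min(1.0, bend_deg / MAX_BEND_DEG))
    return (-travel, travel)  # the top pad always opposes the bottom pad

print(pad_displacements(45.0))   # (-10.0, 10.0): top inward, bottom outward
print(pad_displacements(-22.5))  # (5.0, -5.0): directions flip for downward
```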

Participants experienced bending in randomized directions, and the haptic rendering was randomly chosen from four conditions: full rendering, where the top and bottom actuators render corresponding, opposing skin-stretching forces; rendering using the top actuator alone; rendering using the bottom actuator alone; and a control condition where neither actuator moves. While the order of conditions was random, each condition was repeated five times.

Each forearm bending condition requires reaching the maximum angle of 45° from 0° in the same time frame, synchronized with a gradual increase of the skin-stretching to a maximum of 10mm each time. After reaching maximum visual forearm bending and maximum skin-stretching distance, participants answered the same seven body ownership questions as in Perception Study 1. Participants used a VIVE controller to answer seven questions by moving a slider bar to select a score between -3 and 3, as shown in Fig.  3. After participants answered the questions, the actuators returned to the home position.

4.3 Participants

We recruited 18 participants (8 self-identified females, 10 self-identified males) aged 21-27 years (M = 23.8, SD = 1.7), all right-handed. Their forearm length (including the hand) varied between 410 and 480 mm (M = 429.4, SD = 21.5). Of these, 13 participants reported prior experience with VR technology, and six were acquainted with visuo-haptic illusions. All participants were reimbursed with an amount equivalent to 10 USD in their local currency for their participation. Our Institutional Review Board approved the study.

4.4 Results and Findings


Figure 6: Embodiment rating in the illusion of forearm bending. The left figure is the forearm bending upward, and the right is the forearm bending downward.

We used the Friedman test with post-hoc Wilcoxon signed-rank tests, as shown in Fig. 6. When the forearm bent upward, the visual-only condition scored (M = -1.067, SD = 0.998), the both-actuators condition (M = 0.558, SD = 0.823), the top-actuator-only condition (M = -0.072, SD = 0.703), and the bottom-actuator-only condition (M = 0.063, SD = 0.649). When the forearm bent downward, the visual-only condition scored (M = -1.141, SD = 0.999), the both-actuators condition (M = 0.780, SD = 0.758), the top-actuator-only condition (M = 0.513, SD = 0.794), and the bottom-actuator-only condition (M = 0.203, SD = 0.683).

Our results indicate that using both actuators together performs better than using only one actuator or relying on visual feedback alone (p < 0.05) in both upward and downward bending scenarios (as shown in Fig.  6). This is likely because two actuators can provide more complete haptic feedback, which helps the user feel like their forearm is bending. Consequently, we chose to use two actuators operating synchronously to evoke the forearm bending illusion.


5 PERCEPTION STUDY 3: EMBODIED BENDING THRESHOLD

Building upon the results obtained from the two studies above (using two actuators synchronously), we investigate the maximum acceptable forearm bending angles, both upward and downward. We use a method from previous research [4] to estimate the threshold of the forearm bending angle. This threshold helps us understand the point up to which users still feel a sense of body ownership over their forearm as it bends.

5.1 Procedure

We utilized the same ArmDeformationBox as in Study 2. In this study, we moved the two actuators synchronously to simulate the forearm bending deformation. During the experiment, participants experienced an illusion of forearm bending at different angles, ranging from 20° to 160° in increments of 20°, with each condition repeated four times. The experiment was divided into upward and downward bending segments. In each trial, the actuators synchronously drove the silicone pad from its home position to the maximum distance within the same time frame, providing consistent haptic stimuli for bends in the same direction.

Like earlier research [4], we focus on illusions in VR and aim to pinpoint the threshold up to which users strongly believe the illusion. However, our particular interest lies in understanding when users feel most connected to, or possessive of, the virtual forearm. To dive deeper into this aspect, we tweaked the original questions from previous research [4, 58] and asked participants to answer the two following questions after each trial:

(1) I felt as if the virtual arm was my arm.

(2) How confident do you feel about your answer from 1 to 5? Choose 1 for not confident at all and 5 for very confident.

Participants used a VIVE controller to respond to these two questions, as shown in Fig. 3. For the first question, participants answered either "yes" or "no," while for the second question, participants moved a slider bar to select a score from 1 to 5.

5.2 Participants

We recruited 16 participants (6 self-identified females, 10 self-identified males) aged 20-27 years (M = 23.9, SD = 2.2), all right-handed. Their forearm length (including the hand) varied between 400 and 480 mm (M = 438.1, SD = 27.9). Of these, 13 participants reported prior experience with VR technology, and six were acquainted with visuo-haptic illusions. All participants were reimbursed with an amount equivalent to 10 USD in their local currency for their participation. Our Institutional Review Board approved the study.

5.3 Results and Findings


Figure 7: The correlation between body ownership ratio and bending angle. The bending angle threshold is 49.4830° (upward) and 50.5061° (downward). A value of 1 represents a fully embodied angle, whereas 0 indicates an unembodied angle. The error bars correspond to the \(95\%\) bootstrap confidence intervals.

Following previous work [4, 58], to calculate the bending threshold for every participant, we computed the Body Ownership Rate: the number of trials in which the participant experienced the virtual forearm as their own at a specific angle, divided by the total repetitions. The Body Ownership Rate refers to the extent to which a user perceives a virtual forearm as a natural extension of their own body. A participant was deemed to have experienced the illusion of the virtual forearm being their own when they chose "yes" for the first answer and reported a confidence level of 3 or higher. Following this, we averaged the Body Ownership Rates across participants for each angle and charted the results. We then fitted a threshold estimation curve of the form (where a and b are constants):

\[ f(x) = \frac{1}{1+e^{ax+b}} \tag{1} \]

As previous research  [4, 58, 81] suggests 0.75 is a reasonable acceptance rate, we chose the angle value corresponding to an average Body Ownership Rate equal to 0.75 as the threshold of forearm bending angle.

For upward forearm bending, with constants a = 0.0313 and b = -2.6490, the threshold was determined to be 49.4830°. For downward forearm bending, with constants a = 0.0361 and b = -2.9244, the threshold was calculated as 50.5061°, as Fig. 7 shows.
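As a quick check, the threshold follows in closed form from Equation (1): setting f(x) = 0.75 gives x = (ln(1/3) - b)/a. The sketch below reproduces the reported values from the published constants; the small discrepancies reflect rounding of a and b.

```python
import math

# Solve f(x) = 1 / (1 + exp(a*x + b)) = rate for x, i.e. the bending-angle
# threshold at a given Body Ownership Rate.
def bending_threshold(a: float, b: float, rate: float = 0.75) -> float:
    return (math.log(1.0 / rate - 1.0) - b) / a

print(bending_threshold(0.0313, -2.6490))  # upward:   ~49.5 degrees
print(bending_threshold(0.0361, -2.9244))  # downward: ~50.6 degrees
```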

These findings suggest that the range of visual forearm bending should be considered in developing forearm bending VR applications with haptic feedback. We designed the applications we propose hereafter by comprehensively considering our study results and the results of previous research  [46] on arm-length adjustment.


6 ARMDEFORMATION

We have developed a wearable haptic device named ArmDeformation, designed to stretch the skin on opposing sides of the forearm. This technique enhances the illusion of arm deformation, enriching the user’s realistic experience in VR.

6.1 Hardware

The device’s hardware consists of three subsystems plus a system board: a rotation subsystem, an auto-adjustment subsystem, and a skin-stretching subsystem. The device can be secured to the forearm with a rubber band. Depending on the range of arm deformation in VR, the hardware stretches the skin to varying extents, simulating the effect of different degrees of arm deformation. Our hardware frames are 3D-printed, as depicted in Fig. 8, using black PLA and orange PETG. The device’s overall weight is 834 grams.


Figure 8: Side view of ArmDeformation, and front views of the three subsystems: auto-adjustment (A), rotation subsystem (B), skin-stretching subsystem (C), and the system board (D).

6.1.1 Rotation Subsystem.

While the ArmDeformationBox used for the prior studies could render bending of the forearm in a single plane using top and bottom actuators, the wearable ArmDeformation can render bending in an arbitrary plane. To enable this flexibility, the rotation subsystem can rotate the actuators to opposing locations on both sides of the forearm. The rotation subsystem hosts a round track in a plane orthogonal to the forearm axis (the XY plane). A circular ring platform can rotate along the track around the forearm axis. The ring has teeth on its inner face and is rotated by a toothed DC motor (MG513P30_12V), named the XY motor, with a 58:10 gear ratio. Using a Hall encoder, the motor calculates its current position relative to the starting point, thereby achieving relative position control.

6.1.2 Auto-Adjustment Subsystem.

This subsystem enables a tight fit of the ArmDeformation to the forearm of the user. Whenever the rotation subsystem requires the rotation of the rings and the silicone pads around the forearm, this subsystem behaves as a clutch, enabling free rotation with minimal impact on the user, and then adapts back to the forearm shape and dimension. The subsystem comprises two tracks with racks placed at opposite ends of the rings near the elbow carrying the skin-stretching subsystem. Each rack is driven by a toothed actuator (MG996R) called the auto-adjusting actuator. It retracts the rack and the skin-stretching actuators from the user’s forearm when the ring needs to rotate to a new bending plane. When the ring reaches the desired rotation, it fits the rack back to touch the user’s forearm, adjusting according to size and shape.

6.1.3 Skin-Stretching Subsystem.

The skin-stretching actuator (the toothed actuator (MG996R)) is secured at the end of the rack and drives the silicone pad (Shore 15) to stretch the user’s skin. We constructed the silicone pad using 3D printing. Following previous research [88], a 30 mm pad is effective in generating skin-stretching; considering the hardware structure, its components, and the available forearm space, we chose a silicone pad with a 30 mm diameter.

6.1.4 System Board Design.

The system board enables the operation of the four actuators at 6 V. Please refer to Appendix B.


Figure 9: Overview of a user wearing the ArmDeformation. The figure on the right shows the user wearing the ArmDeformation with the arm raised.

6.2 Device Software

The device code runs on an Arduino Mega, and our VR applications are developed using the Unity game engine (2021.3.26f1c1) on an ASUS Vivobook Pro 16X OLED (K6604) PC. The user puts on the ArmDeformation along the forearm axis, where the rings are orthogonal to the forearm axis and define the XY plane. The y-axis is set to point straight up from the forearm. When the device is turned on, it waits for a command from Unity. By default, the device’s auto-adjusting actuators pull the racks away from the user’s forearm, keeping the silicone pads distant from the skin. At the same time, the skin-stretching actuators place the silicone pads in the middle of their range of motion, allowing the pads to move in both directions upon command.

Whenever an arm deformation is needed, the Unity application sends the specific type of deformation required to the device. For forearm bending, it sends the bending direction and angle; for elongation or shortening, it sends the proportion of length to be deformed. After the device receives the command, if it is a forearm bending, it uses the XY motor to rotate the rings along the track to the set position. Then, in all arm deformation conditions, the auto-adjusting actuators push the racks nearer to the forearm skin, stopping when the microswitch is triggered. After the auto-adjusting actuators have stopped, the skin-stretching actuators perform operations depending on the type of arm deformation. For forearm bending, the actuator on the same side as the bending direction moves its silicone pad inward, and the actuator on the opposing side moves its pad outward; the movement range of the skin-stretching actuators depends on the bending intensity, with a larger bending angle resulting in larger pad displacements. For forearm elongating and shortening, the skin-stretching actuators drive the silicone pads in sync with the avatar’s changing forearm length. Once the virtual forearm completes its deformation, the auto-adjustment actuators retract the racks for the next skin-stretching.
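The sketch below illustrates this host-side command flow. The serial message format and port are hypothetical: the authors' actual Unity/Arduino protocol is not published, and Python with pyserial is used purely for illustration.

```python
import serial  # pyserial

# Hypothetical wire format (our invention): one line per command,
# "B,<direction_deg>,<angle_deg>" for bending, "S,<length_ratio>" for scaling.
dev = serial.Serial("/dev/ttyACM0", 115200, timeout=1)  # assumed Arduino port

def bend(direction_deg: float, angle_deg: float) -> None:
    """Forearm bending: the device rotates the rings to the bending plane,
    attaches the pads, then stretches toward the bending direction."""
    dev.write(f"B,{direction_deg:.1f},{angle_deg:.1f}\n".encode())

def scale(length_ratio: float) -> None:
    """Elongation (>1) or shortening (<1): pads move in sync with the avatar."""
    dev.write(f"S,{length_ratio:.2f}\n".encode())

bend(90.0, 45.0)  # 45-degree bend with the bending plane rotated 90 degrees
scale(2.0)        # elongate the virtual forearm to twice its length
```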

6.3 Technical Evaluation

We conducted a technical evaluation of our ArmDeformation prototype to learn about its capabilities and limitations. We were particularly interested in answering four technical questions: (1) What is the normal pressure produced by the auto-adjustment actuator? (2) What is the relationship between the distance the silicone pad moves and the force produced by the skin-stretching actuator? (3) What is the latency of ArmDeformation, from device off to actual skin-stretching? (4) What is the typical noise produced by ArmDeformation? This section answers these questions to better inform the system’s design.

6.3.1 Pressure and Force for Skin-Stretching Feedback.


Figure 10: (a) Experimental setup. (b) The fitted line of torque versus current. (c) The fitted curve of distance versus force. The error bars correspond to the \(95\%\) bootstrap confidence intervals.

To evaluate the pressure on the skin and the force generated by skin-stretching feedback, we built a dedicated experimental setup comprising a simplified ArmDeformation prototype and an artificial silicone skin placed on a base beneath it, as shown in Fig. 10 (a). The simplified prototype consists of an auto-adjustment actuator, which generates pressure, and a skin-stretching actuator, which creates the force for skin-stretching feedback, both using the same actuator model (MG996R) as the actual prototype. First, we determined the relationship between the torque (T) and current (I) of the actuator. We had the actuator hold weights (50 g to 550 g) at the end of a gear, with a distance of 250 mm between the weight and the actuator, and recorded the actuator’s current draw. After testing 55 data points, with torques ranging from 122.5 N·mm to 1347.5 N·mm, we fit a linear function using the least squares method and found the linear relation T = 1.83I, where T is in N·mm and I is in mA, as shown in Fig. 10 (b). The line of best fit had an R² value of 0.942. Second, to measure the pressure, we conducted 30 measurements of the auto-adjustment actuator’s current. The current was 203.33 mA (SD: 20.90), and the calculated torque was 372.10 N·mm (SD: 38.24). Dividing the torque by the gear’s radius (29.5 mm) gives the normal force applied to the skin: 12.61 N (SD: 1.30). Third, we measured the current of the skin-stretching actuator to find the force for skin-stretching feedback. Given the isotropic elastic behavior of the skin [71], we commanded the actuator to move the silicone pad from 0 to 10 mm in 1 mm increments, repeated three times. Paired displacement and current readings were taken at each increment, resulting in 60 data points. Using the least squares method, we fit a power function to these data points; the measured force ranged from 0.45 N (SD: 0.24) at 1 mm to 20.05 N (SD: 5.48) at 10 mm, and the fitted relation between distance traveled (D) and force (F) is F = 0.02D^2.91, as shown in Fig. 10 (c).
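As a worked example, the two fitted relations can be applied directly; the sketch below reproduces the reported normal force and shows how the power-law fit smooths the measured forces.

```python
# Worked example using the fits reported above: torque from current
# (T = 1.83 * I, R^2 = 0.942) and stretch force from displacement
# (F = 0.02 * D^2.91).

def torque_nmm(current_ma: float) -> float:
    return 1.83 * current_ma            # torque in N*mm, current in mA

def stretch_force_n(distance_mm: float) -> float:
    return 0.02 * distance_mm ** 2.91   # force in N, displacement in mm

# Normal force on the skin: measured current -> torque / gear radius (29.5 mm).
print(torque_nmm(203.33) / 29.5)  # ~12.6 N, matching the reported 12.61 N
print(stretch_force_n(10.0))      # ~16.3 N from the fit (20.05 N was measured)
```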

6.3.2 Latency.

We have optimized the speed of the XY motor and the auto-adjustment actuator to balance efficiency with user comfort. Although these devices are not ideally instantaneous, they are rated at 293±21 rpm (MG513P30_12V) and 0.160 sec/60° (MG996R), respectively. We measured the minimum overall latency, the time it takes for the device to be applied directly to the skin without needing rotation, at 30 different instances. This yielded an average time of 4.71 seconds (SD: 0.39). Similarly, we measured the maximum overall latency, the time the device requires a 90° rotation before skin application, at 30 different instances. This resulted in an average duration of 6.11 seconds (SD: 0.50).

6.3.3 Noise.

To measure operational noise, we used a TES-1350A sound level meter in a controlled environment, an empty room (35.21 dB, SD: 1.25), at a distance of 0.5 meters from our device. We conducted 30 measurements for each condition: rotation subsystem, auto-adjustment subsystem, and skin-stretching subsystem. Each subsystem was operated individually through an Arduino script to evaluate the device’s noise precisely. Specifically, the rotation subsystem was activated from 0° to 90°, while the auto-adjustment and skin-stretching subsystems were cycled from their home positions to their maximum extents and back. The results were as follows: the rotation subsystem averaged 52.95 dB (SD: 2.92), the auto-adjustment subsystem averaged 40.10 dB (SD: 0.54) during attachment and 51.89 dB (SD: 1.55) during detachment, and the skin-stretching subsystem averaged 40.19 dB (SD: 0.53). Compared with the average noise level of an air conditioner (60 dB) or a washing machine (70 dB) [2], our device is comparatively quiet.


7 USER STUDY 4: ROTATIONAL JND

Our objective for this study was to determine the minimum discernible change in forearm bending direction. The experiment tests eight directions of forearm bending, thoroughly assessing user perception and exercising the device’s rotation feature. This lets us ensure that changes in forearm bending direction are discernible to users.

7.1 Procedure


Figure 11: Left: different reference and test direction. Right: participants sat down with forearms secured within the ArmDeformation and felt the skin-stretching on their dominant hand. Then, participants used a VIVE controller with their non-dominant hand to choose their answers.

In this study, we implemented a two-alternative forced-choice (2AFC) test, where participants were tasked with identifying whether the test direction was counterclockwise or clockwise with respect to the reference direction within a sequence of two different forearm bending directions [9, 47]; this design lets us estimate point of subjective equality (PSE) and JND values for various directions.

The reference forearm bending was set in eight directions at 45° intervals. The test bending direction was positioned ±20°, ±45°, and ±90° relative to these references [9, 47]. Each reference and test bending direction combination was repeated five times, resulting in 240 trials per participant (calculated as 8 reference bending directions × 6 test directions × 5 repetitions).

The order of the reference and test bending directions was randomized to avoid any potential pattern recognition or predictability. Both the reference and test bending directions were displayed for a 5-second duration. Participants were able to answer by selecting either the counterclockwise or clockwise option using the left or right side of the VIVE controller touchpad, respectively, as shown in Fig.  11.

7.2 Participants

We recruited 12 participants (six self-identified females and six self-identified males) aged between 22 and 27 years (M = 24.2, SD = 1.3), all right-handed. Their forearm length (including the hand) ranged from 410 to 450 mm (M = 425, SD = 14.5). Of these participants, 13 reported using VR technology, and six were aware of visuo-haptic illusions. All participants were reimbursed with an amount equivalent to 15 USD in their local currency for their participation. Our Institutional Review Board approved the study.

7.3 Results and Findings


Figure 12: JND study results. a: The average accuracy when the test and reference directions differed by 90°. b: The average accuracy when the test and reference directions differed by 45°. c: The average accuracy when the test and reference directions differed by 20°. d: Sigmoid equations that fit participants’ clockwise response ratios. e: PSE estimation results. f: JND estimation results.

Fig. 12 shows the results from our user evaluations. With angle deviations of ±20°, ±45°, and ±90°, the mean accuracy rates were \(88.33\%\) (SD: 0.086), \(92.81\%\) (SD: 0.051), and \(94.58\%\) (SD: 0.055), respectively. This indicates that a smaller angle difference leads to a lower rate of correct identification.

The point of subjective equality (PSE) shows the difference between veridical and perceived directions, and the JND shows the precision of direction judgments. In determining the PSEs and JNDs across the eight directions, we adopted a methodology consistent with prior research [9, 47], leveraging a sigmoid equation \( f(x) = \frac{1}{1+e^{-(x-\alpha)/\beta}} \). This equation was fitted to the participants’ clockwise response ratio, with α and β as constants determined through a least squares approach. We used the ratio of clockwise answers in the ±20°, ±45°, and ±90° test directions to fit the formula and obtain each user’s PSEs and JNDs for the eight reference directions. In Fig. 12 (d), the plotted dots represent the proportions of clockwise responses, while the sigmoid delineates the fitted psychometric function. The PSE is given by the point where the psychometric function equals 50 percent [9]. The specific PSE values for the eight directions are illustrated in Fig. 12 (e). Each JND is the difference between the two angles at which the curve equals 0.84 and 0.5. Based on each participant’s JNDs, we conducted a Friedman test with post-hoc Wilcoxon signed-rank tests. The results show significant differences between 0° and 135° (p = 0.003), 0° and 180° (p = 0.007), 0° and 225° (p = 0.005), 0° and 270° (p = 0.036), 0° and 315° (p = 0.016), 45° and 135° (p = 0.004), 45° and 180° (p = 0.014), 45° and 225° (p = 0.005), and 90° and 225° (p = 0.033). The specific JND values for the eight directions are illustrated in Fig. 12 (f).
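Because solving f(x) = p for the sigmoid above gives x = α + β ln(p/(1 - p)), both quantities have closed forms: the PSE is simply α, and the JND is β ln(0.84/0.16) ≈ 1.66β. A short sketch follows (the example constants are placeholders, not the study’s fits).

```python
import math

# PSE/JND from the fitted psychometric function
# f(x) = 1 / (1 + exp(-(x - alpha) / beta)).
def angle_at(p: float, alpha: float, beta: float) -> float:
    """Angle at which the fitted curve reaches proportion p."""
    return alpha + beta * math.log(p / (1.0 - p))

def pse_and_jnd(alpha: float, beta: float):
    pse = angle_at(0.5, alpha, beta)          # equals alpha
    jnd = angle_at(0.84, alpha, beta) - pse   # equals beta * ln(0.84/0.16)
    return pse, jnd

print(pse_and_jnd(alpha=2.0, beta=8.0))  # placeholder constants -> (2.0, ~13.3)
```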

Our PSE values range from -8.4° to 9.0°, and JND values span from 5.64° to 27.97°. Using these data, immersion can be enhanced by adjusting the visual feedback in VR and controlling the ArmDeformation to match the user’s expectations and perceptions. The PSEs show how users tend to perceive bending directions, so visual cues in VR applications can be fine-tuned for a more natural and immersive experience. We can also use the JND values to mitigate the latency limitations of ArmDeformation: for example, ArmDeformation does not need to rotate when the change in the user’s forearm bending direction is smaller than the corresponding JND value, as sketched below.
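A minimal sketch of that optimization, assuming a lookup of the per-direction JND (the values below are placeholders, not the study’s per-direction fits):

```python
# Skip the ring rotation when the change in bending direction is below the JND
# for the nearest reference direction. Placeholder JND values, in degrees.
JND_BY_REFERENCE_DEG = {0: 6.0, 45: 8.0, 90: 10.0, 135: 20.0}

def needs_rotation(current_dir: float, target_dir: float, reference: int) -> bool:
    delta = abs((target_dir - current_dir + 180.0) % 360.0 - 180.0)  # wrap to 180
    return delta >= JND_BY_REFERENCE_DEG[reference]

print(needs_rotation(0.0, 4.0, reference=0))   # False: imperceptible, stay put
print(needs_rotation(0.0, 12.0, reference=0))  # True: rotate to the new plane
```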


8 USER STUDY 5: REALISM AND ENJOYMENT

This study aims to understand how the user experiences the illusion of arm deformation after combining haptic and visual feedback. It is known from previous research that multisensory information can reconstruct our perception of the shape of the body  [46, 50]. Two VR applications were developed to evaluate the realism and enjoyment of the user’s experience and to determine if using ArmDeformation in VR improved the user’s perception. As previous user studies have assessed the effect of skin-stretching on the arm deformation illusion, we focus on understanding the user’s VR experience and subjective perceptions in this study.

8.1 Procedure


Figure 13: Left: Boxing application, participants control forearm elongating to hit sandbags via the VIVE controller; Right: Tentacle application, participants control the bending of the tentacle through the VIVE tracker.

The first application we explored was a virtual boxing game. Participants can control their forearm elongating by pressing the trigger button on the VIVE controller, allowing them to interact with the sandbags. When the hand hits the sandbag or reaches its maximum distance, there is a rebound recovery process for the forearm. In this application, since we don’t need to change the position of the skin-stretching subsystem, the skin-stretching feedback will not have a delay.

The next application provides users with an experience of controlling the movement of a tentacle. The skin-stretching feature of ArmDeformation allows users to feel the various bends of the tentacle, which they control by wearing VIVE trackers on the back of the hand and on the forearm next to the elbow and bending the arm at any angle. Through a first-person perspective, users can fully immerse themselves in controlling the tentacle’s movements in both horizontal and vertical directions. They can easily switch between these directions by pressing the trigger button on the VIVE controller, making the experience even more engaging. In this application, as long as the bending direction does not change, the skin-stretching feedback has no delay.

In each application, the user wore the ArmDeformation on their dominant forearm while wearing the VIVE HMD. Participants were asked to experience the two applications in two conditions (visual-only, visual + haptic) for at least 3 minutes each. Following each condition, the participants evaluated the level of realism and enjoyment of their experience, as has been done in previous research [40, 55, 95], by filling out a paper questionnaire with 7-point Likert-scale questions. The experiment ended with a post-hoc interview and took about 30 minutes to complete.

Our study utilized a within-subject design with a single modality factor: haptic (our device) versus no-haptic experience (baseline condition). During the haptic condition, participants were exposed to VR applications with skin-stretching feedback generated through the ArmDeformation. Conversely, during the no-haptic condition, participants experienced VR applications without haptic feedback, relying solely on visual feedback.

8.2 Participants

We recruited 16 participants (six self-identified females and ten self-identified males) aged between 21 and 27 years (M = 24.4, SD = 1.9), all right-handed. Their forearm length (including the hand) ranged from 400 to 490 mm (M = 433.7, SD = 27.2). Of these participants, 14 reported being familiar with VR technology, and ten were aware of visuo-haptic illusions. All participants were reimbursed with an amount equivalent to 10 USD in their local currency for their participation. Our Institutional Review Board approved the study.

8.3 Results and Findings


Figure 14: Participants’ perceived realism and enjoyment in both VR tasks (tentacle and boxing) for the visual-only and visual-with-haptics conditions. We found that ArmDeformation improves both metrics. The error bar represents a \(95\%\) confidence interval.

Fig. 14 shows the subjective evaluations of realism and enjoyment across conditions. We analyzed the results with Wilcoxon signed-rank tests. Compared to the visual-only condition, the visual + haptic condition was rated significantly more realistic in both the boxing (Z = −3.559, p = 0.0003) and tentacle (Z = −3.035, p = 0.002) scenarios. The same advantage held for enjoyment, with significant differences in both the boxing (Z = −3.211, p = 0.001) and tentacle (Z = −3.559, p = 0.001) scenarios. These results suggest that our system significantly improves the realism and enjoyment of VR experiences.

8.4 Qualitative Findings

Qualitative results from the interviews further support these findings. All participants agreed that using ArmDeformation felt more realistic and enjoyable and, as a result, more immersive. For example, P8 said, "It was clear to feel the difference between visual-only and visual with haptic conditions, and it felt as if you were reaching a sandbag yourself with the haptic feedback, rather than moving an abstract thing inside the computer," and others described it as "novel" (P4) and "interesting" (P12). When asked about their favorite part of the experiment, P5 said of the tentacle scenario, "I would feel that the angle of the bending of the tentacle could be matched with the movement of the hand and feel that my hand was the tentacle. It felt the movements were pretty synchronized and realistic." Participants were also satisfied with the tentacle bending solution, with P11 saying, "It's hard to feel that the tentacle is my hand, but when the tentacle is moving, it still feels quite real; it can give me a feeling that the tentacle is soft, not a rigid body like the forearm." For the boxing scenario, P12 said, "When the device works, it will feel like my forearm is elongating, especially when the forearm hits far away, the device will give more strength, and it will feel like much realism," and P9 said, "The added haptic feedback would feel like the forearm is exerting force." However, some participants found the forearm elongation length somewhat unrealistic; P8 said, "There will be a certain gap between the forearm elongating to a certain length and the reality," and P11 said, "The forearm elongating is too long. It feels a bit unrealistic and like an animation." Even so, they still found it more realistic than the visual-only condition. Some participants also suggested application scenarios: "I can feel the difference between haptic and no haptic. In more reality-oriented scenarios, I prefer no haptic; in scenarios that require fine manipulation and the forearm or tentacle in the VR will move along with the movement in reality, I prefer haptic, such as the movement of the tentacles."

When asked which application they preferred, most participants (P1, P3, P4, P5, P6, P10, P11, P12, P14, P16) found the boxing scenario more immersive than the tentacle scenario; this is not surprising, as many people are not used to non-human avatars. P6 and P16 noted that it was more realistic and enjoyable because "there were things in the scenario that were able to be interacted with"; P4 and P12 mainly appreciated the feeling of the forearm elongating: "It felt more realistic, it felt like the forearm elongating that long." As for the tentacle scenario, P7 and P13 stated that "the tentacle ideas are better, and the feeling of the tentacle movement is exciting."

Finally, when asked what could be improved, all participants wished the device were lighter, even though they found the current experience satisfactory. They commented that a more lightweight device could be applied to more parts of the body; e.g., P1 said, "If the hardware could be lighter, it could be placed in a VR gym application scenario, where the device would be placed on different parts of the body such as the forearms, palms, thighs, etc., and it could be used to stimulate people through skin-stretching to feel the muscle in the feeling of movement." P6 said, "I would be more willing to use the device if it was lighter, and it would be better if it could be made into a glove or something like that, where you can feel the force feedback on the back of your hand and the palm of your hand." Some participants suggested that the device could be immobilized; P7: "Preferably in scenarios where the hardware can be immobilized, such as when using an exoskeleton mech." Others (P1, P13) noted that controlling forearm elongation through forearm movements would feel more natural in the boxing scenario: "It would be more natural if body movements triggered it." Finally, many participants (P5, P6, P7, P13, P16) wanted more interactive elements: "It feels like more things could be added to the scene that can be interacted with."


9 DISCUSSION

9.1 Reflections on Studies

Our research demonstrated the positive impact of using body deformation illusions in VR to enhance user body ownership and immersion. Additionally, we found that haptic feedback, specifically skin-stretching, can further enhance body ownership. In the length adjustment study, we aimed to determine the minimal number of actuators required to convincingly induce forearm elongating and shortening illusions. Our findings indicate that only two actuators are sufficient to generate these illusions effectively, and that this is significantly better than the visual-only condition. This diverges from existing work such as Gum-gum shooting [93], which shows the efficacy of skin-stretching techniques but is limited to forearm elongating illusions and lacks an empirical investigation of the optimal number of actuators. Additionally, our Perception Study 2 results suggest that synchronous actuator movements provide a more convincing illusion than a single actuator or visual feedback alone.

The results of Study 1 surprised us. The skin-stretch stimuli represent the elongation or shortening of the forearm's skin due to the deformation; as such, we expected that stimulating two locations along the forearm (four actuators) would render this virtual scaling more strongly than one location. The result (two actuators) enabled us to design a lighter, smaller device that requires less power without sacrificing the immersive experience. These findings provide a solid foundation for future research on arm deformation illusions generated through skin-stretching of the forearm.

These insights were integrated into the motion design of our final wearable device, ArmDeformation, which utilizes two skin-stretching actuators that operate synchronously during the arm deformation illusion. Furthermore, while previous research has focused on thresholds for forearm elongating [46] or practical techniques [7], our work expands this by identifying angular thresholds for the forearm bending illusion. Specifically, our results show that angles of up to 49.48° for upward bending and 50.51° for downward bending do not adversely impact the user's sense of body ownership. This empirical guideline informed our software development, leading us to set a conservative maximum forearm bending angle of 45° in our applications. Such new flexibility can be used in the future to examine new interaction techniques, where small physical movements of the limb are mapped to a richer set of virtual limb actions, with haptic feedback used to control them better or to sense them when they are outside the user's visual field.
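A minimal sketch of how this conservative cap might be enforced in application code; only the 45° value comes from the guideline above, and the helper and sign convention are illustrative assumptions:

```cpp
#include <algorithm>

// Clamp a requested bend to the conservative 45 degree cap, which stays
// below both measured thresholds (about 49.5 deg up and 50.5 deg down).
float ClampBendAngle(float requestedDeg) {
    return std::clamp(requestedDeg, -45.0f, 45.0f);
}
```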

Our experimental evaluation confirms that ArmDeformation notably augments user immersion in VR. Unlike prior research [47, 88] that centered on the PSE and JND for haptic perception, we aimed to identify the visual PSE and JND for forearm bending directions that users could reliably perceive. Our JND results indicate that users can distinguish forearm bending directions that differ by between 5.64° and 27.97°, and our PSE values range from −8.4° to 9.0°. These findings offer valuable design considerations, enabling us to employ a consistent forearm bending direction within this perceptual range, optimizing visual effects without compromising user experience. To assess the broader implications of our device, we developed possible applications and conducted an immersive study focusing on realism and enjoyment. The study results show that ArmDeformation significantly elevates the realism and enjoyment of the superpower scenario. As Fig. 14 shows, users considerably preferred ArmDeformation over the visual-only conditions. This study demonstrates the effectiveness of skin-stretching methods in producing arm deformation illusions, whereas Gum-gum shooting [93] lacks supporting data.
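For readers reproducing this analysis: with a cumulative-Gaussian psychometric fit, a common choice that we use here only to illustrate the definitions (we do not claim it is the exact estimator used), the PSE and JND follow directly from the fitted parameters \(\mu\) and \(\sigma\):

\(\mathrm{PSE} = \Psi^{-1}(0.5) = \mu, \qquad \mathrm{JND} = \frac{\Psi^{-1}(0.75) - \Psi^{-1}(0.25)}{2} \approx 0.6745\,\sigma,\)

where \(\Psi(\theta) = \Phi\!\left(\frac{\theta - \mu}{\sigma}\right)\) and \(\Phi\) is the standard normal CDF.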

9.2 Skin-stretching Feedback for Arm Deformation Illusion

Humans can rotate their forearms only within a range of 104.54 ± 11.09 degrees [70], which limits their ability to express forearm bending. Our research overcomes this limitation using alternate morphologies in body transformations [6] in VR, namely forearm deformation. Our results are consistent with earlier studies [10, 75, 87] that aimed to enhance users' joy in VR through different body deformations.

Previous studies [45, 60, 67] have shown that when sensory information is integrated, users tend to have a stronger sense of embodiment with their avatars, which aligns with our findings. This integration, particularly the addition of haptic feedback, is known to lessen users' cognitive load [37]. Our research aligns with this by employing a dot-shaped silicone pad for skin-stretching feedback, enhancing the user's sense of body ownership. This differs from Gum-gum shooting's [93] use of bands for skin-stretching to simulate elongation: our dot-shaped pad stretches the skin forward and backward, creating an immersive illusion of forearm bending. We chose this method over techniques like EMS [54, 55] or vibration [33, 57] because it delivers strong, bidirectional force feedback at a single skin point. Our study's results indicate that our skin-stretching design effectively simulates the sensation of forearm bending.

Users often struggle to feel fully immersed in VR when embodying avatars with fantastical features, like the superhuman abilities of Rubberman [1]. To address this, our research introduced an immersive arm deformation illusion, differing from previous research that adjusts body parts' length [46, 50, 66, 93] or overall body size [5, 10, 64, 75, 87]. Studies 1 and 2 showed that combining visual and haptic feedback enhances the sense of body ownership during these arm deformation experiences. With the insights from Study 3, we can more convincingly generate the sensation of forearm deformation in VR, ensuring that users perceive the virtual forearm as their own. This allows us to present the illusion of forearm deformation, such as deepened immersion in superhuman capabilities, without compromising the user's sense of body ownership. According to Study 5's findings, users can have a realistic and enjoyable experience in VR, using ArmDeformation to elongate, shorten, or bend their forearms, thereby achieving a convincing and immersive arm deformation illusion.

9.3 Possible Applications


Figure 15: Left: Baseball application, participants catch the baseball and feel the forearm bending; Right: Remote click application, participants interact with different buttons by controlling forearm elongation with the VIVE controller.

Beyond the possible applications mentioned earlier, we envision two more, as Fig. 15 shows. The first is a baseball game in which our device turns players into superhumans, with haptic feedback that lets them control their rubber-like forearms. As catchers, players can sway their arms to catch balls they could not reach before; as pitchers, they can control this bending with their wrists and use the new virtual elasticity of their limbs to generate super-powerful swings. This match-up is an example of adding physical abilities that are impossible in real life, which can generate new possibilities and interest in existing games. Our second idea is a "remote click" application for smart homes (see Fig. 15, right). Here, users control home devices like lights and TVs by elongating their forearms in VR and even bending them around a corner.

Besides entertainment, the ability to map users' small motions to large and even physically impossible motions in VR is an essential capability for applications such as productivity and accessibility [6, 53, 86]. It enables users to perform large actions in the virtual space without getting exhausted in the physical world, and it levels the virtual playing field for users whose physical motions are limited in extent and range in the real world. Haptic feedback that renders these new capabilities to users is essential for learning new dexterities and better control, and it provides feedback when the limbs are outside the visual field [3, 82, 90].

9.4 Limitations and Future Works

Despite our best efforts, our user studies inevitably have limitations. In Studies 1 and 2, we chose the skin-stretching design after determining the number of actuators, but different skin-stretching designs may affect body ownership differently depending on the number of actuators. In future work, we will recruit more participants to conduct more complete experiments that cross skin-stretching designs with actuator counts. In Study 3, we only considered thresholds for forearm bending upward and downward, not for other bending directions; future experiments will establish complete thresholds for those directions. In Study 4, we followed previous research [9, 47] in using an estimation method to assess the JND and PSE values, which may deviate from the actual values. In future research, we will recruit more participants for the JND and PSE study and use more precise measurement methods. In Study 5, we only considered realism and enjoyment; future work can consider more dimensions, such as long-term usability and cognitive load.

ArmDeformation can enhance the realism and enjoyment of VR experiences, as demonstrated in User Study 5. However, it still has some limitations due to its prototype nature. Its weight could be reduced with lighter materials such as carbon fiber or LW-PLA, and its design could be made more ergonomic and comfortable. Lighter motors, such as the Dynamixel XL-320 or DF9GMS, could also be used, which would help reduce the overall size of the wearable.

Rotating the actuators toward a new direction introduces latency between identifying the user's intent to bend the forearm and the actual skin-stretching rendering. This latency could be reduced by rotating the actuator beforehand based on anticipated rotation directions, at the cost of increased power consumption; predictive algorithms could also reduce the average delay. Another limitation of the current implementation is that the bending direction is fixed once bending has begun. While the dominance of the visual sense can mask some rotation of the bending direction, enabling this capability may become a subject of future research.
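A speculative C++ sketch of the prediction idea; the constant-velocity model and all names are our assumptions, not an implemented feature:

```cpp
// Extrapolate the sensed bend direction from its recent angular velocity so
// the actuator can be pre-rotated toward the predicted direction.
class BendDirectionPredictor {
public:
    // Call once per frame with the current bend direction (degrees).
    float Predict(float directionDeg, float dtSeconds, float horizonSeconds) {
        float velocity = initialized_
            ? (directionDeg - lastDirectionDeg_) / dtSeconds
            : 0.0f;  // no history yet: assume the direction holds
        lastDirectionDeg_ = directionDeg;
        initialized_ = true;
        return directionDeg + velocity * horizonSeconds;
    }
private:
    float lastDirectionDeg_ = 0.0f;
    bool initialized_ = false;
};
```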

We found that using the skin's natural anchor points (e.g., the elbow) might be enough to generate a skin-stretching effect that fits the arm deformation illusion. However, this ignores the natural anchor points at the other end of the arm, which might generate an opposing stretch sensation; in future work, we will explore additional natural anchor points for creating the arm deformation illusion. The dynamic range of skin-stretching rendering is also limited. We intend to explore further ways to better use the stimuli, from understanding the stretch JND to nonlinear mappings from bending angles to haptic-rendered stimuli.
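As one example of the nonlinear mappings we intend to study, a power law could allocate more of the limited actuator travel to small angles. The exponent and maxima below are placeholder assumptions, not measured parameters:

```cpp
#include <cmath>

// Map a bend angle to a skin-stretch displacement with a power law;
// gamma < 1 exaggerates small bends within the limited dynamic range.
float StretchFromBend(float bendDeg) {
    const float kMaxBendDeg = 45.0f;    // application cap from Section 9.1
    const float kMaxStretchMm = 10.0f;  // assumed actuator travel
    const float kGamma = 0.7f;          // assumed exponent
    float t = std::fmin(std::fabs(bendDeg) / kMaxBendDeg, 1.0f);
    float stretch = kMaxStretchMm * std::pow(t, kGamma);
    return bendDeg < 0.0f ? -stretch : stretch;
}
```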

In future work, we will explore the advantages of integrating other haptic feedback methods, such as EMS [54, 55] and vibration [33, 57], with the skin-stretching technique to further enhance body ownership, and we plan to investigate more haptic pattern designs that can induce various arm deformation illusions. Unlike Gum-gum shooting [93], ArmDeformation cannot change the center of gravity; future work will therefore explore incorporating center-of-gravity changes alongside our skin-stretching technique. We will also use faster servo motors to speed up the mapping between skin-stretching and visual feedback, and we will explore other potential applications of arm deformation, such as retargeting and health.

Finally, the focus of this work was the haptic rendering of forearm bending, so we used a simple interaction scheme in which a VIVE controller lets the user control the bending. We intend to develop more natural ways of controlling arm deformation in VR, for example using wrist [23] and palm gestures [24, 26]. A technique similar to FingerMapper [86], which mapped small-scale finger movements to a virtual arm and hand, could be adapted here: we will explore mapping the movement of the user's wrist to the degree of forearm bending.


10 CONCLUSION

This paper introduces a wearable skin-stretching device, ArmDeformation, which induces the illusion of arm deformation through skin-stretching in VR. Our research distinguishes itself from previous work through four key findings: (1) two skin-stretching actuators suffice to effectively induce the forearm elongating or shortening illusions; (2) moving all actuators synchronously effectively induces the forearm bending illusion, (3) up to a maximal bending angle of about 50° while maintaining embodiment; and (4) the rotational JND of the bending direction differs between upward and downward bending, ranging from 5.64° to 27.97°. We demonstrated possible application scenarios showing how designers can utilize the unique properties of our device to give users superhuman abilities in VR.


ACKNOWLEDGMENTS

Seungwoo Je is supported by the Open Project Program of the State Key Laboratory of Virtual Reality Technology and Systems, Beihang University (No.VRLAB2023A04).

A APPENDIX: GEOMETRIC BENDING OF THE HAND MODEL

In VR, we used the following model to simulate forearm bending. Suppose the forearm's length is \(L_{\text{arm}}\). The forearm gradually assumes a semicircular shape as the bending angle \(\alpha\) increases. In Unity, we calculate the coordinates of the vertices of the bent forearm: the first step is to determine the projection of each vertex onto the forearm centerline; then, by introducing suitable variables, we can accurately compute the coordinates of vertices on either side of the centerline.

Relying on the forearm bending model above, we can calculate the coordinates of the bent forearm centerline. Here, \(R\) is the semicircle's radius, \(d_{\text{arm}}\) is the ratio of a projection point's distance from the base along the centerline to the forearm's initial length, \(\beta\) is that point's bending angle in the model, and \(\gamma\) is the direction of the forearm's bend (in the plane perpendicular to the forearm, with 0° directly above; the y-axis is aligned with the forearm's extension, while the x-axis aligns with 0°):

(2) \(R = \frac{L_{\text{arm}}}{\alpha}\)

(3) \(\beta = \alpha \, d_{\text{arm}}\)

(4) \(x = R \, (1 - \cos\beta) \sin\gamma\)

(5) \(y = R \sin\beta\)

(6) \(z = R \, (1 - \cos\beta) \cos\gamma\)

After calculating the centerline coordinates, we compute each vertex's offset in the XZ plane. Here, \(x_o, z_o\) are the vertex's initial coordinates, \(L_{\text{offset}}\) is the vertex's distance from the centerline in its initial state, and \(\gamma_{\text{relative}}\) is the vertex's direction relative to the x-axis:

(7) \(L_{\text{offset}} = \sqrt{x_o^2 + z_o^2}\)

(8) \(\gamma_{\text{relative}} = \arctan\frac{x_o}{z_o} - \gamma\)

(9) \(x_{\text{relative}} = L_{\text{offset}} \sin(\gamma_{\text{relative}})\)

(10) \(z_{\text{relative}} = L_{\text{offset}} \cos(\gamma_{\text{relative}}) \cos\beta\)

(11) \(x_a = x_{\text{relative}} \cos\gamma + z_{\text{relative}} \sin\gamma\)

(12) \(z_a = -x_{\text{relative}} \sin\gamma + z_{\text{relative}} \cos\gamma\)

The vertex's offset in the y-direction is:

(13) \(y_b = -L_{\text{offset}} \sin\beta \cos(\gamma_{\text{relative}})\)

Finally, summing these components gives the coordinates of each forearm vertex: \((x + x_a,\; y + y_b,\; z + z_a)\).
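The following C++ transcription of Equations (2) through (13) may help implementers; the paper's implementation is in Unity, so this port and its naming are ours. atan2 replaces \(\arctan(x_o/z_o)\) to stay well-defined when \(z_o = 0\).

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Vertex of the bent forearm. Angles in radians: alpha is the total bend,
// gamma the bend direction; dArm is the vertex's normalized position along
// the centerline, and (xo, zo) its initial offset from the centerline.
Vec3 BentForearmVertex(float lArm, float alpha, float gamma,
                       float dArm, float xo, float zo) {
    float R = lArm / alpha;                                      // Eq. (2)
    float beta = alpha * dArm;                                   // Eq. (3)
    float x = R * (1.0f - std::cos(beta)) * std::sin(gamma);     // Eq. (4)
    float y = R * std::sin(beta);                                // Eq. (5)
    float z = R * (1.0f - std::cos(beta)) * std::cos(gamma);     // Eq. (6)
    float lOffset = std::sqrt(xo * xo + zo * zo);                // Eq. (7)
    float gammaRel = std::atan2(xo, zo) - gamma;                 // Eq. (8)
    float xRel = lOffset * std::sin(gammaRel);                   // Eq. (9)
    float zRel = lOffset * std::cos(gammaRel) * std::cos(beta);  // Eq. (10)
    float xa = xRel * std::cos(gamma) + zRel * std::sin(gamma);  // Eq. (11)
    float za = -xRel * std::sin(gamma) + zRel * std::cos(gamma); // Eq. (12)
    float yb = -lOffset * std::sin(beta) * std::cos(gammaRel);   // Eq. (13)
    return Vec3{x + xa, y + yb, z + za};  // final vertex position
}
```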

To bend the forearm model while keeping the hand model tangent to it, we calculate the position of each hand vertex with the following formulas. Let the hand's length be \(L_{\text{hand}}\), and let \(d_{\text{hand}}\) be the ratio of a point's distance from the base along the hand centerline to the hand's initial length:

(14) \(x_{\text{hand}} = R \, (1 - \cos\alpha) \sin\gamma + L_{\text{hand}} \, d_{\text{hand}} \cos\!\left(\frac{\pi}{2} - \alpha\right) \sin\gamma\)

(15) \(y_{\text{hand}} = R \sin\alpha + L_{\text{hand}} \, d_{\text{hand}} \sin\!\left(\frac{\pi}{2} - \alpha\right)\)

(16) \(z_{\text{hand}} = R \, (1 - \cos\alpha) \cos\gamma + L_{\text{hand}} \, d_{\text{hand}} \cos\!\left(\frac{\pi}{2} - \alpha\right) \cos\gamma\)

The offsets of the hand's vertices in the x, y, and z directions are the \(x_a\), \(y_b\), and \(z_a\) found when \(d_{\text{arm}} = 1\). The hand's computed vertex position is therefore \((x_{\text{hand}} + x_a,\; y_{\text{hand}} + y_b,\; z_{\text{hand}} + z_a)\).
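Correspondingly, a C++ sketch of Equations (14) through (16) with the stated offset rule (again our port, not the authors' Unity code):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Hand vertex: the hand base continues along the tangent at the end of the
// bent forearm; (xa, yb, za) are the forearm offsets evaluated at dArm = 1.
Vec3 BentHandVertex(float R, float alpha, float gamma,
                    float lHand, float dHand,
                    float xa, float yb, float za) {
    const float kHalfPi = 1.57079632679f;
    float along = lHand * dHand;
    float xHand = R * (1.0f - std::cos(alpha)) * std::sin(gamma)
                + along * std::cos(kHalfPi - alpha) * std::sin(gamma);  // Eq. (14)
    float yHand = R * std::sin(alpha)
                + along * std::sin(kHalfPi - alpha);                    // Eq. (15)
    float zHand = R * (1.0f - std::cos(alpha)) * std::cos(gamma)
                + along * std::cos(kHalfPi - alpha) * std::cos(gamma);  // Eq. (16)
    return Vec3{xHand + xa, yHand + yb, zHand + za};
}
```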

B APPENDIX: SYSTEM BOARD DESIGN

We opted for the LM7806 voltage regulator to build a simple circuit that converts 12V to 6V. According to the MG996R datasheet, each servo draws 500mA to 900mA at 6V, so four actuators require 2A to 3.6A in total, while the LM7806's maximum output current is only 1A. The LM7806 also generates significant heat during operation, so heat dissipation and cost had to be considered. We therefore designed an external circuit around the LM7806, employing a PNP transistor (TIP2955) to amplify the current. According to the TIP2955 datasheet, when the collector-emitter voltage is 4V, the minimum amplification factor is 20 and the collector can output 4A. Given that the transistor's emitter connects to the 12V power source and the collector connects to the LM7806's 6V output, there is a voltage difference of 6V. Our calculations indicated that a 0.5-ohm resistor connected to the emitter would suffice; to be cautious, however, we opted for a 1-ohm resistor. Even though the amplification factor did not reach 20, our tests confirmed that the circuit can still drive all four servos synchronously.
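One way to read the 1-ohm choice in this standard current-boost topology: the regulator carries current only until the drop across the emitter resistor reaches the transistor's emitter-base turn-on voltage (assuming a typical \(V_{EB} \approx 0.7\) V, a value not stated above), so

\(I_{\text{regulator}} \approx \frac{V_{EB}}{R_E} = \frac{0.7\ \text{V}}{1\ \Omega} = 0.7\ \text{A},\)

which keeps the LM7806 below its 1 A limit before the TIP2955 takes over the remaining load.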

For the DC motor, we selected the L293D motor driver chip. The L293D's enable pin is connected to PWM pin 3 of the Arduino Mega, allowing us to control the motor's rotation speed; the motor operates at 12V, so the L293D's VS pin connects directly to the 12V power source. Additionally, our PCB connects to two microswitches (1A, 125V AC). When a microswitch is not pressed, the Mega reads a high voltage level; when pressed, it reads a low level. These microswitches detect whether the skin-stretching subsystem is touching the user's skin.
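A minimal Arduino-style sketch of the two control paths just described; this is our reconstruction, not the authors' firmware, and every pin assignment other than the stated PWM pin 3 is an assumption:

```cpp
const int MOTOR_ENABLE_PIN = 3;  // L293D enable (PWM), as stated in the text
const int SWITCH_A_PIN = 4;      // microswitch inputs: placeholder pins
const int SWITCH_B_PIN = 5;

void setup() {
  pinMode(MOTOR_ENABLE_PIN, OUTPUT);
  pinMode(SWITCH_A_PIN, INPUT_PULLUP);  // HIGH when released, LOW when pressed
  pinMode(SWITCH_B_PIN, INPUT_PULLUP);
}

void loop() {
  bool touching = digitalRead(SWITCH_A_PIN) == LOW &&
                  digitalRead(SWITCH_B_PIN) == LOW;
  // Advance the skin-stretching subsystem until both switches report contact.
  analogWrite(MOTOR_ENABLE_PIN, touching ? 0 : 180);  // duty cycle 0-255
}
```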

Footnotes

  1. The corresponding author.


References

  1. [1] 2024. https://www.marvel.com/teams-and-groups/fantastic-four
  2. [2] 2024. https://www.cdc.gov/nceh/hearing_loss/what_noises_cause_hearing_loss.html
  3. [3] Moaed A Abd, Joseph Ingicco, Douglas T Hutchinson, Emmanuelle Tognoli, and Erik D Engeberg. 2022. Multichannel haptic feedback unlocks prosthetic hand dexterity. Scientific Reports 12, 1 (2022), 2323.
  4. [4] Parastoo Abtahi and Sean Follmer. 2018. Visuo-Haptic Illusions for Improving the Perceived Performance of Shape Displays. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (Montreal QC, Canada) (CHI '18). Association for Computing Machinery, New York, NY, USA, 1–13. https://doi.org/10.1145/3173574.3173724
  5. [5] Parastoo Abtahi, Mar Gonzalez-Franco, Eyal Ofek, and Anthony Steed. 2019. I'm a Giant: Walking in Large Virtual Environments at High Speed Gains. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (Glasgow, Scotland, UK) (CHI '19). Association for Computing Machinery, New York, NY, USA, 1–13. https://doi.org/10.1145/3290605.3300752
  6. [6] Parastoo Abtahi, Sidney Q. Hough, James A. Landay, and Sean Follmer. 2022. Beyond Being Real: A Sensorimotor Control Perspective on Interactions in Virtual Reality. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems (New Orleans, LA, USA) (CHI '22). Association for Computing Machinery, New York, NY, USA, Article 358, 17 pages. https://doi.org/10.1145/3491102.3517706
  7. [7] Merwan Achibet, Adrien Girard, Anthony Talvas, Maud Marchal, and Anatole Lécuyer. 2015. Elastic-Arm: Human-scale passive haptic feedback for augmenting interaction and perception in virtual environments. In 2015 IEEE Virtual Reality (VR). 63–68. https://doi.org/10.1109/VR.2015.7223325
  8. [8] Kenneth Aizawa. 2007. Understanding the embodiment of perception. The Journal of Philosophy 104, 1 (2007), 5–25.
  9. [9] Tomohiro Amemiya and Hiroaki Gomi. 2016. Active manual movement improves directional perception of illusory force. IEEE Transactions on Haptics 9, 4 (2016), 465–473.
  10. [10] Domna Banakou, Raphaela Groten, and Mel Slater. 2013. Illusory ownership of a virtual child body causes overestimation of object sizes and implicit attitude changes. Proceedings of the National Academy of Sciences 110, 31 (2013), 12846–12851.
  11. [11] Karlin Bark, Emily Hyman, Frank Tan, Elizabeth Cha, Steven A Jax, Laurel J Buxbaum, and Katherine J Kuchenbecker. 2014. Effects of vibrotactile feedback on human learning of arm motions. IEEE Transactions on Neural Systems and Rehabilitation Engineering 23, 1 (2014), 51–63.
  12. [12] Edoardo Battaglia, Janelle P. Clark, Matteo Bianchi, Manuel G. Catalano, Antonio Bicchi, and Marcia K. O'Malley. 2017. The Rice Haptic Rocker: Skin stretch haptic feedback with the Pisa/IIT SoftHand. In 2017 IEEE World Haptics Conference (WHC). 7–12. https://doi.org/10.1109/WHC.2017.7989848
  13. [13] Christopher C. Berger, Baihan Lin, Bigna Lenggenhager, Jaron Lanier, and Mar Gonzalez-Franco. 2022. Follow Your Nose: Extended Arm Reach After Pinocchio Illusion in Virtual Reality. Frontiers in Virtual Reality 3 (2022). https://doi.org/10.3389/frvir.2022.712375
  14. [14] Alain Berthoz, Bernard Pavard, and Laurence R Young. 1975. Perception of linear horizontal self-motion induced by peripheral vision (linearvection): basic characteristics and visual-vestibular interactions. Experimental Brain Research 23 (1975), 471–489.
  15. [15] Matthew Botvinick and Jonathan Cohen. 1998. Rubber hands 'feel' touch that eyes see. Nature 391, 6669 (1998), 756.
  16. [16] Dalila Burin, Konstantina Kilteni, Marco Rabuffetti, Mel Slater, and Lorenzo Pia. 2019. Body ownership increases the interference between observed and executed movements. PLoS ONE 14, 1 (2019), e0209899.
  17. [17] Nathaniel A. Caswell, Ryan T. Yardley, Markus N. Montandon, and William R. Provancher. 2012. Design of a forearm-mounted directional skin stretch device. In 2012 IEEE Haptics Symposium (HAPTICS). 365–370. https://doi.org/10.1109/HAPTIC.2012.6183816
  18. [18] Hong-Yu Chang, Wen-Jie Tseng, Chia-En Tsai, Hsin-Yu Chen, Roshan Lalintha Peiris, and Liwei Chan. 2018. FacePush: Introducing Normal Force on Face with Head-Mounted Displays. In Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology (Berlin, Germany) (UIST '18). Association for Computing Machinery, New York, NY, USA, 927–935. https://doi.org/10.1145/3242587.3242588
  19. [19] Francesco Chinello, Claudio Pacchierotti, Joao Bimbo, Nikos G Tsagarakis, and Domenico Prattichizzo. 2017. Design and evaluation of a wearable skin stretch device for haptic guidance. IEEE Robotics and Automation Letters 3, 1 (2017), 524–531.
  20. [20] Francesco Chinello, Claudio Pacchierotti, Nikos G. Tsagarakis, and Domenico Prattichizzo. 2016. Design of a wearable skin stretch cutaneous device for the upper limb. In 2016 IEEE Haptics Symposium (HAPTICS). 14–20. https://doi.org/10.1109/HAPTICS.2016.7463149
  21. [21] Inrak Choi, Heather Culbertson, Mark R. Miller, Alex Olwal, and Sean Follmer. 2017. Grabity: A Wearable Haptic Interface for Simulating Weight and Grasping in Virtual Reality. In Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology (Québec City, QC, Canada) (UIST '17). Association for Computing Machinery, New York, NY, USA, 119–130. https://doi.org/10.1145/3126594.3126599
  22. [22] Inrak Choi, Eyal Ofek, Hrvoje Benko, Mike Sinclair, and Christian Holz. 2018. CLAW: A Multifunctional Handheld Haptic Controller for Grasping, Touching, and Triggering in Virtual Reality. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (Montreal QC, Canada) (CHI '18). Association for Computing Machinery, New York, NY, USA, 1–13. https://doi.org/10.1145/3173574.3174228
  23. [23] Sohan Chowdhury, A K M Amanat Ullah, Nathan Bruce Pelmore, Pourang Irani, and Khalad Hasan. 2022. WriArm: Leveraging Wrist Movement to Design Wrist+Arm Based Teleportation in VR. In 2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). 317–325. https://doi.org/10.1109/ISMAR55827.2022.00047
  24. [24] Karim Cissé, Aprajit Gandhi, Danielle Lottridge, and Robert Amor. 2020. User Elicited Hand Gestures for VR-based Navigation of Architectural Designs. In 2020 IEEE Symposium on Visual Languages and Human-Centric Computing (VL/HCC). 1–5. https://doi.org/10.1109/VL/HCC50065.2020.9127275
  25. [25] Heather Culbertson, Julie M. Walker, Michael Raitor, and Allison M. Okamura. 2017. WAVES: A Wearable Asymmetric Vibration Excitation System for Presenting Three-Dimensional Translation and Rotation Cues. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (Denver, Colorado, USA) (CHI '17). Association for Computing Machinery, New York, NY, USA, 4972–4982. https://doi.org/10.1145/3025453.3025741
  26. [26] William Delamare, Chaklam Silpasuwanchai, Sayan Sarcar, Toshiaki Shiraki, and Xiangshi Ren. 2019. On Gesture Combination: An Exploration of a Solution to Augment Gesture Interaction. In Proceedings of the 2019 ACM International Conference on Interactive Surfaces and Spaces (Daejeon, Republic of Korea) (ISS '19). Association for Computing Machinery, New York, NY, USA, 135–146. https://doi.org/10.1145/3343055.3359706
  27. [27] Henrik H. Ehrsson. 2020. Chapter 8 - Multisensory processes in body ownership. (2020), 179–200. https://doi.org/10.1016/B978-0-12-812492-5.00008-5
  28. [28] Mar Gonzalez-Franco, Parastoo Abtahi, and Anthony Steed. 2019. Individual Differences in Embodied Distance Estimation in Virtual Reality. In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). 941–943. https://doi.org/10.1109/VR.2019.8798348
  29. [29] Mar Gonzalez-Franco and Christopher C Berger. 2019. Avatar embodiment enhances haptic confidence on the out-of-body touch illusion. IEEE Transactions on Haptics 12, 3 (2019), 319–326.
  30. [30] Mar Gonzalez-Franco, Brian Cohn, Eyal Ofek, Dalila Burin, and Antonella Maselli. 2020. The Self-Avatar Follower Effect in Virtual Reality. In 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). 18–25. https://doi.org/10.1109/VR46266.2020.00019
  31. [31] Mar Gonzalez-Franco and Tabitha C Peck. 2018. Avatar embodiment. Towards a standardized questionnaire. Frontiers in Robotics and AI 5 (2018), 74.
  32. [32] Mar González-Franco, Tabitha C Peck, Antoni Rodríguez-Fornells, and Mel Slater. 2014. A threat to a virtual hand elicits motor cortex activation. Experimental Brain Research 232 (2014), 875–887.
  33. [33] Guy M Goodwin, D Ian McCloskey, and Peter BC Matthews. 1972. Proprioceptive illusions induced by muscle vibration: contribution by muscle spindles to perception? Science 175, 4028 (1972), 1382–1384.
  34. [34] Claire C Gordon, Cynthia L Blackwell, Bruce Bradtmiller, Joseph L Parham, Patricia Barrientos, Stephen P Paquette, Brian D Corner, Jeremy M Carson, Joseph C Venezia, Belva M Rockwell, et al. 2014. 2012 Anthropometric Survey of US Army Personnel: Methods and Summary Statistics. Army Natick Soldier Research Development and Engineering Center, MA, Tech. Rep. (2014).
  35. [35] Sebastian Günther, Florian Müller, Markus Funk, Jan Kirchner, Niloofar Dezfuli, and Max Mühlhäuser. 2018. TactileGlove: Assistive Spatial Guidance in 3D Space through Vibrotactile Navigation. In Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference (Corfu, Greece) (PETRA '18). Association for Computing Machinery, New York, NY, USA, 273–280. https://doi.org/10.1145/3197768.3197785
  36. [36] Nur Al-huda Hamdan, Adrian Wagner, Simon Voelker, Jürgen Steimle, and Jan Borchers. 2019. Springlets: Expressive, Flexible and Silent On-Skin Tactile Interfaces. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (Glasgow, Scotland, UK) (CHI '19). Association for Computing Machinery, New York, NY, USA, 1–14. https://doi.org/10.1145/3290605.3300718
  37. [37] Masaki Haruna, Masaki Ogino, and Toshiaki Koike-Akino. 2020. Proposal and evaluation of visual haptics for manipulation of remote machine system. Frontiers in Robotics and AI 7 (2020), 529040.
  38. [38] David Hecht and Miriam Reiner. 2009. Sensory dominance in combinations of audio, visual and haptic stimuli. Experimental Brain Research 193 (2009), 307–314.
  39. [39] Alexandra Ion, Edward Jay Wang, and Patrick Baudisch. 2015. Skin Drag Displays: Dragging a Physical Tactor across the User's Skin Produces a Stronger Tactile Stimulus than Vibrotactile. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (Seoul, Republic of Korea) (CHI '15). Association for Computing Machinery, New York, NY, USA, 2501–2504. https://doi.org/10.1145/2702123.2702459
  40. [40] Seungwoo Je, Myung Jin Kim, Woojin Lee, Byungjoo Lee, Xing-Dong Yang, Pedro Lopes, and Andrea Bianchi. 2019. Aero-plane: A Handheld Force-Feedback Device that Renders Weight Motion Illusion on a Virtual 2D Plane. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (New Orleans, LA, USA) (UIST '19). Association for Computing Machinery, New York, NY, USA, 763–775. https://doi.org/10.1145/3332165.3347926
  41. [41] Seungwoo Je, Brendan Rooney, Liwei Chan, and Andrea Bianchi. 2017. tactoRing: A Skin-Drag Discrete Display. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (Denver, Colorado, USA) (CHI '17). Association for Computing Machinery, New York, NY, USA, 3106–3114. https://doi.org/10.1145/3025453.3025703
  42. [42] Sungchul Jung, Gerd Bruder, Pamela J. Wisniewski, Christian Sandor, and Charles E. Hughes. 2018. Over My Hand: Using a Personalized Hand in VR to Improve Object Size Estimation, Body Ownership, and Presence. In Proceedings of the 2018 ACM Symposium on Spatial User Interaction (Berlin, Germany) (SUI '18). Association for Computing Machinery, New York, NY, USA, 60–68. https://doi.org/10.1145/3267782.3267920
  43. [43] Oliver Beren Kaul and Michael Rohs. 2017. HapticHead: A Spherical Vibrotactile Grid around the Head for 3D Guidance in Virtual and Augmented Reality. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (Denver, Colorado, USA) (CHI '17). Association for Computing Machinery, New York, NY, USA, 3729–3740. https://doi.org/10.1145/3025453.3025684
  44. [44] Konstantina Kilteni, Raphaela Groten, and Mel Slater. 2012. The sense of embodiment in virtual reality. Presence: Teleoperators and Virtual Environments 21, 4 (2012), 373–387.
  45. [45] Konstantina Kilteni, Antonella Maselli, Konrad P Kording, and Mel Slater. 2015. Over my fake body: body ownership illusions for studying the multisensory basis of own-body perception. Frontiers in Human Neuroscience 9 (2015), 141.
  46. [46] Konstantina Kilteni, Jean-Marie Normand, Maria V Sanchez-Vives, and Mel Slater. 2012. Extending body space in immersive virtual reality: a very long arm illusion. PLoS ONE 7, 7 (2012), e40867.
  47. [47] Hwan Kim, HyeonBeom Yi, Hyein Lee, and Woohun Lee. 2018. HapCube: A Wearable Tactile Device to Provide Tangential and Normal Pseudo-Force Feedback on a Fingertip. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (Montreal QC, Canada) (CHI '18). Association for Computing Machinery, New York, NY, USA, 1–13. https://doi.org/10.1145/3173574.3174075
  48. [48] Myung Jin Kim, Neung Ryu, Wooje Chang, Michel Pahud, Mike Sinclair, and Andrea Bianchi. 2022. SpinOcchio: Understanding Haptic-Visual Congruency of Skin-Slip in VR with a Dynamic Grip Controller. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems (New Orleans, LA, USA) (CHI '22). Association for Computing Machinery, New York, NY, USA, Article 433, 14 pages. https://doi.org/10.1145/3491102.3517724
  49. [49] Kenri Kodaka, Yutaro Sato, and Kento Imai. 2022. The slime hand illusion: Nonproprioceptive ownership distortion specific to the skin region. i-Perception 13, 6 (2022), 20416695221137731.
  50. [50] Ryota Kondo, Sachiyo Ueda, Maki Sugimoto, Kouta Minamizawa, Masahiko Inami, and Michiteru Kitazaki. 2018. Invisible Long Arm Illusion: Illusory Body Ownership by Synchronous Movement of Hands and Feet. In ICAT-EGVE 2018 - 28th International Conference on Artificial Reality and Telexistence and 23rd Eurographics Symposium on Virtual Environments. The Eurographics Association, 21–28. https://doi.org/10.2312/egve.20181310
  51. [51] Andrey Krekhov, Sebastian Cmentowski, Katharina Emmerich, and Jens Krüger. 2019. Beyond Human: Animals as an Escape from Stereotype Avatars in Virtual Reality Games. In Proceedings of the Annual Symposium on Computer-Human Interaction in Play (Barcelona, Spain) (CHI PLAY '19). Association for Computing Machinery, New York, NY, USA, 439–451. https://doi.org/10.1145/3311350.3347172
  52. [52] David R Labbe, Kean Kouakoua, Rachid Aissaoui, Sylvie Nadeau, and Cyril Duclos. 2021. Proprioceptive stimulation added to a walking self-avatar enhances the illusory perception of walking in static participants. Frontiers in Virtual Reality 2 (2021), 557783.
  53. [53] R.W. Lindeman, J.L. Sibert, and J.N. Templeman. 2001. The effect of 3D widget representation and simulated surface constraints on interaction in virtual environments. 141–148. https://doi.org/10.1109/VR.2001.913780
  54. [54] Pedro Lopes, Alexandra Ion, and Patrick Baudisch. 2015. Impacto: Simulating Physical Impact by Combining Tactile Stimulation with Electrical Muscle Stimulation. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (Charlotte, NC, USA) (UIST '15). Association for Computing Machinery, New York, NY, USA, 11–19. https://doi.org/10.1145/2807442.2807443
  55. [55] Pedro Lopes, Sijing You, Lung-Pan Cheng, Sebastian Marwecki, and Patrick Baudisch. 2017. Providing Haptics to Walls & Heavy Objects in Virtual Reality by Means of Electrical Muscle Stimulation. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (Denver, Colorado, USA) (CHI '17). Association for Computing Machinery, New York, NY, USA, 1471–1482. https://doi.org/10.1145/3025453.3025600
  56. [56] Andrea Webb Luangrath, Joann Peck, William Hedgcock, and Yixiang Xu. 2022. Observing product touch: The vicarious haptic effect in digital marketing and virtual reality. Journal of Marketing Research 59, 2 (2022), 306–326.
  57. [57] Hu Luo, Zemin Wang, Zhicheng Wang, Yuru Zhang, and Dangxiao Wang. 2023. Perceptual Localization Performance of the Whole Hand Vibrotactile Funneling Illusion. IEEE Transactions on Haptics 16, 2 (2023), 240–250. https://doi.org/10.1109/TOH.2023.3266432
  58. [58] Yoky Matsuoka, Sonya J Allin, and Roberta L Klatzky. 2002. The tolerance for visual feedback distortions in a virtual environment. Physiology & Behavior 77, 4-5 (2002), 651–655.
  59. [59] Taha K. Moriyama, Ayaka Nishi, Rei Sakuragi, Takuto Nakamura, and Hiroyuki Kajimoto. 2018. Development of a wearable haptic device that presents haptics sensation of the finger pad to the forearm. In 2018 IEEE Haptics Symposium (HAPTICS). 180–185. https://doi.org/10.1109/HAPTICS.2018.8357173
  60. [60] Jean-Marie Normand, Elias Giannopoulos, Bernhard Spanlang, and Mel Slater. 2011. Multisensory stimulation can induce an illusion of larger belly size in immersive virtual reality. PLoS ONE 6, 1 (2011), e16128.
  61. [61] Nicolas Nostadt, David A Abbink, Oliver Christ, and Philipp Beckerle. 2020. Embodiment, presence, and their intersections: teleoperation and beyond. ACM Transactions on Human-Robot Interaction (THRI) 9, 4 (2020), 1–19.
  62. [62] Riku Otono, Adélaïde Genay, Monica Perusquía-Hernández, Naoya Isoyama, Hideaki Uchiyama, Martin Hachet, Anatole Lécuyer, and Kiyoshi Kiyokawa. 2023. I'm Transforming! Effects of Visual Transitions to Change of Avatar on the Sense of Embodiment in AR. In 2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR). 83–93. https://doi.org/10.1109/VR55154.2023.00024
  63. [63] Tabitha C Peck and Mar Gonzalez-Franco. 2021. Avatar embodiment. A standardized questionnaire. Frontiers in Virtual Reality 1 (2021), 575943.
  64. [64] Ivelina V Piryankova, Hong Yu Wong, Sally A Linkenauger, Catherine Stinson, Matthew R Longo, Heinrich H Bülthoff, and Betty J Mohler. 2014. Owning an overweight or underweight body: distinguishing the physical, experienced and virtual body. PLoS ONE 9, 8 (2014), e103428.
  65. [65] Pornthep Preechayasomboon and Ali Israr. 2020. Crossing the Chasm: Linking with the Virtual World through a Compact Haptic Actuator. In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems (Honolulu, HI, USA) (CHI EA '20). Association for Computing Machinery, New York, NY, USA, 1–4. https://doi.org/10.1145/3334480.3383137
  66. [66] Catherine Preston and Roger Newport. 2012. How long is your arm? Using multisensory illusions to modify body image from the third person perspective. Perception 41, 2 (2012), 247–249.
  67. [67] Nora Preuss, B Laufey Brynjarsdóttir, and H Henrik Ehrsson. 2018. Body ownership shapes self-orientation perception. Scientific Reports 8, 1 (2018), 16062.
  68. [68] Zhan Fan Quek, Samuel B Schorr, Ilana Nisky, Allison M Okamura, and William R Provancher. 2014. Augmentation of stiffness perception with a 1-degree-of-freedom skin stretch device. IEEE Transactions on Human-Machine Systems 44, 6 (2014), 731–742.
  69. [69] Arran T Reader and Laura Crucianelli. 2019. A multisensory perspective on the role of the amygdala in body ownership. The Journal of Neuroscience 39, 39 (2019), 7645.
  70. [70] Amirreza Sadeghifar, Shahab Ilka, Hasan Dashtbani, and Mansour Sahebozamani. 2014. A comparison of glenohumeral internal and external range of motion and rotation strength in healthy and individuals with recurrent anterior instability. Archives of Bone and Joint Surgery 2, 3 (2014), 215.
  71. [71] Steeven Villa Salazar, Claudio Pacchierotti, Xavier de Tinguy, Anderson Maciel, and Maud Marchal. 2020. Altering the stiffness, friction, and shape perception of tangible objects in virtual reality using wearable haptics. IEEE Transactions on Haptics 13, 1 (2020), 167–174.
  72. [72] Majed Samad, Albert Jin Chung, and Ladan Shams. 2015. Perception of body ownership is driven by Bayesian sensory inference. PLoS ONE 10, 2 (2015), e0117178.
  73. [73] Valentin Schwind, Lorraine Lin, Massimiliano Di Luca, Sophie Jörg, and James Hillis. 2018. Touch with foreign hands: the effect of virtual hand appearance on visual-haptic integration. In Proceedings of the 15th ACM Symposium on Applied Perception (Vancouver, British Columbia, Canada) (SAP '18). Association for Computing Machinery, New York, NY, USA, Article 9, 8 pages. https://doi.org/10.1145/3225153.3225158
  74. [74] Irene Senna, Angelo Maravita, Nadia Bolognini, and Cesare V Parise. 2014. The marble-hand illusion. PLoS ONE 9, 3 (2014), e91688.
  75. [75] Silvia Serino, Federica Scarpina, Alice Chirico, Antonios Dakanalis, Daniele Di Lernia, Desirée Colombo, Valentina Catallo, Elisa Pedroli, and Giuseppe Riva. 2020. Gulliver's virtual travels: Active embodiment in extreme body sizes for modulating our body representations. Cognitive Processing 21 (2020), 509–520.
  76. [76] Vivian Shen, Craig Shultz, and Chris Harrison. 2022. Mouth Haptics in VR using a Headset Ultrasound Phased Array. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems (New Orleans, LA, USA) (CHI '22). Association for Computing Machinery, New York, NY, USA, Article 275, 14 pages. https://doi.org/10.1145/3491102.3501960
  77. [77] Youngbo Aram Shim, Taejun Kim, and Geehyuk Lee. 2022. QuadStretch: A Forearm-wearable Multi-dimensional Skin Stretch Display for Immersive VR Haptic Feedback. In Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems (New Orleans, LA, USA) (CHI EA '22). Association for Computing Machinery, New York, NY, USA, Article 202, 4 pages. https://doi.org/10.1145/3491101.3519908
  78. [78] Filip Škola and Fotis Liarokapis. 2016. Examining the effect of body ownership in immersive virtual and augmented reality environments. The Visual Computer 32 (2016), 761–770.
  79. [79] Mel Slater, Bernhard Spanlang, Maria V Sanchez-Vives, and Olaf Blanke. 2010. First person experience of body transfer in virtual reality. PLoS ONE 5, 5 (2010), e10564.
  80. [80] Bernhard Spanlang, Jean-Marie Normand, David Borland, Konstantina Kilteni, Elias Giannopoulos, Ausiàs Pomés, Mar González-Franco, Daniel Perez-Marcos, Jorge Arroyo-Palacios, Xavi Navarro Muncunill, et al. 2014. How to build an embodiment lab: achieving body representation illusions in virtual reality. Frontiers in Robotics and AI 1 (2014), 9.
  81. [81] Frank Steinicke, Gerd Bruder, Jason Jerald, Harald Frenz, and Markus Lappe. 2009. Estimation of detection thresholds for redirected walking techniques. IEEE Transactions on Visualization and Computer Graphics 16, 1 (2009), 17–27.
  82. [82] Jennifer L Sullivan, Shivam Pandey, Michael D Byrne, and Marcia K O'Malley. 2021. Haptic feedback based on movement smoothness improves performance in a perceptual-motor task. IEEE Transactions on Haptics 15, 2 (2021), 382–391.
  83. [83] Yujie Tao, Cheng Yao Wang, Andrew D Wilson, Eyal Ofek, and Mar Gonzalez-Franco. 2023. Embodying Physics-Aware Avatars in Virtual Reality. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (Hamburg, Germany) (CHI '23). Association for Computing Machinery, New York, NY, USA, Article 254, 15 pages. https://doi.org/10.1145/3544548.3580979
  84. [84] Hsin-Ruey Tsai, Yuan-Chia Chang, Tzu-Yun Wei, Chih-An Tsao, Xander Chin-yuan Koo, Hao-Chuan Wang, and Bing-Yu Chen. 2021. GuideBand: Intuitive 3D Multilevel Force Guidance on a Wristband in Virtual Reality. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (Yokohama, Japan) (CHI '21). Association for Computing Machinery, New York, NY, USA, Article 134, 13 pages. https://doi.org/10.1145/3411764.3445262
  85. [85] Manos Tsakiris. 2010. My body in the brain: a neurocognitive model of body-ownership. Neuropsychologia 48, 3 (2010), 703–712.
  86. [86] Wen-Jie Tseng, Samuel Huron, Eric Lecolinet, and Jan Gugenheimer. 2023. FingerMapper: Mapping Finger Motions onto Virtual Arms to Enable Safe Virtual Reality Interaction in Confined Spaces. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (Hamburg, Germany) (CHI '23). Association for Computing Machinery, New York, NY, USA, Article 874, 14 pages. https://doi.org/10.1145/3544548.3580736
  87. [87] Björn Van Der Hoort, Arvid Guterstam, and H Henrik Ehrsson. 2011. Being Barbie: the size of one's own body determines the perceived size of the world. PLoS ONE 6, 5 (2011), e20195.
  88. [88] Chi Wang, Da-Yuan Huang, Shuo-wen Hsu, Chu-En Hou, Yeu-Luen Chiu, Ruei-Che Chang, Jo-Yu Lo, and Bing-Yu Chen. 2019. Masque: Exploring Lateral Skin Stretch Feedback on the Face with Head-Mounted Displays. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (New Orleans, LA, USA) (UIST '19). Association for Computing Machinery, New York, NY, USA, 439–451. https://doi.org/10.1145/3332165.3347898
  89. [89] Chi Wang, Da-Yuan Huang, Shuo-Wen Hsu, Cheng-Lung Lin, Yeu-Luen Chiu, Chu-En Hou, and Bing-Yu Chen. 2020. Gaiters: Exploring Skin Stretch Feedback on Legs for Enhancing Virtual Reality Experiences. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (Honolulu, HI, USA) (CHI '20). Association for Computing Machinery, New York, NY, USA, 1–14. https://doi.org/10.1145/3313831.3376865
  90. [90] Haoyu Wang, Xiang Cheng, Pei Huang, Meng Yu, Jiaqi Ma, Shigang Peng, Yue Cheng, Yuan Yu, Weimin Yang, Pengfei Wang, and Zhiwei Jiao. 2022. A Soft Electro-Hydraulic Pneumatic Actuator with Self-Sensing Capability toward Multi-Modal Haptic Feedback. Actuators 11, 3 (2022). https://doi.org/10.3390/act11030074
  91. [91] Eric Whitmire, Hrvoje Benko, Christian Holz, Eyal Ofek, and Mike Sinclair. 2018. Haptic Revolver: Touch, Shear, Texture, and Shape Rendering on a Reconfigurable Virtual Reality Controller. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (Montreal QC, Canada) (CHI '18). Association for Computing Machinery, New York, NY, USA, 1–12. https://doi.org/10.1145/3173574.3173660
  92. [92] Andrea Stevenson Won, Jeremy Bailenson, Jimmy Lee, and Jaron Lanier. 2015. Homuncular flexibility in virtual reality. Journal of Computer-Mediated Communication 20, 3 (2015), 241–259.
  93. [93] Shunki Yamashita, Ryota Ishida, Arihide Takahashi, Hsueh-Han Wu, Hironori Mitake, and Shoichi Hasegawa. 2018. Gum-gum shooting: inducing a sense of arm elongation via forearm skin-stretch and the change in the center of gravity. In ACM SIGGRAPH 2018 Emerging Technologies. Association for Computing Machinery, New York, NY, USA. https://doi.org/10.1145/3214907.3214909
  94. [94] N Zarzycka and S Załuska. 1989. [Measurements of the forearm in inhabitants of the Lublin region]. Annales Universitatis Mariae Curie-Sklodowska. Sectio D: Medicina 44 (1989), 85–92. https://api.semanticscholar.org/CorpusID:8625063
  95. [95] Andre Zenner and Antonio Krüger. 2017. Shifty: A weight-shifting dynamic passive haptic proxy to enhance object perception in virtual reality. IEEE Transactions on Visualization and Computer Graphics 23, 4 (2017), 1285–1294.
