DOI: 10.1145/3613905.3651121 · CHI Conference Proceedings
Work in Progress

Context matters: Investigating information sharing in mixed-visual ability social interactions

Published: 11 May 2024

Abstract

Social inclusion of disabled people has been a topic of interest in HCI research, led by the rise of ubiquitous and camera-based technologies. As the research area grows, a comprehensive understanding of blind, partially sighted (BPS), and sighted people's needs in various social settings is needed to fully inform the design of social technologies. To address this, we conducted semi-structured individual and group interviews with 12 BPS and eight sighted participants. Our findings show that the information-sharing needs of BPS and sighted people vary across social contexts (illustrated in Figure 1). While currently depending on support from sighted companions, BPS participants expressed a strong sense of independence and agency. We discuss the tensions between BPS people's information needs and sighted people's privacy concerns, and the implications for the design of social technologies to support the social inclusion of BPS people.


1 INTRODUCTION

Social interaction supports the critical human need to belong and to form interpersonal relationships [11]. Effective interpersonal communication and social interaction are necessary for professional and personal relationships [37]. Non-verbal communication, such as facial expressions, head movements, body posture, and hand gestures, accounts for more than half of interpersonal communication [6, 7, 24, 28]. Blind and partially sighted (BPS) people often encounter difficulties in social interactions due to the challenges in understanding the social environment and non-verbal social cues of people [14]. Limited access to non-verbal cues in social interactions can cause misunderstanding, resulting in uncomfortable and awkward situations [37] and communication breakdown [36], such as difficulties catching up with a conversation, knowing others’ feelings, and recognizing familiar people [32]. Such barriers are often compounded by impatience and negative attitudes from sighted people [22], leading to social exclusion and isolation of BPS people in the long term [37, 41, 44].

‘Inclusive social interactions’ has been a topic of growing interest in HCI and accessibility research. Most research has focused on camera-based technologies with considerable potential and strong socio-ethical implications [1, 26, 34, 45]. However, prior research has identified ethical and design challenges of this technology, such as the consent of bystanders and the social acceptability of such devices in public settings [29, 33], alongside privacy concerns raised by camera-based assistive technologies (AT) [4]. Furthermore, a lack of contextual understanding of BPS people's challenges and capabilities adds to the burden of exclusion in social interaction, as BPS people may have to deal with irrelevant or insufficient information in addition to managing poorly designed AT [40]. On the other hand, sighted bystanders interacting with BPS people in a work environment may have privacy concerns regarding ‘infinitely sharing’ personal information through AT [2]. Given that the willingness to share personal information is highly contextual, more research is needed to investigate the information needs of BPS people in diverse social contexts and the information-sharing concerns of sighted people who have experience interacting with BPS people. To this end, we address the following research questions:

RQ1: What are BPS and sighted people's information-sharing needs and preferences in social interactions?

RQ2: How can technology facilitate mixed visual ability social interactions?

The contributions of this work include (1) a deeper insight into the information needs of BPS people and barriers to social interaction in diverse social situations and (2) design implications for technologies to address the diverse information-sharing needs of BPS and sighted people across social situations.

Figure 1: Information sharing needs of BPS and sighted people in social settings


2 RELATED WORK

Many studies have explored the facilitation of BPS people's social interaction by bridging the information gap caused by the inaccessibility of non-verbal interaction [2]. Camera-based assistive technologies have been developed to transmit facial expressions and eye gaze to BPS users. Examples include the Accessibility Bot on Facebook Messenger [45] and the iCare Interaction Assistant, a wearable device that assists BPS people in identifying their communication partners [31]. Further examples include a haptic belt prototype that conveys people's facial expressions to BPS users through vibrotactile cues [13]; VibroGlove, which transmits the communicator's facial expressions via distinct vibration patterns [32]; and Tactile Band, which enables BPS people to access gaze signals from surrounding sighted people through tactile feedback [39]. These systems take images and translate them into haptics. Separately, Microsoft Seeing AI includes a facial recognition feature to help BPS users identify people's gender, age, and emotion from their pictures, and Microsoft PeopleLens [21] is a modified HoloLens that continuously captures images and provides BPS users with information about the name, identity, pose, and gaze direction of detected individuals and the total number of nearby people. Both translate vision into audio [8].

The standing location, attention direction, identities, physical appearance, facial expressions, gestures and body motions of surrounding people, and the number of people standing in front of them have been identified as information needs of BPS people in social interactions [30]. In addition, knowing people's identity, relative location, physical attributes, and facial expressions is essential for BPS people [73], as is information about people's relationships and availability for conversation [48]. BPS people desire to know the number of individuals around and their activity, identity, and proximity to feel secure in social contexts [1]. However, this research has focused on people's visual impairment without fully considering the capabilities of BPS people. Although Zhao et al. [43, 45] investigated strategies that BPS people use to recognize people in social contexts, empirical knowledge is still lacking on how BPS people make accommodations for themselves to perceive visual social cues and on the limitations of these solutions.

Providing the non-verbal information required by BPS people is the first critical design requirement of assistive technologies for facilitating social interactions [38]. Researchers and designers must recognize the most essential information requirements of BPS people in social interactions. To fill this gap, we investigated the non-verbal information needs of BPS people from a more comprehensive perspective by understanding both their capabilities and their challenges in social interactions. As communication mechanisms, social interactions are a two-way street. Therefore, besides looking into the information needs of BPS people, we also explore sighted people's needs and barriers in communicating with BPS people.


3 METHOD

We conducted individual and group semi-structured interviews with 20 BPS and sighted participants. We built on previous research to investigate the information-sharing needs of BPS and sighted people, the effects of situational and contextual factors, and the potential role of technology in supporting mixed visual-ability social interactions [2, 6, 19, 25].

3.1 Interviews

We conducted online semi-structured individual and group interviews with BPS people and sighted people via Microsoft Teams; each interview lasted between 40 and 60 minutes. Each participant was offered a £10 or $10 Amazon voucher after the interview as compensation for their time. Ethical approval was obtained from the university's departmental ethics committee.

3.1.1 Individual Interviews.

The individual interviews with BPS participants focused on their lived experiences of social interactions with BPS and sighted people and the information needs and barriers that affect their engagement in social activities. The interview comprised four sections: (1) general information about participants’ lived experience of visual impairment, (2) previous experiences of social interactions and reflecting on positive and negative experiences, (3) strategies to navigate social interactions in mixed visual-ability social settings, and (4) attitudes towards technology to facilitate social interactions.

3.1.2 Group Interviews.

The findings from the individual interviews were essential to developing an initial understanding of our research topic. However, we recognized the value of group discussion for surfacing contrasting perspectives and diverse ideas [35, 43]. Therefore, two group interviews were conducted with mixed visual ability participants: Group A with three BPS participants (VGP1-VGP3) and three sighted participants (SGP3-SGP5), and Group B with two BPS participants (VGP4, VGP5) and two sighted participants (SGP1, SGP2). In the group interviews, we conducted role-plays of six social scenarios between BPS and sighted participants (approaching people for help, a workplace gathering, a party, interacting with friends or family members, making new friends, and a public space or public transport) to further investigate the information needs of BPS participants and the information-sharing preferences of sighted participants.

3.2 Participants

Twelve BPS (4 female, 8 male) and eight sighted people (5 female, 3 male) with experience of interacting with BPS people participated in the interviews. Participants were recruited through word of mouth, disabled people's organizations, and public social media forums.

The 12 BPS participants (VP1 – VP7, VGP1 – VGP5) were aged between 21 and 77 years (mean = 33.83, SD = 16.63). The majority lived in the UK, except VP2 and VGP4, who were from the USA and China, respectively. All participants were partially sighted with some form of central or peripheral vision, except VP1 and VP4, who had no residual vision. One participant (VP2) was unemployed, one was retired (VP1), and the rest volunteered (VGP2), studied, or were employed. The participants had a minimum of 4 years of lived experience of visual impairment; none reported having a visual impairment since birth. Table 1 summarizes the BPS participants for the individual and group interviews.

The eight sighted participants (SP1 – SP3, SGP1 – SGP5) were aged between 22 and 39 years (mean = 26.75, SD = 4.89) and lived in the UK, except SP1 and SGP4, who lived in the USA. All participants had been related to a BPS person, with partial central or peripheral vision loss or total blindness (SP2, SP3), for at least two years since the onset of the visual impairment. Detailed demographic information of the participants is presented in Table 2.

3.3 Data Analysis

The individual and group interviews were audio-recorded and transcribed. The interview procedure is described in Appendix B. We used an inductive thematic analysis [15] approach to analyze the interview data. First, transcripts were anonymized to remove any personal information about the participants that was brought up in the interviews. Next, we used NVivo 12 software to conduct the initial coding and then organize the codes into three broader themes: (1) BPS People's Information Needs in Social Interactions, (2) Information-sharing Concerns in Technology-Mediated Social Interactions, and (3) Exploring Information Sharing in Social Contexts.


4 FINDINGS

4.1 BPS People's Information Needs in Social Interactions

BPS participants described the challenges experienced during social interactions. The most common issue was feeling isolated, especially in group gatherings, due to the lack of access to non-verbal cues and visual information and limited support from sighted companions. In general, BPS participants relied on their capabilities and descriptions from sighted companions to access the non-verbal visual information. However, this was severely restricted due to its dependence on the sighted companions’ knowledge of and familiarity with the social setting.

4.1.1 Information needs of BPS people in social interactions.

All BPS participants preferred visual information such as spatial layout and characteristics, general behavior of the people in their surroundings, and non-verbal social cues (movement and gestures of people during an interaction, facial expression, appearance). Information about people's facial expressions was the most frequently mentioned by BPS participants. VP7, who is partially sighted, shared the concern that not being able to recognize people often stops them from initiating conversations and, therefore, leads to social awkwardness as the other person may not be aware of their visual impairment (which can vary due to ambient light, stress, and other health factors) at the time —“... help me to know how I should also feel about the particular person. If someone is approaching me looking so happy to see me, I should also give back a smile expression. And it would be much helpful.” (VP7)

In addition to the facial expressions, partially sighted participants were also interested in general descriptions of the people around them, including what they were wearing and how they looked. Interestingly, many BPS participants expressed great interest in knowing people's appearance and attire yet felt embarrassed to ask. One participant (VP2) showed concerns about people not following health and safety guidelines in front of BPS people because of their visual impairment. VP4 commented that knowing the description of the person they interact with can enable BPS people to be aware of their surroundings for potential health and safety risks — “What the person is wearing. So, if anything happens to me, I might explain it to someone.” (VP4)

Gender identity was also deemed important to appropriately address people in social settings, yet BPS participants shared the concern that voice was not sufficient to identify an individual's gender and could lead to awkwardness. — “We have a friend who sounds like a guy, but she's actually a girl. So I just feel we should know the gender of the person so we can appropriately address them.” (VP6)

Some BPS participants reported that it would be useful to receive information about people's ages, since sighted people can estimate age from figures and faces but BPS people cannot. BPS participants had mixed preferences for knowing the ethnicity of the person during the interaction; some participants thought the ethnicity information was unnecessary and did not impact the interaction, while other participants mentioned a passing interest (VP1, VGP2).

Where possible, partially sighted participants preferred to use their residual vision to familiarize themselves with the environment and identify people. VGP2, who lost her central vision, described her experience of recognizing people from the outline of people's shape and their dressing styles and how she could tell whether people were available for help — “I can kind of tell by their body language that they're open. I try not to ask somebody who looks busy or looks like they're in a hurry or agitated.” (VGP2)

Apart from visual and behavioral attributes, BPS participants also wanted to learn about the physical and spatial environment when socially interacting with people — “I would like to know how many people there are and what's the sequence of sitting and what's in front of me like that meeting room is how they set ... And then, if there like a party, that where the table seat and what's the surrounding, how the, how the room looks like.” (VGP4)

4.1.2 The role of sighted companion in addressing these needs.

Information sharing is a complex issue, as noted by VGP2, “I want access to all the information, but I also want to choose what I want to receive.” Particularly as BPS people experience social isolation and loss of agency due to reliance on sighted people, the need for access to information that is easily available to sighted people is not trivial. It highlights the societal and cultural injustice many people face due to their disability, as eloquently put by VGP2 — “And don't forget the sighted person can see... It's so blooming unfair. A sighted person is looking, and she's calculating all the time. But a VI person can't do that. There's, no fairness in that.” (VGP2)

BPS participants described the significant impact of sighted companions and caregivers in social settings; where sighted companions were forthcoming, supportive, and well-informed, BPS participants found the social experience enjoyable and felt included as the companion would describe the non-verbal information to them. This information often included facial expressions and reactions of people around them — “When having group discussions, I don't usually associate with them [other sighted] much because of my condition. So I have a friend that normally comes down to me in one-on-one discussions and explains it more to me.” (VP5)

Another major problem was sighted people's lack of understanding and awareness of visual impairment. Some BPS participants felt sighted caregivers were not always reliable and often overlooked BPS people's needs. The lack of awareness of visual impairment and of the support needs of BPS people meant that sighted people sometimes failed to provide useful information or explained it in a way that caused confusion. Furthermore, participants commented that large social gatherings could be less inclusive due to several simultaneous activities, leading to BPS people being unable to participate in group activities like games and feeling left out — “The event that I attended, I went there with a friend, and my friend left without leaving a note to me when she had gone. So I just hung there.” (VP4)

Sighted participants also reflected on the challenges of communicating in mixed visual-ability group gatherings. One sighted participant (SP1) described that verbal communication with BPS people was not efficient and required considerable effort to give audio descriptions, which was time-consuming and needed patience. While advocating for verbal communication, SP1 reflected that the time and effort required in describing often led to the social exclusion of BPS people — “I do not feel too good because sometimes it makes things slow. Sometimes you might just have just wanted things to happen fast and wait for no one... It takes a longer time to explain and then communicate the process.” (SP1)

4.2 Information-sharing Concerns in Technology-Mediated Social Interactions

We asked sighted participants about their preferences for sharing their visual and behavioral information with BPS people, and the responses included mixed opinions and concerns about the privacy and confidentiality of personal information, particularly if shared through technology. Some participants were reluctant to share personal information such as their name, age, and ethnicity, which would not be obvious to a sighted person at a glance but did not mind sharing visual information (height approximation, attire, movement, and behavior) — “I feel it's fine sharing [visual cues] with them as far as it's not personal information then fine.” (SGP4)

Meanwhile, several participants did not object to BPS people having access to their personal and social information, which they already share on online social media — “I have [shared my information] on my Facebook or my LinkedIn account. I think I'm willing to share this information because I put them there, and if he could see them... he would definitely see them. It's not fair to not give the same treatment to somebody who can't see and a person who can see. So I think it's OK for me for a blind person to know my information that's already on there.” (SP3)

Three BPS participants and two sighted participants suggested that the technologies should ask for permission before their information is shared with BPS people. Sighted participants desired to have full control in choosing what information is shared about them, when, and with whom (SGP2). Furthermore, SP1 commented that technology should allow them to share and withdraw any information if they later change their mind — “When at the point of activation ... of course, I would register, probably you wanna capture information or something. So I feel, probably, a consent form can be sent to my mail. OK. This person wants this information, and are you willing to share this information with this person? Then I can tick the boxes, and at the same time, I can leave the boxes unticked. [...]And then I think it should also give me the right to withdraw at any given time.” (SP1)

Being asked for consent might give them a sense of security, confirming that only appropriate information is shared. One possible way this could happen, as suggested by VGP2, is to have a swipe-through mechanism for people in the vicinity to be able to initiate social interaction. This would provide both parties with the control to allow or decline the request to share information and initiate social interaction. However, this also raised concern about the usefulness of the technology if people are unwilling to share information, as it would lead to further exclusion and stigmatization of BPS people — “So if we don't give consent, will this device still tell me that I'm standing in front of, I don't know, Jonathan, who is 30 years old? If Jonathan ticks ‘No’ and I tick ‘No’, then the machine is not gonna work, is it?” (VGP2)

Another suggestion was to give consent to share their information each time they are in a social setting where a BPS person could be using this technology. Several participants, including VP6, suggested they would like to receive notifications when their information is shared. Others felt comfortable with sharing a “standard bio,” which could be shared with everyone without having to approve it each time — “I think I prefer to know who is able to access my information and decide whether or not I want to share it.” (VP6)

Participants discussed their preferences for information to share in a “standard bio” — emphasizing the need to be able to choose the information that should be shared and the ability to change the information-sharing preferences in different contexts — “There should be like a chat box where I'm asked certain things I feel comfortable about sharing... Would you like them to know your height, race, color, and all of that? Then I get to choose what I feel is comfortable for me. So, whatever I feel uncomfortable with should not be shared... that allows me to add and delete.” (SP2)
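The consent mechanisms participants described — a reusable “standard bio,” per-item opt-in, a consent prompt for sensitive requests, and the right to withdraw — can be expressed as a simple data model. The sketch below is purely illustrative, assuming hypothetical class and method names; it is not a proposed system design.

```python
from dataclasses import dataclass, field

@dataclass
class SharingProfile:
    """A sighted person's information-sharing preferences (a 'standard bio')."""
    # Items pre-approved for everyone (shared without per-request consent).
    standard_bio: dict = field(default_factory=dict)
    # Items that require explicit approval each time they are requested.
    on_request: dict = field(default_factory=dict)
    # Requesters from whom consent has been withdrawn.
    withdrawn: set = field(default_factory=set)

    def request(self, requester: str, items: list, approve) -> dict:
        """Return the subset of requested items the owner consents to share.

        `approve` is a callback standing in for the consent prompt
        participants described (e.g., a notification with tick-boxes)."""
        if requester in self.withdrawn:
            return {}
        shared = {k: v for k, v in self.standard_bio.items() if k in items}
        pending = {k: v for k, v in self.on_request.items() if k in items}
        if pending and approve(requester, sorted(pending)):
            shared.update(pending)
        return shared

    def withdraw(self, requester: str) -> None:
        """Revoke consent for all future requests, as SP1 suggested."""
        self.withdrawn.add(requester)
```

For example, a profile with `standard_bio={"attire": "blue coat"}` would release attire immediately, while a name listed in `on_request` would only be released after the owner approves the prompt; calling `withdraw` blocks that requester entirely, mirroring VGP2's observation that mutual refusal simply stops the exchange.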

4.3 Exploring Information Sharing in Social Contexts

We used scenarios to further investigate how different social situations influence the information needs of BPS people. These scenarios included situations that would normally occur in the participants’ routine activities: approaching people for help, attending a workplace gathering, attending an informal party, interacting with friends and family, making new friends, and being in a public space or on public transport. From the discussions with participants, we identified five information categories BPS people found useful about their environment, the people in their surroundings, and their interactions with people (as illustrated in Figure 1). Table 3 below describes the information needs of BPS people and the information-sharing concerns of other people in diverse social contexts.

Table 1:

Individual Interview Participants

| ID | Age | Gender | Location | Occupation | Visual Impairment | Years since Onset |
|----|-----|--------|----------|------------|-------------------|-------------------|
| VP1 | 77 | M | UK | Retired | Blind, no light perception | 40 |
| VP2 | 25 | M | USA | Not employed | Low visual acuity, with colour perception | 6 |
| VP3 | 28 | M | UK | Employed | Loss of central vision | 5 |
| VP4 | 25 | M | UK | Employed | Blind, with light perception | 6 |
| VP5 | 22 | M | UK | Student | Low visual acuity, with colour perception | 5 |
| VP6 | 21 | F | UK | Student | Low visual acuity, with colour perception | 5 |
| VP7 | 25 | F | UK | Student | Low visual acuity, with colour perception | 4 |

Group Interview Participants

| ID | Age | Gender | Location | Occupation | Visual Impairment | Years since Onset |
|----|-----|--------|----------|------------|-------------------|-------------------|
| VGP1 | 23 | M | UK | Employed | Low visual acuity, with colour perception | 4 |
| VGP2 | 48 | F | UK | Charity | Loss of central vision | 30 |
| VGP3 | 26 | M | UK | Employed | Loss of peripheral vision | 4 |
| VGP4 | 56 | F | China | Employed | Loss of central vision | 19 |
| VGP5 | 30 | M | UK | Employed | Low visual acuity, with colour perception | 4 |

Table 1: BPS participants’ demographic information

Table 2:

Individual Interview Participants

| ID | Age | Gender | Location | Relationship to BPS Person | Visual Impairment of Related BPS Person | Years since Onset |
|----|-----|--------|----------|----------------------------|------------------------------------------|-------------------|
| SP1 | 27 | F | USA | Niece | Low visual acuity, with colour perception | 5 |
| SP2 | 24 | F | UK | Cousin | Totally blind | 5 |
| SP3 | 24 | F | UK | Family Member | Totally blind | 7 |

Group Interview Participants

| ID | Age | Gender | Location | Relationship to BPS Person | Visual Impairment of Related BPS Person | Years since Onset |
|----|-----|--------|----------|----------------------------|------------------------------------------|-------------------|
| SGP1 | 26 | M | UK | Family Member | Loss of central vision | 8 |
| SGP2 | 22 | F | UK | Family Member | Low visual acuity, with colour perception | not disclosed |
| SGP3 | 25 | M | UK | Family Member | Loss of central vision | 5 |
| SGP4 | 27 | F | USA | Family Member | Low visual acuity, with colour perception | 2 |
| SGP5 | 39 | M | UK | Family Member | Loss of central vision | 6 |

Table 2: Sighted participants’ demographic information

Table 3:

| Social Context | Information Category | Information Need | Information Sharing Concern/Risk |
|----------------|----------------------|------------------|----------------------------------|
| Arriving at a gathering / public transport / work environment | Spatial / environmental information | Overall description of the space, the attendees present in the room, activities, and the general movement of the attendees. This would help create a general awareness of the space for the BPS person to situate themselves and comfortably navigate the space. | Low risk: participants did not share any concern as no personal information about them was needed at this stage. |
| | General appearance and behavior of the attendees | The number of people in the room, what they are wearing, what they are doing, and their location in the space. | Low-medium risk: participants did not share any specific concern as no personal information about them was needed at this stage. |
| Locating familiar people | Basic description of the person and non-verbal behavioral cues | The location of known persons in the vicinity and being guided to them. Once closer, knowing the behavior and emotional state of the person: Are they busy talking to someone else? Do they appear to be free and receptive? What are their facial expressions? | Medium-high risk: participants shared concern about personal information being shared and would like to decide with whom to share this information. Additionally, they would like to receive the same information about the recipient. In general, participants were more likely to share this information with a person of the same gender or with someone they know. |
| Interacting with someone | | The person's appearance (what they are wearing) and other attributes such as their height, age, gender, and ethnicity. Additionally, during interaction, non-verbal cues such as facial expressions, body language, and eye movement would enhance the social interaction. | |
| Interacting with a friend or family member | Intimate physical description of the person | The person's detailed appearance, such as the shape of their body, their eye and hair color, and whether their appearance has changed since the last interaction. | High risk: participants wanted to be able to decide which information is shared, with whom, and when. They would like to be asked each time someone requests this information in an interaction. |
Table 3: Information sharing in diverse social contexts


5 DISCUSSION

5.1 Information sharing in mixed visual-ability social interactions

Our findings highlight the tension between sighted people's privacy concerns and the desire to support the social inclusion of BPS people. Both BPS and sighted participants demonstrated great understanding and awareness of the value of non-verbal information in social interactions. Despite the desire to support the social inclusion of BPS people, sighted participants shared concerns regarding their personal information, such as descriptions of their appearance, attire, behavior, age, ethnicity, and other information being captured via camera-based technologies. These findings are consistent with previous research exploring the privacy concerns of bystanders in the context of camera-based technologies [3, 4, 17, 36]. Our findings also highlight contextual nuances that may affect sighted people's decisions to share information when the BPS person is not perceived as a threat. For example, the sighted participants showed more willingness to share personal information with BPS people of the same gender or with BPS people whom they had previously met.

These findings are in line with prior studies reporting the importance of consent [3]. Future research should further investigate the meaning of ‘consent’ to bystanders in different contexts. For example, if an individual accepts an invitation to a party, it may mean they have already granted consent to share their information at that party. To protect privacy and satisfy the needs of BPS people, a possible design solution could be a feedback system on the device that lets bystanders know they are detected and requests confirmation of the information to be shared. Consequently, sighted people should also have access to the information of the people with whom their information is being shared. As the social context is the main factor influencing both BPS people's information needs and bystanders’ information-sharing preferences, the two sets of preferences may overlap.

5.2 Context-adaptive social technologies to support information sharing

Our findings show that BPS people's information needs in social interactions depend on both the context and the BPS users themselves. This is in line with previous findings exploring the importance of context-specific information [4]. Our findings also identified the need for access to concise, relevant, and appropriate information, as participants unanimously agreed that information is only effective and useful when provided in a concise, appropriate, and user-friendly way. Both sighted and BPS participants desired control over what to share and what to receive and sought ways to customize their information sharing and receiving preferences according to the social context. In Figure 1, we highlight the varying information needs of BPS people in different contexts.

Context-adaptive or context-aware technologies [18] have been explored in HCI research in the context of personal informatics [16, 42], healthcare [9, 10], and learning [5, 23, 25]. These applications have been primarily human-centric, collecting information from, or providing it to, the technology user based on the user's movement and behavior. Examples of such technologies are physical activity and health-tracking wearables, interactive displays, and, more recently, video games and digital personal assistants [27]. Context-adaptive social technologies could support BPS people's information needs by allowing customization of the type of information the user wishes to share or receive in a specific social context. For example, when the user is seeking help in a train station, the technology could locate a reliable person of authority, such as train station staff or a police officer, and guide the user away from potential hazards such as street furniture and obstacles, crowds, dogs, and groups of children. In a workplace gathering, by contrast, the technology could describe the space layout, food, and seating arrangements, locate a person of interest, and describe that person's appearance and behavior, in particular when a familiar person is in the visual field and giving non-verbal cues (e.g., making eye contact, smiling). Technologies could also adapt to the user's needs, behaviors, and interactions; for example, in low ambient light, a partially sighted person may find it harder to see and require more information.
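At its simplest, the context-adaptive behavior described above is a mapping from the current social context to the categories of information surfaced to the user, intersected with the user's own preferences. The sketch below illustrates this idea; the context labels and category names are hypothetical, loosely based on the scenarios in Table 3, and do not describe any existing system.

```python
from typing import Optional

# Hypothetical mapping from social context to the information categories
# a BPS user might receive in that context.
CONTEXT_FILTERS = {
    "train_station": {"spatial_layout", "staff_location", "hazards"},
    "workplace_gathering": {"spatial_layout", "seating", "person_location",
                            "appearance", "nonverbal_cues"},
    "party": {"spatial_layout", "person_location", "appearance"},
}

def filter_information(context: str, detections: dict,
                       user_prefs: Optional[set] = None) -> dict:
    """Keep only the detected information relevant to the current context,
    further narrowed by the user's own receiving preferences."""
    allowed = CONTEXT_FILTERS.get(context, set())  # unknown context: share nothing
    if user_prefs is not None:
        allowed = allowed & user_prefs
    return {k: v for k, v in detections.items() if k in allowed}
```

For instance, in the `"train_station"` context a detected `staff_location` would be passed through while a bystander's `appearance` would be suppressed, whereas the same detection would surface in the `"party"` context; richer designs could also adapt the allowed set to ambient conditions such as low light.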

Beyond functionality, ease of use, efficiency, and accuracy were among the intangible yet important needs identified in this study. The information provided by the technology should be brief and intuitive as well as multimodal, enabling users to adapt the technology to their preferences in different social contexts.
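The multimodal adaptation above could be expressed in the same preference-based way. A speculative sketch, with all context names and modalities invented for illustration: the same message might be spoken aloud in transit but delivered as discreet haptic feedback during a quiet meeting.

```python
# Hypothetical sketch: per-context output-modality preferences, so the
# same information reaches the user through the channel they chose for
# that social context.

MODALITY_PREFS = {
    "transit": "speech",   # hands and eyes busy; audio is acceptable
    "meeting": "haptic",   # discreet, does not interrupt conversation
}

def deliver(message, context, default="speech"):
    """Pair a message with the modality the user prefers in this context."""
    return (MODALITY_PREFS.get(context, default), message)

print(deliver("colleague approaching", "meeting"))
print(deliver("platform change announced", "transit"))
```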

5.3 Limitations and Future Work

Although our findings provide rich insights into the information needs of BPS people and information sharing by sighted people, we recognize that our sample is relatively small, which may limit the generalizability of our findings. Furthermore, all the sighted participants were family members of a BPS person and understood the challenges around the social inclusion of BPS people. Research has demonstrated that the relationship between the BPS and sighted persons can influence information-sharing preferences [12]; further research should therefore explore information-sharing concerns with sighted persons with diverse experiences and in a wider range of social settings.


6 CONCLUSION

This study investigated the needs of BPS people in diverse social contexts and the potential role of social technologies in addressing these needs. Our findings demonstrate that, in the absence of appropriate technologies, BPS people often rely on sighted family members' support to navigate social settings, including learning about the environment, locating persons of interest, initiating conversations, and obtaining descriptions of people's appearance and behavior. Participants also raised concerns regarding information privacy and security, and the need for explicit consent both when information is shared with the technology and when it is subsequently shared with technology users. We discussed the potential of social technologies to support the information-sharing and receiving needs of BPS and sighted people in diverse social contexts. Our findings contribute a deeper insight into the information needs of BPS people and the information-sharing preferences of sighted people in the context of social technologies. We also contribute recommendations for the design of technologies that address the diverse information-sharing needs of BPS and sighted people across social situations.


Acknowledgements

This work was partially supported by JSPS KAKENHI Grant Number 22K21309.

Footnotes

1. https://www.messenger.com/
Supplemental Material

3613905.3651121-talk-video.mp4 (Talk Video, mp4, 30.4 MB)

References

  1. Ahmed, T. 2019. Addressing physical safety, security, and privacy for people with visual impairments. SOUPS 2016 - 12th Symposium on Usable Privacy and Security (2019).
  2. Ahmed, T. 2018. Up to a Limit? Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. 2, 3 (2018). DOI:https://doi.org/10.1145/3264899.
  3. Akter, T. 2020. "I am uncomfortable sharing what I can't see": Privacy concerns of the visually impaired with camera based assistive applications. Proceedings of the 29th USENIX Security Symposium (2020).
  4. Akter, T. 2022. Shared Privacy Concerns of the Visually Impaired and Sighted Bystanders with Camera-Based Assistive Technologies. ACM Transactions on Accessible Computing. 15, 2 (2022). DOI:https://doi.org/10.1145/3506857.
  5. Amasha, M.A. 2020. The future of using Internet of Things (IoT) and Context-Aware Technology in E-learning. ACM International Conference Proceeding Series. (Feb. 2020), 114–123. DOI:https://doi.org/10.1145/3383923.3383970.
  6. Argyle, M. 1976. Non-verbal Communication and Language. Royal Institute of Philosophy Supplements. 10, (Mar. 1976), 63–78. DOI:https://doi.org/10.1017/S0080443600011079.
  7. Argyle, M. Non-verbal communication in human social interaction.
  8. Astler, D. 2011. Increased accessibility to nonverbal communication through facial and expression recognition technologies for blind/visually impaired subjects. ASSETS '11: Proceedings of the 13th International ACM SIGACCESS Conference on Computers and Accessibility. (2011), 259–260. DOI:https://doi.org/10.1145/2049536.2049596.
  9. Bardram, J.E. 2004. Applications of context-aware computing in hospital work. (Mar. 2004), 1574–1579. DOI:https://doi.org/10.1145/967900.968215.
  10. Bardram, J.E. 2006. Experiences from real-world deployment of context-aware technologies in a hospital environment. Lecture Notes in Computer Science. 4206 LNCS, (2006), 369–386. DOI:https://doi.org/10.1007/11853565_22.
  11. Baumeister, R.F. and Leary, M.R. 1995. The Need to Belong: Desire for Interpersonal Attachments as a Fundamental Human Motivation. Psychological Bulletin. 117, 3 (1995). DOI:https://doi.org/10.1037/0033-2909.117.3.497.
  12. Bennett, C.L. 2021. It's complicated: Negotiating accessibility and (mis)representation in image descriptions of race, gender, and disability. Conference on Human Factors in Computing Systems - Proceedings. (May 2021). DOI:https://doi.org/10.1145/3411764.3445498.
  13. Buimer, H.P. 2018. Conveying facial expressions to blind and visually impaired persons through a wearable vibrotactile device. PLoS ONE. 13, 3 (2018). DOI:https://doi.org/10.1371/journal.pone.0194737.
  14. Cappagli, G. 2018. Assessing Social Competence in Visually Impaired People and Proposing an Interventional Program in Visually Impaired Children. IEEE Transactions on Cognitive and Developmental Systems. 10, 4 (2018). DOI:https://doi.org/10.1109/TCDS.2018.2809487.
  15. Braun, V. and Clarke, V. 2013. Successful Qualitative Research: A Practical Guide for Beginners.
  16. Van Dantzig, S. 2018. Enhancing physical activity through context-aware coaching. ACM International Conference Proceeding Series. (May 2018), 187–190. DOI:https://doi.org/10.1145/3240925.3240928.
  17. Denning, T. 2014. In situ with bystanders of augmented reality glasses: Perspectives on recording and privacy-mediating technologies. Conference on Human Factors in Computing Systems - Proceedings (2014).
  18. Dourish, P. 2001. Seeking a Foundation for Context-Aware Computing. Human-Computer Interaction. 16, 2–4 (2001), 229–241. DOI:https://doi.org/10.1207/S15327051HCI16234_07.
  19. Glesne, C. 1999. Becoming qualitative researchers: An introduction. (1999).
  20. Glesne, C. 1999. Becoming qualitative researchers: An introduction. (1999).
  21. Grayson, M. 2020. A dynamic AI system for extending the capabilities of blind people. Conference on Human Factors in Computing Systems - Proceedings (2020).
  22. Van Hasselt, V.B. 1983. Social adaptation in the blind. Clinical Psychology Review. 3, 1 (1983). DOI:https://doi.org/10.1016/0272-7358(83)90007-7.
  23. Hauge, J.B. 2017. Exploring context-aware activities to enhance the learning experience. Lecture Notes in Computer Science. 10653 LNCS, (2017), 238–247. DOI:https://doi.org/10.1007/978-3-319-71940-5_22.
  24. Hinde, R.A. and Royal Society (Great Britain) Study Group on Non-Verbal Communication. Non-verbal communication.
  25. Huang, Y.M. and Chiu, P.S. 2015. The effectiveness of a meaningful learning-based evaluation model for context-aware mobile learning. British Journal of Educational Technology. 46, 2 (Mar. 2015), 437–447. DOI:https://doi.org/10.1111/BJET.12147.
  26. Kianpisheh, M. 2019. Face Recognition Assistant for People with Visual Impairments. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. 3, 3 (Sep. 2019), 1–24. DOI:https://doi.org/10.1145/3351248.
  27. Kim, J. and Heo, J. 2021. Please Stop Listening While I Make a Private Call: Context-Aware In-Vehicle Mode of a Voice-Controlled Intelligent Personal Assistant with a Privacy Consideration. Lecture Notes in Computer Science. 12788 LNCS, (2021), 177–193. DOI:https://doi.org/10.1007/978-3-030-77392-2_12.
  28. Knapp, M.L. and Hall, J.A. 1972. Nonverbal communication in human interaction.
  29. Koelle, M. 2015. Don't look at me that way! - Understanding User Attitudes Towards Data Glasses Usage. MobileHCI 2015 - Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services (2015).
  30. Krishna, S. 2008. A Systematic Requirements Analysis and Development of an Assistive Device to Enhance the Social Interaction of People Who are Blind or Visually Impaired.
  31. Krishna, S. 2005. A wearable face recognition system for individuals with visual impairments. ASSETS 2005 - The Seventh International ACM SIGACCESS Conference on Computers and Accessibility (2005).
  32. Krishna, S. 2010. VibroGlove: An assistive technology aid for conveying facial expressions. Conference on Human Factors in Computing Systems - Proceedings (2010).
  33. Lee, L. 2017. Information Disclosure Concerns in The Age of Wearable Computing. (2017).
  34. McDaniel, T. 2008. Using a haptic belt to convey non-verbal communication cues during social interactions to individuals who are blind. HAVE 2008 - IEEE International Workshop on Haptic Audio Visual Environments and Games Proceedings (2008).
  35. Morgan, D.L. and Krueger, R.A. 2014. When to Use Focus Groups and Why. Successful Focus Groups: Advancing the State of the Art.
  36. Nguyen, D.H. 2011. Situating the concern for information privacy through an empirical study of responses to video recording. Conference on Human Factors in Computing Systems - Proceedings (2011).
  37. Panchanathan, S. 2016. Social Interaction Assistant: A Person-Centered Approach to Enrich Social Interactions for Individuals with Visual Impairments. IEEE Journal on Selected Topics in Signal Processing. 10, 5 (2016). DOI:https://doi.org/10.1109/JSTSP.2016.2543681.
  38. Phillips, M. and Proulx, M.J. 2019. Social Interaction Without Vision: An Assessment of Assistive Technology for the Visually Impaired. Technology & Innovation. 20, 1 (2019). DOI:https://doi.org/10.21300/20.1-2.2018.85.
  39. Qiu, S. 2016. Designing and evaluating a wearable device for accessing gaze signals from the sighted. Lecture Notes in Computer Science (2016).
  40. Qiu, S. 2020. Understanding visually impaired people's experiences of social signal perception in face-to-face communication. Universal Access in the Information Society. 19, 4 (2020). DOI:https://doi.org/10.1007/s10209-019-00698-3.
  41. Sarfraz, M.S. 2017. A Multimodal Assistive System for Helping Visually Impaired in Social Interactions. Informatik-Spektrum. 40, 6 (2017). DOI:https://doi.org/10.1007/s00287-017-1077-7.
  42. Saulynas, S. 2022. How and Why We Run: Investigating the Experiences of Blind and Visually-Impaired Runners. Proceedings of the 19th International Web for All Conference, W4A 2022. (Apr. 2022). DOI:https://doi.org/10.1145/3493612.3520445.
  43. Stokes, D. and Bergin, R. 2006. Methodology or "methodolatry"? An evaluation of focus groups and depth interviews. Qualitative Market Research. 9, 1 (2006). DOI:https://doi.org/10.1108/13522750610640530.
  44. Tadić, V. 2010. Are language and social communication intact in children with congenital visual impairment at school age? Journal of Child Psychology and Psychiatry and Allied Disciplines. 51, 6 (2010). DOI:https://doi.org/10.1111/j.1469-7610.2009.02200.x.
  45. Zhao, Y. 2018. A Face recognition application for people with visual impairments: Understanding use beyond the lab. Conference on Human Factors in Computing Systems - Proceedings (2018).

Published in

CHI EA '24: Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems
May 2024, 4761 pages
ISBN: 9798400703317
DOI: 10.1145/3613905

Copyright © 2024 Owner/Author

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher: Association for Computing Machinery, New York, NY, United States
