Some interactional functions of finger pointing in signed language conversations

Interlocutors in conversation collaborate to coordinate their actions and talk. Research on spoken language conversations has shown that speakers use bodily gestures, in addition to speech, to regulate their interaction. The current study expands on this research by investigating how signed language users finger point to express interactional meanings. Studies of pointing in signed languages have largely focused on referential functions, as signers frequently point to refer to themselves and others, as well as to visible and invisible referents. However, this study demonstrates how signers also finger point to deliver information, cite previous contributions, seek responses, manage turns, and give feedback. These interactional meanings are no less important than the identification of discourse referents. Language theory should be able to accommodate this complexity of language in conversation, which involves an interplay between different types of semiosis (description, depiction, indexicality), in an inclusive, systematic way.


Accommodating the semiotic diversity of conversation in language theory
Face-to-face conversation is a type of "talk in interaction" (Sacks et al. 1974: 720) and a basic setting where language is used (Fillmore 1981; Chafe 1994; Clark 1996). People regularly produce meaningful, visible bodily actions, which may be combined with speech, to describe, depict, and index meanings across specific times and contexts as parts of composite utterances (Enfield 2009; see also Kendon 2014; Ferrara & Hodge 2018). There is a growing body of research demonstrating the details of this multimodal integration (e.g., Goodwin 1981; 1986; Clark & Wilkes-Gibbs 1986; Bavelas 1990; Wilkes-Gibbs 1997; Abner et al. 2015; Cooperrider 2016; Keevallik 2018) as well as how people use such actions to regulate emerging interaction (e.g., Goodwin 1981; 1986; Parrill 2008; Enfield 2009; Jokinen 2009; Mondada 2014; Shaw 2019). 1 The multimodal and semiotically diverse nature of face-to-face conversation has implications for language theory. However, theoretical work in linguistics has generally disregarded this bodily aspect of conversation and has above all prioritized speech (and writing). In addition, there has been a preoccupation with only the most conventionalized and symbolic elements of composite utterances and their expression of referential, propositional meanings. The result has been theory building focused on the most symbolic instances of speech, while non-speech bodily actions that are symbolic, as well as speech and bodily actions that express indexical or depictive meanings, have been ignored or selectively discussed (see Dingemanse 2017; Ferrara & Hodge 2018 for more discussion of this bias).
The current study challenges this theoretical bias with a study of the interactional (rather than propositional) meanings of pointing in Norwegian Sign Language conversations. Pointing is generally defined here as a meaningful bodily movement that directs attention toward an area of space (Clark 2003; Kendon 2004; Cooperrider et al. 2018b). This study focuses in particular on finger pointing, which is known to be a frequent and multi-functional practice in both signed and spoken language interaction. The referential (e.g., pronominal) meanings that finger pointing expresses are well documented across signed languages (see Section 1.2), but complementary interactional meanings have yet to be fully examined. Both the referential and interactive uses of finger pointing, however, affect how language conventions emerge over time and how these conventions intertwine with other semiotic actions. A survey of the interactional meanings, or functions, of finger pointing in face-to-face signed language conversations will be used here as further evidence that the language of conversation is highly indexical and that this indexicality contributes to the coordination of emergent interactions. In this way, the analysis here aligns with previous and current work that considers the "pragmatic" and context-dependent meanings ever present in human interaction, as well as the role indexicality plays in language theory (see e.g., Malinowski 1923; Kress 1976; Silverstein 1976; Washabaugh 1981; Halliday 1985; Hanks 1992; Johnston 1992; Clark 1996; Langacker 2001; Hayashi 2005).

Finger pointing in signed language
There is a large body of work investigating manual pointing in signed languages, and this work shows that signers frequently point with their hands to refer to themselves and others, as well as to other visible and invisible referents. In addition, finger points can serve locative and determinative functions (e.g., Engberg-Pedersen 2003; Liddell 2003; Nilsson 2004; Johnston 2013; Nordlund 2019; for thorough reviews of the literature see Cormier et al. 2013; Meier & Lillo-Martin 2013). 2 For example, in a corpus investigation of manual pointing in Auslan (the signed language used in Australia), the largest empirical study of signed language pointing to date, Johnston (2013) investigated 5,797 tokens of manual pointing across a dataset of questionnaire responses, retellings, and personal narratives, and found that the primary functions of these points were to identify a referent, identify a location, or specify a signed referent as somehow given or known.
A few additional observations in the literature suggest that signers also point to index and regulate an emerging interaction. For example, one early study on turn-taking in American Sign Language mentioned that manual indexing can be used for turn management and conversational feedback (Baker 1977). In a later study on Flemish Sign Language, Van Herreweghe (2002) found that signers use pointing for turn-taking functions in meetings.
The goal of the current study was to investigate these, as well as other, interactional meanings of finger pointing in more detail, as they occurred across a corpus of naturalistic Norwegian Sign Language conversations.
To further contextualize and frame the study, it is useful to consider research findings on the social meanings that visible, bodily actions produced by speakers and signers can express in interaction. In particular, the following sections will present examples of research into how people use manual (and other bodily) actions while they talk and sign to serve interactional (rather than referential) functions. These functions, which have also been labeled as "pragmatic," will provide the starting point for the current study's analysis of finger pointing, the method of which is detailed in Section 2.

"Pragmatic" meanings of various manual and non-manual actions in spoken and signed language interaction
Similar to research priorities in linguistics, researchers of gesture have also primarily focused on the referential meanings and uses of different kinds of manual and non-manual bodily actions (see Streeck 2009a and Kendon 2017 for useful reviews). Even so, some researchers have examined the "pragmatic" functions of gestures, such as marking topic/comment structure or directive speech (Seyfeddinipur 2004), marking focal discourse or emphasizing (Neumann 2004), presenting or receiving units of discourse (Müller 2004), expressing socio-affiliational meanings (e.g., Enfield et al. 2007), or expressing epistemic meanings or stance (Kendon 1995; Streeck 2009a; Deppermann 2013; Cooperrider et al. 2018a). Speakers (and signers) also use their hands and bodies to manage emerging interactions (which will be detailed further in the following sections).
As a way to begin surveying the wide range of interactional meanings of co-speech gesture as well as some signed language practices, early work by Bavelas and colleagues (see below) will be used as an initial framework. Additional literature will be used to expand and comment on this initial functional typology. This limited review will focus primarily on the interactional functions of co-speech manual gestures (and to some extent some particular signed language practices), as opposed to other types of functions, such as some of those mentioned in the previous paragraphs.
Bavelas and colleagues (Bavelas et al. 1992; Bavelas 1994; Bavelas et al. 1995) investigated a type of manual gesture that took the basic form of the finger(s) or palm(s) being oriented towards an interlocutor (and as such represented a type of manual pointing action). They found that speakers would use these gestures not to refer to referents in the discourse, but rather to relate to fellow interlocutors and the interaction itself.
Experiments showed that speakers used these gestures more often when they were with other people whom they could see (Bavelas et al. 1992) and that these gestures elicited responses from interlocutors (Bavelas et al. 1995). These experiments demonstrated that interactional gestures serve social functions related to the coordination of a conversation, and that they are different from topic gestures (i.e., referential gestures that might depict how a referent looks or acts). These gestures functioned to deliver information, cite previous contributions, seek responses, and manage turns. Each of these functions will be described and supported by additional literature in the following sub-sections. It is this literature, coupled with the annotation of the study corpus, that led to the interactional categories identified and investigated for this study (detailed further in Section 2 below).

Delivery functions
Delivery gestures are those that hand over information from the speaker to an interlocutor. They also mark common ground and digressions, or signal information that an interlocutor should elaborate themselves (Bavelas 1994). These gestures were found to be more frequent in contexts with experimentally induced common ground (Holler 2009). Other research has shown that palm up actions also work to deliver information to interlocutors in spoken (e.g., Müller 2004; Streeck 2009a; Lepeut 2018; submitted; Shaw 2019) and signed language interaction (e.g., Lepeut 2018; submitted; Shaw 2019). However, the use of finger pointing for these delivery functions in signed language interaction has yet to be documented or described. This study begins to address this gap in the literature by examining some examples of this type of pointing in Norwegian Sign Language.

Citing functions
Citing gestures refer back to previous contributions in the interaction. For example, interlocutors can signal that the current point being made was made previously by another interlocutor, or they can show that a response by an interlocutor to a turn at talk was understood. Finger pointing and palm up actions have been shown to serve this function in both signed and spoken language interaction (Bavelas 1994; Kendon 2004; Lepeut 2018; Shaw 2019).

Seeking functions
Seeking gestures aim to elicit a response from an interlocutor. They can be used to check if an interlocutor is following or agreeing with ongoing talk (Bavelas et al. 1992). Alternatively, such manual indexes can request help with finding what to say (Bavelas et al. 1992; Bavelas 1994; Streeck 2009a; Sikveland & Ogden 2012; Lepeut 2018; submitted; Shaw 2019). Similarly, additional studies have shown that speakers can also produce iconic, or representational, gestures (that somehow depict the referent) or word-search gestures during moments of lexical retrieval (e.g., clicking or wiggling one's fingers) (e.g., Goodwin & Goodwin 1986; Streeck 2009b; Holler et al. 2013). In the current study, the use of finger pointing as a way to seek responses from interlocutors is in focus.

Turn-management functions
A final function of interactional gestures identified in the early studies by Bavelas and colleagues (Bavelas et al. 1992; Bavelas 1994; Bavelas et al. 1995) related to the management of turn-taking. Turn-taking functions have received the most attention from researchers, especially within gesture research and multimodal conversation analysis. Studies have detailed how speakers and signers use manual and non-manual gestures to secure/self-select a turn at talk (e.g., Streeck & Hartge 1992; Bavelas 1994; Mondada 2007; Streeck 2009b for spoken language interaction, and e.g., Baker 1977; McIlvenny 1995; Van Herreweghe 2002 for signed languages). Interlocutors also use bodily actions to hold turns, either to coordinate talk from other interlocutors or, for example, as a way to create opportunities for them to seek information (Streeck & Hartge 1992; Mondada 2007; Sikveland & Ogden 2012; Groeber & Pochon-Berger 2014; Ryttervik 2015).
In addition to taking and holding turns, speakers and signers also produce bodily actions as a way to complete turns or give next turns, which may involve various forms of open palm or pointing. 3 These practices vary depending on the setting, for example, classroom interaction (Kääntä 2012), instructional contexts (Keevallik 2014), meetings (Van Herreweghe 2002; Mondada 2013; see also Keevallik 2014), and everyday conversation (Baker 1977; Streeck 2009a; Li 2014; Ryttervik 2015; Lepeut 2018). This research examines how interlocutors combine speech, hand, body, and face movements to coordinate turn transitions. In the current study, the use of finger pointing for this function in signed language interaction is examined in more detail.

Feedback functions
An additional function that needs to be introduced, but which was not observed in the Bavelas experiments (Bavelas et al. 1992; Bavelas 1994; Bavelas et al. 1995), relates to conversational feedback and backchanneling. In particular, speakers and signers not only seek following or agreement from others (outlined in Section 1.3.3), but they can also give feedback responses through various bodily actions. These actions visibly indicate that someone is an active participant in the ongoing talk and show that they are following or even agreeing with what is being said (e.g., Baker 1977; Healy 2012; Ryttervik 2015; Mesch 2016; Gabarró-López 2020). These visible actions can be manual (e.g., as co-speech gestures or signed lexical phrases) or non-manual (e.g., in the form of head nods or eye gaze). Spoken response particles, lexical phrases, or other vocal activities such as laughing can also be used (e.g., Coates & Sutton-Spence 2001; see also Deppermann 2013 for such actions at turn beginnings). In this study, finger pointing and its function as conversational feedback is considered, thereby adding to this area of research.

The current study -materials and method
As mentioned in Section 1.2, most research on manual pointing in signed language has focused on referential functions. However, the review of the literature above demonstrated that signers and speakers also point to index aspects of the ongoing talk itself and to serve various interactional functions. This study adds to this literature and supports the anecdotal reports in Baker (1977) and Van Herreweghe (2002) by examining how finger pointing serves interactional functions in signed language conversation. Findings will be compared with research on spoken language interaction and will add to a broader understanding of how finger pointing works in signed languages. First, though, the following sections describe the data used for the study and provide details about data annotation and analysis.

Data and participants
The data for this study come from video recordings of 11 informal conversations in Norwegian Sign Language about a variety of everyday topics, e.g., stories of growing up, the history of the deaf community, and vacation travels. Recordings were made in various university locations (meeting rooms or classrooms) or at a local deaf association. The aim of the sessions was to elicit spontaneous, and as natural as possible, conversation in Norwegian Sign Language. Many of the participants knew each other and the research assistants. These recordings were collected during two earlier projects, which were approved by the Norwegian Centre for Research Data (#42133 and #55097). Participants consented to the use of their data and images in research and teaching activities.
The filmed conversations involved between two and five Norwegian signers each, and the analysis here examined 3.4 hours of signing by 21 different signers (15 women and 6 men). Table 1 provides a summary of the data in the study corpus. 4 It should be noted that conversations 1-8 involve both deaf and hearing signers. A hearing signer was recruited to facilitate the collection of these eight conversations. She is a near-native signer, has deaf parents, and is an active member of the deaf community. As initial data annotation has focused on the signing practices of deaf signers (in relation to other projects), the hearing signer has not been annotated, and so any interactive pointing actions she produces are not included in this study (which explains why she is not listed as a participant in Table 1). However, in later sections of this paper where examples are provided, she appears, and any relevant actions on her part are included in the analysis. Conversations 9-11 were facilitated by deaf signers, OIS and LMN, and their data and signing are included in the analysis reported here (their multiple contributions to the study corpus are italicized in Table 1). 5 Only parts of conversations 1-8 have been tokenized for manual signs as part of previous projects. The total time annotated, and thus analyzed in the current study, for these conversations is supplied in parentheses.
Conversations 9-11 are fully tokenized for manual signs. The data for this study represent in some ways the diversity inherent in the Norwegian Deaf community by including signers with various backgrounds in terms of their age, where they live, and when they acquired Norwegian Sign Language (age of acquisition). Although this study includes signers who report learning Norwegian Sign Language between 8 and 12 years of age, and who are thus often considered late learners by linguists, all participants report that they consider themselves members of the deaf community and use Norwegian Sign Language in their daily private and public lives. They, along with the other participants who learned Norwegian Sign Language before the age of seven, are considered able to provide some preliminary insight into the use of finger pointing for interactional functions in the Norwegian deaf community.

Data annotation
The video recordings outlined above were reviewed and annotated in ELAN (Wittenburg et al. 2006). The use of ELAN facilitated the time-alignment of annotations, created on various user-defined tiers, with the video source, allowing the primary data to always remain in view (Crasborn & Sloetjes 2008). 6 First, manual signs were identified and annotations were created on two tiers: a right and a left hand gloss tier. Labels within these annotations identified tokens of fingerspelling, pointing signs, depicting signs, or manual constructed actions. Empty annotations indicated signs presumed to be fully lexical, which await assignment of a unique identifier, or ID-gloss, from the Norwegian Sign Language lexical database currently being developed (see Johnston 2010 for more information about the use of ID-glosses in signed language annotation and about using labels to identify different types of signs).
After the data was tokenized for manual signs, all tokens of pointing were revisited and assigned a subtype, when possible, based on the sign's meaning in context. These subtypes were initially based on the types of points identified in the Auslan Corpus (Johnston 2008) and described in the Auslan corpus annotation guidelines (Johnston 2016). As annotation of the current dataset progressed, several other subtypes were added, including the interactional points that are the focus of this study. Importantly, tokens of pointing were tagged for multiple functions when warranted, or when the context was ambiguous. For example, a token point could be tagged as both pronominal and locative, or as interactional and pronominal.
After tokens of pointing across the dataset were tagged for function, all the interactional points were again revisited and tagged for their particular interactional function(s) based on the context of the token. Additional tiers in ELAN were used to do this-two Main Function tiers and two Specific Function tiers (which accommodated tokens serving multiple interactional functions). The particular functions tagged on these tiers were based initially on the functions of interactional gestures in spoken English observed by Bavelas and colleagues (Bavelas et al. 1992;Bavelas 1994;Bavelas et al. 1995). These categories were then amended and expanded upon in the light of more recent literature (see review in Section 1.3) as well as the data encountered during annotation of the study corpus. This type of iterative and reciprocal interaction between the data annotation and literature/theory is common in studies employing corpus methods (McEnery & Hardie 2012: 158). The full list of interactional functions observed and tagged in the current dataset, grouped into main and specific functions, is provided in Table 2.
The analysis and tagging of interactional finger pointing in the data occurred over multiple parses of the data by one annotator. All data were reviewed at least twice, and in some cases three times (in addition to the initial identification parses). Each token was scrutinized in context against the backdrop of the literature for potential interactional functions. This involved examining the signer's bodily actions that occurred simultaneously with and immediately surrounding the token, as well as the actions and reactions of other interlocutors. If there was uncertainty around a token in regard to either its main or specific function, this was indicated in the tag with a question mark.
After the annotation of the interactional pointing and their functions was completed, a second annotator was trained and then given a randomized 10 percent of the interactional pointing tokens (n = 40). 7 They tagged these tokens for main and specific interactional functions. Cohen's kappa (a measure of inter-rater reliability) between the main annotator and this second annotator on the main categories was calculated at 0.58 (95% CI: 0.44-0.72). This level of moderate agreement was judged as satisfactory here, given the exploratory, non-experimental nature of this study, and it demonstrates that the interactional meanings coded here have some level of validity. As work into these types of interactional (rather than referential) meanings expands and develops, linguists will be able to collectively help establish more robust standards for this type of annotation and analysis. This study presents but one contribution to this early effort.

Table 2: Interactional functions of finger pointing tagged in the study corpus.

Delivery finger pointing, as a group, refers to the delivery of information by a signer to an interlocutor:
- Shared information marks material that an interlocutor probably already knows, that is, information that is part of their common ground (Clark & Brennan 1991). It means, essentially, 'As you know.'
- Digression marks information that should be treated by an interlocutor as an aside from the main point. Analogous to 'By the way' or 'Back to the main point.'

Citing finger pointing refers to a (previous) contribution by an interlocutor:
- General citing indicates 'as you said earlier'/'what you are saying,' that is, the point the signer is now making had been contributed earlier by the interlocutor. This pointing action can also be produced by an interlocutor (as a group or as an individual) to respond to another signer: 'right, as was (just) mentioned (earlier).'
- Acknowledgement of an interlocutor's response indicates that the signer saw or heard that an interlocutor understood what had been said. Paraphrased, 'I see that you understood me.'

Seeking finger pointing aims to elicit a specific response from an interlocutor:
- Seeking help requests a word or phrase that the signer cannot find at the moment. A verbal paraphrase would be 'Can you give me the word for…?'
- Seeking alone is a word-searching action that does not request help from an interlocutor.
- Seeking agreement (/confirmation) asks whether an interlocutor agrees or disagrees with the point being made. Analogous to 'Do you agree?'
- Seeking following asks whether an interlocutor understands what is being said. Verbal equivalents include 'you know?' or 'eh?' at the end of a phrase.

Turn-regulating finger pointing refers to issues related to turn management:
- Giving turn hands a turn over to another interlocutor. As if to say, 'Your turn.'
- Taking turn accepts a turn from an interlocutor. Paraphrased as 'OK, I'll take over.' These points can also be produced to self-select for next turn.
- Turn open indicates that it is anyone's turn, as if to say, 'Who's going to talk next?'
- Look guides other interlocutors' gaze/attention to the current signer (in the case that they are looking at the wrong person).
- Holding turn allows a current signer to continue their turn after a pause or to allow another signer to add a comment or say something. It can also help guide eye gaze.

Feedback finger pointing shows involvement from non-current signing interlocutors (also known as backchanneling):
- Showing following indicates that an interlocutor understands what is being said.
- Showing agreement indicates an interlocutor agrees with what is being said.

In the following sections, findings from an analysis of the annotations outlined above are reported. In addition, examples will be detailed to demonstrate how signers use these finger points to coordinate conversational moves. The frequency and function of these points indicate that they are an important feature of (Norwegian) signed language conversation. As a final note, the data, ELAN files, and supplementary materials used for this study are openly available via the Open Science Framework at https://osf.io/g8zv6/.
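The inter-rater agreement reported above (kappa = 0.58, 95% CI: 0.44-0.72) can be computed directly from two annotators' labels over the same tokens. Below is a minimal sketch, not the study's actual analysis script; the function names and the percentile-bootstrap CI are illustrative assumptions (the paper does not state how its CI was obtained):

```python
import random
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two annotators' category labels over the same tokens."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: proportion of tokens given identical tags.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: sum over categories of the product of marginal proportions.
    ca, cb = Counter(labels_a), Counter(labels_b)
    p_e = sum((ca[k] / n) * (cb[k] / n) for k in set(ca) | set(cb))
    # Raises ZeroDivisionError if both annotators use a single identical category.
    return (p_o - p_e) / (1 - p_e)

def bootstrap_ci(labels_a, labels_b, n_boot=2000, seed=1):
    """Percentile-bootstrap 95% CI for kappa, resampling tokens with replacement."""
    rng = random.Random(seed)
    n = len(labels_a)
    kappas = []
    while len(kappas) < n_boot:
        idx = [rng.randrange(n) for _ in range(n)]
        try:
            kappas.append(cohens_kappa([labels_a[i] for i in idx],
                                       [labels_b[i] for i in idx]))
        except ZeroDivisionError:  # degenerate resample; draw again
            continue
    kappas.sort()
    return kappas[int(0.025 * n_boot)], kappas[int(0.975 * n_boot)]
```

For example, two annotators who agree on every token yield kappa = 1.0, while agreement exactly at chance level yields kappa = 0; values around 0.58, as here, are conventionally read as moderate agreement.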

Overview of the data
The annotation of the data resulted in the identification of 21,265 manual sign tokens (as annotated on the dominant hand ID-gloss tier), of which 19.62% (n = 4,172) were tokens of pointing. These manual points served a variety of functions, as described above, with most points (n = 2,318, 55.5%) indexing referents. In addition, however, a number of points (n = 345, 8.3%) were observed to serve the interactional functions addressed in this study. See Table 3 for a full summary of the frequency and distribution of different types of pointing across the study corpus. As mentioned in Section 2.2, a token point could be tagged for more than one function, and this is reflected in the figures reported. In total, 4,172 points were tagged for 4,305 functions. Importantly, the 133 tokens tagged for more than one function were often pairings of an interactional function with either a determiner (10 tokens), locative (7 tokens), or pronominal (64 tokens) function. This multifunctionality is reflected in Table 3 and will be revisited in the discussion (Section 4).
While it is common practice in signed language corpus linguistics to calculate sign frequency using annotations on the dominant hand gloss tier, signers can of course produce signs independently on their non-dominant hand. Further examination thus revealed that signers do produce interactional points with their non-dominant hand, independently of the dominant hand. These tokens are included in the analysis presented below and raise the total number of interactional pointing tokens investigated here to 418.
In the studies of spoken English conversations, Bavelas and colleagues (Bavelas et al. 1992; Bavelas 1994; Bavelas et al. 1995) identified a variety of functions interactional gestures can serve without providing much detail about which functions were more or less common. While this is a preliminary study aimed at exploring some of the interactional meanings of finger pointing by signers in the Norwegian deaf community and exploring their theoretical importance, it can still be useful to see which types of functions were used more often than others. Table 4 presents the frequency of tokens for each of the main interactional functions. Note that these figures represent only the interactional pointing whose function was identified with certainty (as judged by the main annotator across multiple parses). Tokens whose main or specific function was uncertain represented 2.4% (10/418) of the interactional pointing tokens. These tokens have been removed from the remaining analysis until they can receive further scrutiny. It is also important to reiterate that each token could be tagged for multiple functions (in a similar way that a point could be tagged as, for example, both pronominal and interactional). Of the 408 tokens of certainly identified interactional finger pointing, 58 tokens were tagged with two interactional functions, which explains the total of 466 main functions reported in Table 4. The most frequent pairs of interactional functions included turn-regulating with citing (13 tokens), with feedback (10 tokens), and with seeking (11 tokens). Some examples of such dual-function points will be detailed in the following sections.
The figures reported in Table 4 show that Norwegian signers in the study corpus most often used interactional pointing to regulate turn-taking, e.g., taking and giving turns. However, there were also a number of tokens that functioned as feedback, both to show agreement and to show that an interlocutor was following/understanding the signer. Compare the fairly numerous tokens of interactional pointing for turn-regulating functions (n = 211) and feedback (n = 132) with the relatively few citing tokens (n = 42). Furthermore, signers in this dataset rarely employed pointing for delivery functions (n = 18). In the following sections, examples of these different functions of interactional pointing are detailed and discussed, presented in order from least to most frequent.

Delivery functions of pointing
As mentioned above, interactional finger points that serve delivery functions, shared (n = 16) and digression (n = 2), were fairly rare in this dataset. However, in order to demonstrate the category, a token is described here from a conversation with two deaf interlocutors and one hearing interlocutor. The two deaf interlocutors work together, and they were replying to a question from the hearing interlocutor about how to get to the kindergarten from their offices (all of which are located on the same campus). First, the male deaf signer had provided his way of getting there (which entailed a path through buildings). Then the female deaf signer begins describing an alternate path that goes outside the buildings. She explains that the reason for this alternative path is that there had been renovations inside the buildings, and so they were not so easily passed through anymore. She explains that where one used to be able to walk through two buildings, now it is blocked off. The example in Figure 1 begins as the signer inserts an aside to mention that this particular part of her workplace used to be a carpentry workshop. The signer TVG (the deaf, female signer on the left in the double video frame in Figure 1) signs pt:loc actually before it-is workshop, 'actually, a long time ago that place was a workshop.' During this utterance, she is looking at the hearing interlocutor (the woman in the black shirt in Figure 1). However, as she produces the sign workshop, she shifts her gaze to her male colleague and maintains this gaze direction as she produces a very brief (0.12 seconds) interactional finger point towards him (shown in the video frame in Figure 1). She then shifts her gaze back to the hearing interlocutor and continues without pause to clarify the initial part of her comment, carpentry factory before, 'or a carpentry workshop-factory-before.'
The male colleague (abbreviated as TJ in Figure 1) appears to respond to this point directed at him by providing additional information (yes and kitchen, 'yes and a kitchen'), which suggests he does indeed know what and where TVG is talking about. He says this while TVG is still signing, and his comment overlaps with her ongoing talk. TVG notices his movement and shifts her gaze back to him while she finishes signing before, while he produces the signs yes and. However, TVG does not pause her signing but instead shifts her gaze immediately back to the hearing interlocutor and thus does not see TJ finish his comment. TVG continues describing the way to the kindergarten.8 The interactional finger point in this example was analyzed as indicating shared information between the two deaf interlocutors (TVG and TJ). TVG knew that her colleague TJ was familiar with the history of their workplace and she indicated it as such. This token was not interpreted as a comprehension check, because this sequence was mainly directed at the hearing interlocutor who had originally asked the question (evidenced by TVG's body orientation and gaze). Even in the brief moment that TVG re-directed her gaze towards TJ during her finger pointing, she did not wait to see if he responded and immediately directed her gaze back towards the hearing interlocutor.

Figure 1: An example of a finger point in Norwegian Sign Language that serves a delivery function (Ferrara & Ringsø 2017).

8 In order to simplify the figures, only active interlocutors are given lines in the transcript. rh and lh indicate signs produced on the right and left hands. Each of these tiers is associated with a signer by including their initials (e.g., rh-tvg indicates the signs TVG produces on her right hand). Translations in English are also provided. Images in the examples are connected to the ELAN timeline and transcript with a solid (red) line. This line shows when in the example the still shot was captured.
As we can see from this specific example, signers are capable of indexing their interlocutors to indicate shared common ground, although it does not seem to be a very frequent behavior in this study corpus.

Citing functions of pointing
The Norwegian deaf signers in this study also used finger pointing to cite the contributions of other interlocutors, similar to what has been observed in some spoken language contexts (Bavelas 1994; Kendon 2004). Specifically, these finger points worked in most cases to cite things said earlier in the discourse (n = 29), or sometimes they were used to acknowledge a response from an interlocutor (n = 13). In some cases, these tokens simultaneously functioned as turn-initiators or even as feedback, as signers would link their upcoming talk with what had just been said or provide a backchannel response acknowledging what was being signed by another interlocutor. An example of a point that cites previous discourse is illustrated in Figure 2 and occurs as part of a conversation with three deaf interlocutors. Earlier in the conversation, OIS (the deaf man sitting on the right in the double frame in Figure 2) asks the other two interlocutors if they had noticed differences in how different students signed when they were at a particular vocational training school for the deaf. EB (the deaf woman sitting in the middle in Figure 2) responded that the signing was very different indeed and that the students from Trondheim signed "ugly" while the students from Holmestrand signed "pretty." The conversation continues, and after a while TR (the deaf man sitting on the left in the double frame in Figure 2) begins recounting what he remembers from that time. He says that he did see differences among the students' signing at the vocational deaf school, because they were coming from many different places. The example begins as TR explains how he reacted to seeing those strange signs, 'There I remember seeing strange signs' (top row in Figure 2). He then shifts his gaze to EB and signs also pt:int, 'and also like you said.' By pointing towards EB, TR effectively cites the comments that EB produced a minute earlier about some signing being ugly and some being pretty. Note that the finger point, although directed towards EB, is not interpreted to index EB as a referent, as a way to say 'you.' It mainly functions to index her previous comments. Then TR explains that after a while, the signing got mixed together and it was fine. He checks this assessment with EB by looking at her while signing function okay. EB nods in response and agrees, 'yes, fine, after a while' (bottom row in Figure 2). The interactional finger point produced by TR, coupled with a directed eye gaze towards EB, effectively indexed EB's comment that was made one minute before in the conversation. In this way, TR's finger pointing linked his section of the talk with what had been going on before and allowed him to add to the discussion by emphasizing that in the end everyone was signing together.

Figure 2: An example of a finger point in Norwegian Sign Language that cites previous discourse (Ferrara & Bø 2015, P-BO1_TR.eaf, 5:16.646-5:25.190).

Seeking functions of pointing
Finger points produced by the participants in this study were sometimes used to seek information, either from other interlocutors or from the signers themselves (n = 63). In most cases, these points functioned to solicit feedback related to whether an interlocutor was following (n = 32) and/or agreed with what the signer was saying (n = 14). In only a few cases did signers point as a way to seek help with information (n = 5) or as a way to indicate that they themselves needed time to think of what they wanted to say (n = 12).
In one example, a (deaf) signer is talking about a new highway that is being built (Ferrara & Ringsø 2017). Just after she says, 'Now, they are building the highway,' she points to her (hearing) interlocutor to confirm that the interlocutor knows where and what she is talking about. This point is co-produced with a forward head nod and a squinting of the eyes. This action elicits an immediate confirmation from her interlocutor in the form of a head nod and a silently mouthed ja ('yeah'). In this way, the signer confirms the common ground with her interlocutor, which can be used to orient future conversational moves. These types of comprehension checks were fairly common in the data, as signers frequently checked to see whether their interlocutors were following what they were saying.
Signers also pointed as a way to signal that they needed time to think of what to say. An example of this is provided in Figure 3, which begins with a question that TR2 (the woman on the right of the video frame, who is hearing) asks EMN (the woman on the left of the video frame, who is deaf) about where EMN works. EMN replies, 'in Ranheim,' which involved a locative point on her right hand (pt:loc, shown in the top row of Figure 3). EMN holds this point while TR2 looks up as if thinking about where this location is in town, after which TR2 nods and mouths the Norwegian word for 'yes.' Perhaps because of TR2's slightly delayed response, EMN produces an interactional point on her left hand to start a new turn and index the question TR2 had just posed (an example of citing as well as turn self-selection, see the leftmost image in the middle row of Figure 3). 9 Then EMN shifts her gaze upwards and to the side while she signs know and points again to her interlocutor (also with her left hand, see the middle and rightmost images in the middle row of Figure 3). This second point lasted 1.5 seconds and functioned to index her interlocutor and give the signer time to think of how to explain the location of her workplace. The shift in gaze away from TR2 is analyzed here as a cue about how the information search process should be negotiated between EMN and TR2, namely that EMN is not seeking TR2's help in the process (see Goodwin & Goodwin 1986 regarding interlocutor participation in word searches during spoken interaction). EMN continues looking away as she begins further clarification with the sign by. Then she shifts her gaze back to TR2 and explains that her work is by the new Kiwi grocery store. EMN then asks if TR2 knows the big hill there, after which TR2 responds with an interactional point of her own (see the bottom row in Figure 3), while nodding, confirming that she now understands which place EMN is talking about.
It should be noted that the interactional point in this example also functioned as a pronominal (i.e., the second person pronoun 'you,' glossed as pt:int/pro2 in the middle row of Figure 3). It was interpreted as part of the question 'you know…?'. However, here the additional function of seeking information (as well as holding a turn) was profiled. This finger pointing effectively allowed EMN to hold her turn while she thought of what to say. These indexical, interactional meanings were co-expressed with referential meanings. In this way, the finger point contributed to the coordination and advancement of the emerging talk.

Feedback functions of pointing
Throughout the study corpus, signers produced interactional finger points that indicated that they were following (n = 75) and, in some cases, agreeing with what another interlocutor was signing (n = 57). These points contrast with those produced by a signer to seek following/agreement, which can be summarized as 'you know?' or 'do you agree?' Instead, these finger points showing following/agreement tell another signer 'ah, I see' or 'yes, I agree with what you are saying,' and thus are examples of conversational feedback or backchanneling. Interactional finger pointing that served these functions was common in the data (n = 132, 28.3%), second only to turn-taking functions (presented in the next section). Figure 4 shows examples of finger pointing as feedback that occurred in a conversation with three deaf interlocutors, although this sequence involves only two of the signers. ES (the woman on the left in the video frames in Figure 4) produces a series of points that show that she is following ULA's (the woman on the right in the video frames in Figure 4) signing. The example comes after ES recounts her experience learning to vocalize different sounds, and how some sounds would cause a paper placed in front of her mouth to blow over, while others would not. The example begins as ES signs 'I remember that the letter p [would cause paper to blow over],' where she ends up looking at ULA. ULA had tried to add a clarification to ES's story before the example begins, and as she gets ES's gaze, ULA waves her hand and again begins, 'hey, the letter d doesn't make the paper blow down' (top row of Figure 4). During ULA's production of the sign quiet (top row of Figure 4), ES begins to produce two interactional finger points towards ULA, coupled with two head nods and a mouthing, ahh, showing that she understands ULA's qualification to her story (see the first and second images on the left in the middle row of Figure 4). While ES produces these two finger points, ULA looks at her hands, fingerspells the letter t, and then gazes back to ES to explain that the letter t will cause a paper to blow over. ES responds with yes and another interactional finger point (see the rightmost image in the middle row of Figure 4), again showing that she is following these additional clarifications. This utterance is accompanied by a series of head nods that continue while ULA explains that the letter p will result in the same effect. ULA then contrasts this with the letter b, which will not blow paper down. ES catches this contrasting example and responds with an overlapping interactional point (see bottom row of Figure 4) and a waving action with the palm facing ULA (to indicate negation). Throughout this sequence, ES indicates that she is following ULA's explanation and that she now understands/agrees with which letters (sounds) go with which effects (also signaled through the frequent head nodding that accompanies this sequence). This example demonstrates how finger pointing as conversational feedback guides the trajectory of moves across the interaction.

Figure 4: (Ferrara & Bø 2015, P-OO1_ES.eaf, 35:17.456-35:24.708).

Turn-regulating functions of pointing
In the study corpus, signers most frequently used interactional finger pointing to coordinate turn-taking (n = 211, 45.3%), a function that has only been mentioned very briefly in the signed language linguistics literature (Baker 1977; Van Herreweghe 2002). Specific turn-regulating functions included giving turns to other signers (often in question contexts) (n = 85) and taking turns (both in response to another signer giving a turn and through self-selection) (n = 76). In addition, signers used finger pointing to indicate that the turn was open for someone to take (n = 7) or that the current signer simply wanted to pause their turn (n = 23), for example, to allow for a small insertion or comment from another interlocutor (similar to an example from spoken French described in Mondada 2007). Finally, in some contexts, signers would use finger pointing to help guide the gaze and attention of other signers (n = 20). For example, it might happen that interlocutor A is looking at interlocutor B, while another interlocutor (C) is the one signing. In that case, interlocutor B might point to interlocutor C as a way to indicate to interlocutor A that they should shift their gaze to interlocutor C.
An example of using finger pointing to regulate turn-taking occurs in a conversation with three interlocutors (two deaf and one hearing), although this specific sequence only involves the two deaf signers. Prior to the start of the example, TVG (the deaf woman sitting on the left in the double video frame in Figure 5) has been talking about the layout of offices at her work. She explains that the building is a square shape, with the interior being an open space. The example starts as TVG explains that the office of TJ (the deaf man sitting in the middle of the double video frame in Figure 5), is located along the hallway that goes around the open area (see the top row in Figure 5, where TVG signs 'between your office, which is on the other side of the hall').
As she goes on to immediately start depicting where the rooms are placed in relation to this open area (top row in Figure 5, ds:room), TJ produces an interactional finger point (pt:int) followed by the sign actually. In this way, TJ indexes what TVG is saying by pointing to her/her signing, while simultaneously self-selecting for a turn. However, TVG does not see this interruption, possibly because her gaze is engaged in indexing her own signing in that moment rather than being directed at TJ. Because he has not yet received TVG's gaze, TJ produces a larger finger point, which approaches TVG's signing space and peripheral vision (compare TJ's finger point in the image in the top row with the one in the middle row of Figure 5). This interactional finger point is followed by a false start and then the sign correct. Only then does TVG shift her gaze to look at TJ (at 00:36:20.2). TJ then points once again to TVG and her signing space to index the topic she has just been discussing, signs actually, and then shifts his gaze in front of him while signing that 'before it used to be three offices, I think' (see bottom row of Figure 5). This example demonstrates how a signer can use finger pointing to self-select for a turn, while also creating time and space for other interlocutors to redirect their gaze and attention. Once TJ received TVG's attention with his interactional finger pointing, he was able to continue with what he wanted to add to the conversation.

Discussion
The findings and examples presented above surveyed the various interactional meanings of finger pointing observed within a set of Norwegian Sign Language conversations. They are in many ways similar to the interactional meanings expressed by different types of manual and non-manual bodily actions in various spoken language contexts (as reviewed in Section 1.3). Even though the data analyzed for this study had much in common with previous research, the functions identified should not be considered exhaustive (nor, for that matter, should those identified for spoken language interaction). It is expected that signers may also express other types of interactional meanings, either through finger pointing or through other types of bodily actions. These possibly include, for example, points used to 'move aside topics,' as described by Streeck (2009a), or to 'interrupt' ongoing discourse, as observed by Kamunen (2018). It is hoped that future work on more spoken and signed language interaction in different contexts will reveal how communities of speakers and signers use bodily actions for interactional purposes and how language is shaped by this use.

Figure 5: An example of a Norwegian signer using pointing to self-select for a turn (while linking to the talk of an interlocutor) (Ferrara & Ringsø 2017).

An important consideration for the current study is how finger points and the meanings they prompt fit into a theory of (signed) language. Analysis showed that signers finger point for a range of referential and interactional functions. Many of these functions align with previous descriptions in the literature and support findings demonstrating the essential nature of finger pointing in signed language (Engberg-Pedersen 2003; Liddell 2003; Cormier et al. 2013; Johnston 2013). For example, pointing made up a total of 19.6% of all manual sign tokens examined in this study, and many of these points served the three main functions discussed in the literature: pronominal (55.6%), locative (17.3%), and determinative (2.8%). Points outlining paths were also very common in the data (13.8%).
However, it became clear through multiple parses of the data that signers engaged in conversation also used finger pointing for a number of other functions, which served to index and regulate aspects of the interaction itself; these represented 8.3% of all the finger pointing in the dataset (based on dominant hand glosses). These interactional functions were the fourth most frequent function of finger pointing in the study corpus, after pronominal, locative, and path points. These figures indicate that signers frequently leverage the indexicality of finger pointing to coordinate emerging interaction in Norwegian Sign Language, and that these functions should be given more consideration in future studies of signed language pointing more generally. Their frequent use here also underscores the importance of investigating diverse text types, and the caution needed when using, for example, narrative re-tellings to make generalizations about signed language use.
The analysis presented here also highlighted the multifunctional nature of finger pointing in Norwegian Sign Language. One example of this was provided in Figure 3, where a finger point served both interactional (seeking and turn holding) and referential (second person pronominal) functions. In another example (illustrated in Figure 2), a signer's finger pointing toward his interlocutor showed an acknowledgement of her previous comments. It also acted to mark the common ground that had emerged over the course of their conversation. Furthermore, the fact that signers can point to express both referential and interactional meanings suggests that other types of signed language actions may also serve multiple functions. Future research could consider how other types of signed language actions also express interactional meanings, e.g., in line with the research on palm-up actions (Lepeut 2018; submitted) or on how signs are timed and coordinated across turns (e.g., Groeber & Pochon-Berger 2014).
General theories of language have yet to fully integrate the contextual and interactional nature of language, even though these aspects align with cognitive-functional, usage-based linguistics. There is also specific work that directly addresses how such aspects of language use are essential to linguistic theory (e.g., Kress 1976; Silverstein 1976; Washabaugh 1981; Halliday 1985; Johnston 1992; Bavelas 1994; Couper-Kuhlen & Selting 1996; Langacker 2001). Prioritizing one function (propositional, referential) while dismissing others (interactional) cannot provide a comprehensive account of language and distorts the complexity of language as it is actually used by speakers and signers. Thus, future work could focus more on these "pragmatic" functions of language so that they can be fully explicated and integrated into theoretical thinking. The pointing investigated in the current study is but one example of this type of work.

Conclusion
This paper has reported on a study of finger pointing in Norwegian Sign Language conversations that serves to index aspects of the emergent interaction, and not just discourse referents. Findings from a corpus of Norwegian Sign Language showed that signers frequently point as a way to 1) deliver information (e.g., indicate common ground), 2) cite previous contributions to the interaction, 3) seek a response from an interlocutor, 4) coordinate turn-taking, and 5) provide conversational feedback. These functions have not been previously considered in linguistic studies on pointing in signed languages, which have primarily focused on referential functions. However, work on co-speech gesture in spoken language interaction has shown that speakers express similar interactional meanings through various manual indexical actions, such as manual pointing and other gestures, including palm-up gestures (see Section 1.3). While direct comparisons between the findings of this study and other literature are difficult due to different styles in reporting on the form of co-speech gestures and the focus of analysis, it is clear that finger pointing and other manual indexical actions, such as palm-up actions, are used for interactional functions in both spoken and signed language communities in very similar ways (see Table 5). Indeed, a few recent studies that directly compare the speakers and signers of a community have observed many similarities across groups (e.g., Shaw 2019 for English). These interactional functions are as important to meaning-making as pointing to indicate referents and locations as part of propositional meanings. In addition, finger pointing in signed language interaction can express both types of meaning, sometimes simultaneously, which makes drawing distinctions between which tokens are to be considered under the purview of linguistics, and which are not, untenable.
Investigating (multimodal) language in conversation provides an opportunity to further develop a theory of language that accommodates the multimodal semiotic diversity and complexity inherent in face-to-face interaction. This diversity and complexity concerns, on the one hand, semiotic mode, namely an interplay between description, depiction, and indication (Peirce 1955; Clark 1996; Dingemanse 2013; Kendon 2014; Hodge & Ferrara 2018; Keevallik 2018). On the other hand, it entails different kinds of meaning, e.g., ideational, interpersonal, and textual (Halliday 1985); meaning exchange and presence manipulation (Washabaugh 1981); referential and pragmatic (Silverstein 1976). This study has contributed to this goal by providing a preliminary description of how signers are able to use finger pointing (a type of indication) not only to index discourse referents (referential meaning), but also (sometimes simultaneously) to index aspects of the conversation itself (interactional meaning).

Abbreviations
In the figures, Norwegian Sign Language (manual) signs are represented through capitalized English glosses, as is customary in signed language research. Pointing signs are annotated with the prefix pt, depicting signs are annotated with the prefix ds, constructed actions (bodily enactment) are annotated with the prefix ca, and fingerspelling is annotated with the prefix fs.

Supplementary files
The data, ELAN files, and supplementary materials used for this study are openly available via the Open Science Framework at https://osf.io/g8zv6/.