Meaning non-verbally: The neglected corners of the bi-dimensional continuum of communication in people with aphasia

The potential for pragmatic insights to be enriched, and even generated, from investigation of people with communication disabilities has been vastly underutilised in theoretical pragmatics. An adequate pragmatic theory must account for the full range of human communication, including that of people with communication disabilities. A similar argument has been made regarding pragmatic explanations of the natural non-verbal behaviours accompanying speech, which have lagged behind exploration of non-natural linguistic meaning. These two domains (pragmatic research into the meaning of non-verbal behaviours and clinical research into the communicative strategies of people with aphasia, the communication disability that commonly follows a stroke) have the potential to inform each other. This paper builds on the idea that a relevance-theoretic ostensive stimulus is typically a complex of linguistic elements, which usually convey propositional information, and non-verbal behaviours, which carry emotional or attitudinal information that supplements the verbal content. Many people with aphasia, however, rely much more heavily on the use of non-verbal behaviours. What do these convey? How can what is conveyed best be described and explained? This paper will use the 'bi-dimensional continuum', in which meaning and showing are plotted against determinate and indeterminate intended import (Sperber and Wilson, 2015, p. 147), to demonstrate the complexity of non-verbal communication in dyads where one partner has aphasia.

© 2021 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).


Relevance and meaning
Relevance theory (Sperber and Wilson, 1986/1995) is a theory of cognition and communication which has become extremely influential in the fields of linguistic pragmatics, psychology, cognitive science and the philosophy of language. It also turns up in areas where you might not expect to find it: in theories of art (Pignocchi, 2018), museum-curating (Simon, 2016), and the study of neuro-aesthetics (Kolaiti, 2017). However, applications of relevance theory to the understanding of communication in the clinical domain have been rather limited and the potential for mutually beneficial research remains untapped (Jagoe, 2020; Jagoe and Smith, 2016). This paper takes the first tentative steps towards establishing it as a theoretical framework within which the gestural strategies of one particular population, people with post-stroke aphasia, might be characterised. There have been numerous attempts to characterise both the form and the function of gesture generally (McNeill, 1992; Kendon, 2004), as well as in relation to aphasia (e.g. Sekine et al., 2013; Dipper et al., 2015; Kong et al., 2015), and we suggest that relevance theory provides a more coherent and unified approach.
Relevance theory is an inferential model, in which human communication revolves around the expression and recognition of the speaker's intentions in the performance of an ostensive stimulus: an act accompanied by the appropriate combination of intentions. This inferential model is proposed as a replacement for the traditional code model of communication, according to which a speaker simply encodes into a signal the thought they wish to communicate and the hearer retrieves their meaning by decoding the signal they have provided. We will argue that much existing work on gesture remains rooted in a code model.
One important way in which the inferential model used in relevance theory departs from other philosophical approaches to the study of meaning (see, for example, Grice 1957, 1968, 1969) is that there is an element of looseness built into the framework. To a certain extent, a degree of looseness is inherent in any inferential model, since the thought communicated to the hearer is only ever in a relation of resemblance with the thought entertained by the speaker; in a code model it is in a relation of identity. But relevance theory goes further and differs from other approaches in two ways, both of which are particularly useful when it comes to the analysis of the weaker aspects of communication: attitudes, impressions and emotions, for example.
Firstly, while Grice's account involves an informative intention which attempts to modify directly the hearer's thoughts, Sperber and Wilson propose an intention that is better characterised as one to modify a hearer's cognitive environment: in other words, all the facts or assumptions that he is aware of, as well as all those he is capable of becoming aware of. In relevance-theoretic terms, this is characterised as the set of facts that are manifest to him (i.e. that he is capable of perceiving or inferring). Manifestness is key to the relevance-theoretic characterisation of an informative intention, defined as an intention 'to make manifest or more manifest to the audience a set of assumptions I' (Sperber and Wilson, 1986/1995: 58).
The second difference is that, traditionally construed, the domain of pragmatics is co-extensive with cases of what Grice (1957) termed non-natural meaning (meaning-NN): instances in which the certain combination of intentions referred to above can be shown to separate them from cases of mere intentional showing. In any case of intentional communication there are two layers of information to be retrieved. The first, basic layer is the information that is being pointed out, or conveyed. The second is the fact that this basic layer is being pointed out intentionally. For a case to be one of meaning-NN, Grice argued, it is essential that recognition of the second layer be involved in the retrieval of the first. In cases of showing, he argued, the second layer need play no role at all, since a hearer can get to the first layer without reference to the intentions of the speaker. (He uses intentionally showing someone a photograph as an example.) Sperber and Wilson argue that, actually, no such line should be drawn. Instances of both meaning-NN and showing count as cases of overt intentional (or ostensive) behaviour and cannot be ignored in theories of utterance interpretation. Moreover, there is a continuum of cases between the two extremes. Sperber and Wilson (2015) extend the continuum originally presented in Relevance to include another dimension, between determinate and indeterminate intended import, which effectively turns the straight line into a square (see Fig. 1).
The showing-meaning-NN continuum is defined by the type of evidence presented for the basic layer of information being communicated. If your friend asks you the time and you point to a clock on the wall, you have provided direct evidence of the basic layer: this is a case of showing. A coded response, such as an utterance of 'Half-past ten', is considered to be indirect (since you need to know the code) and is an example of meaning.
The continuum between determinate and indeterminate import is defined not by the type of evidence being presented but, rather, by the nature of the information that is being pointed out, irrespective of whether it is shown or meant. When I point to the clock or utter 'Half-past ten', what I am showing or meaning is highly determinate: a particular time of the day (plotted in the vicinity of 7 in the case of pointing to the clock, and in the vicinity of 1 in the case of uttering 'Half-past ten'). The highly abstract, figurative language in a poem is a case of indeterminate meaning-NN: often, what is meant is too nebulous to be paraphrased at all. But some cases of showing are equally indeterminate. If, on seeing or hearing the time, you place your palm on your forehead and let out an ostensive sigh of disappointment or frustration, you might encourage me to entertain thoughts and feelings that are similar to your own (this could be represented in the vicinity of 9 in the bi-dimensional continuum). What you intended to convey to me was an impression, as incapable of being spelt out in words as a poetic metaphor. You did not mean any one thing.
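For readers who find it helpful, the geometry of the square can be sketched as a small data structure. The following Python sketch is purely our own illustrative construction: the zone numbering is an assumption consistent with the 'vicinity of 1/4/7/9' references used in this paper, not notation taken from Sperber and Wilson (2015).

```python
# Illustrative sketch of the bi-dimensional continuum as a 3x3 grid.
# Columns: type of evidence (meaning -> meaning/showing -> showing).
# Rows: determinacy of the intended import.
# Assumed numbering, consistent with this paper's examples:
# 1 = determinate meaning, 7 = determinate showing, 9 = indeterminate showing.

EVIDENCE = ["meaning", "meaning/showing", "showing"]
DETERMINACY = ["determinate", "semi-determinate", "indeterminate"]

def zone(evidence: str, determinacy: str) -> int:
    """Return the zone number (1-9) for an (evidence, determinacy) pair."""
    return EVIDENCE.index(evidence) * 3 + DETERMINACY.index(determinacy) + 1

# The worked examples from the text:
print(zone("meaning", "determinate"))    # uttering 'Half-past ten' -> 1
print(zone("showing", "determinate"))    # pointing to the clock    -> 7
print(zone("showing", "indeterminate"))  # the ostensive sigh       -> 9
```

On this layout, the 'neglected corners' discussed below are the zones furthest from 1: indeterminate meaning, indeterminate showing, and their blends.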

Gesture and meaning
One of the most famous theoretical notions used in the analysis of gesture is Kendon's continuum, along which different forms of gesture are plotted between natural gesticulation and fully formed linguistic codes, such as Irish Sign Language (ISL) (see Fig. 2).
As we move from left to right on the continuum, the gestures take on properties that are more language-like, and their use depends less and less on the co-presence of language itself. The most non-language-like gestures are those movements classified as gesticulation: what McNeill describes as 'unwitting accompaniments of speech' (1992: 72), spontaneous movements of the arms and hands that accompany speech. Communicators are either unaware or, at best, only marginally aware of these gestures.
Language-like gestures are similar to gesticulations but are integrated into utterances in the sense that they must occur at a certain point and contribute to the interpretation of the string as a whole: so, someone describing an altercation between two people might utter 'at first it was civilised enough, but then he [gesture which can be interpreted as representing a push] shoved him'. Pantomimes are movements which clearly depict objects or actions, with accompanying speech no longer obligatory to contextualise them. Emblems are those entirely culture-dependent symbolic gestures used to convey a wide range of both positive and negative meanings: the British two-fingered insult, or the more widely understood 'flipping-the-bird' insult, are two examples. Finally, sign languages are, of course, properly linguistic systems, with their own syntactic, semantic and phonological rules.

Wharton (2009) argues that while the continuum has generated a number of useful insights, the fact that it is based firmly in the code model of communication renders it highly problematic: even at the leftmost end of the continuum, after all, the relationship between the gesture and what is conveyed is taken to involve the production of a coded signal and the decoding of that signal. The intentions of the speaker are nowhere in the picture, and an important dimension of the debate concerning both the form and the function of gestures is missing. Might Kendon's continuum somehow be extended to incorporate the insights of the bi-dimensional showing-meaning continuum? We think not. Firstly, pure cases of showing can hardly be described as codes. When you draw someone's attention to something by merely altering the direction of your gaze, you can hardly be said to be encoding anything. Secondly, codes are by their very nature determinate. Codes are systems that reliably map signals onto messages. The relationship between the coded signal and what it means must not only be stable, but also clearly defined.
We claim, however, that the showing-meaning continuum can easily accommodate Kendon's continuum. As far as form is concerned, the coding and decoding process fits neatly into the picture (in the vicinity of 1). Regarding function, and as we will try to demonstrate in this paper, the bi-dimensional continuum describes the full range of communicative possibilities.

Gesture use in people with aphasia
Aphasia is usually defined as a language impairment acquired as a result of a focal neurological lesion, most commonly caused by a stroke. It is characterised by varying degrees of impairment in language comprehension and production which, in interaction with features of the social environment, result in a communication disability affecting participation and quality of life. For some people with aphasia, gesture may be an important communicative modality, particularly if verbal communication is very limited. There are two primary lines of research regarding the function of gesture use in aphasia. The first relates to gesture use as communicative, occurring either alongside speech or in the absence of speech to 'convey' meaning or augment verbal content (e.g. Sekine et al., 2013). The second relates to the use of gesture as facilitative in moments of word-finding difficulty (e.g. Lanyon and Rose, 2009). Relatedly, gesture interventions may focus on improving the use and quality of gestures as a compensatory strategy, or on the use of gesture as facilitative for verbal output or word finding (Rose et al., 2013). What is common across these lines of investigation is the focus on the semantic content conveyed (or facilitated) by the use of gesture.
Alongside research on the function of gesture for people with aphasia is research on the types of gesture employed. The Kendon continuum has been influential in the field, and the focus has largely been on gestures which are symbolic (Rose et al., 2013), both in terms of descriptive research and studies of gesture interventions. Studies on gestures which are considered 'semantically rich' (Kistner et al., 2019, p. 4) or 'meaning-laden' (Sekine et al., 2013, p. 1042) dominate the literature. Iconics (gestures which represent features of concrete objects or actions; McNeill, 2000) and pantomimes have been most commonly investigated in aphasia. Additional forms of semantically rich gesture, such as air-writing and number gestures, have been proposed (Kistner et al., 2019).
The role of deictics, where a referent is indicated, and beats, where rhythmic gestures are timed with speech, has been less investigated in aphasiology. To a lesser extent, so have metaphorics, where a gesture represents an abstract concept. Indeed, the limitations of the code model become apparent from the differences in how these gesture types are described. While Kistner and colleagues include metaphorics in their category of semantically rich ('meaning-laden') gestures, Sekine et al. (2013) classify metaphorics as 'abstract gestures', along with deictics and beats, the latter two being considered semantically empty by Kistner et al. (2019). The findings of Kistner and colleagues (2019) suggest that gestures may play an augmentative role, for example conveying attitudinal information. The potential communicative role of gesture in terms of non-propositional information and vague communication has been vastly underexplored, and we suggest that relevance theory can provide a framework in which to understand the indeterminacies of gesture.
In an attempt to bring together the work on form and function of gesture in people with aphasia, Kong and colleagues (2015) propose a coding scheme which identifies the form of the gesture used (based on Kendon's continuum) and categories of gesture function. The coding system identifies eight primary functions of gesture, listed in Table 1. Although this coding system appears to provide a way to distinguish between communicative gesture and gesture functioning for intrapersonal facilitation of lexical retrieval or sentence reconstruction, on closer analysis it is difficult to map onto ostension. For example, function 1 would appear to capture an ostensive, communicative use of gesture, but it can contain a non-ostensive dimension too. A gesture of a person with aphasia may 'add the information' that the person has a hemiparesis of the dominant arm. This information is relevant to the clinician, but not intentionally conveyed by the person with aphasia. Similarly, we suggest that there is a continuum of cases between function 1 and function 2.
For example, we will show an extract in which the participant gestures by shaping index finger and thumb into an incomplete rectangle representing a card while producing the word 'licence'. Whether this gesture fulfils function 1 or function 2 in the coding scheme above depends on the context. If the gesture is produced (as in this case) in a discussion about an event in one's life, then the gesture may merely enhance the speech content. However, in Ireland, that gesture may add relevant information to the speech content, as there has been a transition from larger (more awkward) cardboard licences to the more desirable waterproof card type. One can imagine instances in which that distinction is relevant and adds to the speech content, such as if one had put the licence through the wash. By contrast, function 6, which initially appears likely to be non-ostensive, may have an ostensive element. For example, a gesture may be used to signal difficulty in finding a word, or as an alternative to the lexical item during that moment of difficulty. The hearer may be able to interpret the intended import from the gesture, which may be determinate (e.g. in the vicinity of 7, if the gesture allows), or may be a less determinate impression: perhaps the individual has something specific to say and is attempting to find the right word (in the vicinity of 8 or 9). Note that the bi-dimensional continuum allows for these functions to be analysed as such, including with the nuance of whether the gesture resulted in lexical retrieval or not. For example, a person with aphasia might produce a determinate gesture (in the vicinity of 7), followed immediately by retrieval of the lexical item (an utterance in the vicinity of 1).

Table 1
Functions of gesture identified in the coding system of Kong et al. (2015).
Description of Function (Kong et al., 2015):
(1) providing additional information to message conveyed
(2) enhancing speech content
(3) providing alternative means of communication
(4) guiding and controlling flow of speech
(5) reinforcing the intonation or prosody
(6) assisting lexical retrieval
(7) assisting sentence reconstruction
(8) no specific function deduced

There is clinical relevance in understanding how gesture either functions as part of the communicative intent of the person, or facilitates intrapersonal processes such as lexical retrieval. We suggest that relevance theory provides a clearer way to think about these core functions. According to the relevance-theoretic account, gestures do not need to be classified according to a priori categories, but rather exist as ostensive stimuli that provide evidence (either in tandem with speech, or alone) as to the speaker's intentions.
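To make the contrast concrete, the reanalysis proposed here can be sketched in code. This is a hypothetical illustration of ours, not part of Kong et al.'s (2015) scheme: it records the eight function codes from Table 1 and then re-describes the worked examples above as zones on the bi-dimensional continuum (assuming the numbering used in this paper, where 1 = determinate meaning, 4 = determinate meaning/showing, and 7-9 = showing at decreasing determinacy).

```python
# Kong et al.'s (2015) eight gesture functions (Table 1), keyed by code.
KONG_FUNCTIONS = {
    1: "providing additional information to message conveyed",
    2: "enhancing speech content",
    3: "providing alternative means of communication",
    4: "guiding and controlling flow of speech",
    5: "reinforcing the intonation or prosody",
    6: "assisting lexical retrieval",
    7: "assisting sentence reconstruction",
    8: "no specific function deduced",
}

# Hypothetical re-description of the examples discussed in the text:
# a single function code can map onto several continuum zones, which is
# the point of the relevance-theoretic reanalysis.
reanalysis = {
    1: {4},        # 'licence' gesture alongside the word: meaning/showing
    6: {7, 8, 9},  # word-search gesture: determinate or indeterminate showing
}

# One a priori function label underdetermines the continuum position:
print(len(reanalysis[6]))  # prints 3
```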
This exploratory paper has two main aims. Firstly, we will use relevance theory, and the bi-dimensional continuum, to explore the potential of a unified account of ostensive communicative gesture in people with aphasia. Secondly, the mere existence of the bi-dimensional continuum allows for a first exploration of what we call its 'neglected corners': instances of communication in which propositional content is difficult to specify and what is conveyed is either vague or indeterminate. We will demonstrate that indeterminacy and impressions, perfectly common among typical speakers but not even visible in the code model of communication, may be intentionally and effectively used by a person with aphasia.

Participants
Five participants with aphasia from the AphasiaBank dataset (MacWhinney et al., 2011) were included in the sample for this exploratory work. AphasiaBank is a multimedia database of interaction and language tasks involving people with aphasia, made up of several corpora drawn from aphasia researchers across the world. The data included within AphasiaBank have been cleared by the relevant Institutional Review Boards, participants consent to their data being included in AphasiaBank for research purposes, and the database is accessible to members of the AphasiaBank Consortium. Participants were included on the basis of high levels of gesture use in a previous analysis by van Nispen et al. (2017). These participants are drawn from two corpora within the AphasiaBank dataset: the SCALE corpus (https://doi.org/10.21415/MPQN-W212) and the Tucson corpus (https://doi.org/10.21415/20DX-AR29).
Table 2 outlines the characteristics of the participants. For the purposes of this paper, illustrative extracts of conversation are used from four of the participants (Scale01a, Scale04a, Scale15a and Tucson13a) in order to provide clear exemplars of positions 4–9 on the bi-dimensional continuum.
The participants' ages ranged from 58 to 78 years (mean = 67.2). Of the five participants, four were male, and the duration of aphasia ranged from 3.8 years to 30 years (mean = 16). All were monolingual English speakers. Participants presented with either Broca's (n = 2) or conduction aphasia (n = 3). All participants had moderate aphasia according to their Aphasia Quotient scores on the Western Aphasia Battery (Kertesz, 1982), with scores ranging from 53 to 73 at the more severely affected end of the continuum. Participants were those reported by van Nispen et al. (2017) to use higher numbers of 'essential gestures', defined as gestures required for the meaning of the utterance to be understood, with a range of 32–67 essential gestures used during interview (mean = 49.4).

Data and analysis
Gesture use in the self-report on speech, the stroke story, life event and non-protocol tasks (where available) was reviewed for each of the five participants. Movements of the arms or hands were included in the analysis when they were considered to be ostensive. Gestures which comprised a 'turn' in the interaction (that is, the interlocutor responded to the gesture as such), as well as gestures which accompanied speech, were treated as ostensive. The transcription generated within AphasiaBank was used, and further detailed transcription of gesture was undertaken. Using the conventions described by Damico and Simmons-Mackie (2002), gesture is represented in double parentheses in the line above the primary orthographic transcription. In line with the exploratory nature of the project, gestures which appeared on initial review to represent the 'neglected corners' of the bi-dimensional continuum were identified from the dataset. Gestures which the authors agreed represented one of the areas of interest on the bi-dimensional continuum were then analysed, and their position on the meaning-showing continuum and the determinate-indeterminate continuum explained and justified. This analysis yielded a visual 'mapping' of gesture onto the bi-dimensional continuum. The authors discussed the analysis as it progressed and agreed on the positioning within the bi-dimensional continuum. Analysis of the intended import of the gestures was based on the context within the interaction. Wherever possible, the response of the hearer, as well as the subsequent clarification by the person with aphasia, was used to inform the interpretation. However, given the nature of the dataset (in which the purpose was to elicit speech from the person with aphasia), the clinicians did not always provide a follow-up or interpretation of the utterance for verification. In some instances, biographical details available on the participants provided clarifying information.
Inter-rater agreement was not calculated. For the purposes of this paper, four exemplars, each from a different participant, are presented. Given the exploratory nature of the study, we present exemplars selected to illustrate instances of gesture use across the 'neglected corners' of the continuum.

Analysis
Four extracts are presented in the analysis which follows. Each is analysed in relation to the bi-dimensional continuum, with at least one instance of gesture use portrayed.

Extract 1 (from Scale04a)
In the extract below, from Scale04a, the participant uses a sequence of gestures in response to the question "how's your talking?" In this example, what is communicated is an impression: the intended import is indeterminate and not easily paraphrasable, including that Ginger finds talking challenging but not impossible, that it differs depending on the communication partner, and perhaps that some conversations flow more naturally than others. In traditional accounts of gesture, analysis (and interpretation) would paraphrase the gesture to yield a proposition of some sort, perhaps that 'speaking is difficult'. Note, however, that this code-model approach loses much of the impression communicated by the sequence of gestures. Relevance theory offers a more satisfactory account of the gesture in this instance, an account which fits with the intuitive sense of what Ginger may have intended to communicate. We suggest that analysing the use of gesture as falling within the vicinity of 8 on the bi-dimensional continuum (a case of semi-indeterminate showing) captures all of the relevant information in this case: that Ginger is using gesture as an alternative to speech ('showing') and that she is conveying an 'impression' in the relevance theory sense of the word. Ginger makes an array of propositions (including but not limited to those above) more manifest to the clinician. In this case, as the speaker, she is not committing to any particular proposition, but allowing the clinician to accept any of the assumptions made more manifest, which also fulfil the expectations of relevance created by the clinician's question in line 20. Note that other information is also made manifest through Ginger's gesture, including the assumption that her right arm has been affected by the stroke. However, in the absence of any evidence that Ginger used only her left arm in order to deliberately communicate this fact, the clinician might note it as a clinically meaningful piece of information, but it does not form part of the intended array of propositions.

Extract 2 (Tucson13a)
In extract 2, the participant (Tucson13a) is responding to the request to describe something important that has happened in his life.
In line 184, the participant demonstrates the use of a gesture with a relatively determinate intended import, shaping index finger and thumb into an incomplete rectangle representing a card while producing the word 'licence'. The gesture is easily paraphrasable and occurs alongside the verbal output, creating a multimodal utterance. This utterance can be analysed as a case of determinate meaning/showing (in the vicinity of 4 on the bi-dimensional continuum). In this example, the gesture and verbalisation complement each other and are therefore analysed as a case of meaning/showing. However, imagine that this participant produced the same gesture while saying "coffee" (i.e. the gesture provides an alternative means of communication). The bi-dimensional continuum can accommodate this by considering the utterance as having both a determinate meaning element and a determinate showing element (which do not overlap in terms of intended import). In this case, the interpretation will be guided by expectations of optimal relevance. Note that the unified approach taken here avoids the need for an extensive a priori list of possible functions of gesture, with no need to invoke any further explanations as to whether the gesture or the spoken output should be taken to be the intended message.
Participant Tucson13a also demonstrates instances of gesture use in which the impression communicated goes beyond a single paraphrasable proposition. Line 186 represents a case of indeterminate showing (in the vicinity of 9 on the bi-dimensional continuum). The gesture, with the vocalisation of a sighed 'oh!', weakly communicates an array of propositions and attitudinal information, including that he felt a sense of relief at regaining his driver's licence. Lines 190–193 could be analysed as falling in the vicinity of 6: indeterminate meaning/showing. The use of the lexical items 'thousand' and 'back and forth', combined with the broad arc-like gesture, communicates an impression of travelling great distances; more weakly communicated is the impression that the driving was for no particular purpose other than enjoyment, that he was travelling alone, perhaps that it was on open roads out of the city. Similarly, the pairing of the word "freedom" (line 190) with the open-handed gesture is challenging to 'paraphrase without loss' (a marker of indeterminate import; Sperber and Wilson, 2015, p. 122). Weakly communicated is an impression of an ability to travel wherever he chooses; a regained independence; a sense of pride; a valued sense of movement which may not be possible with his post-stroke mobility difficulties. The way in which this multimodal utterance is produced (the emphasis provided by the gesture, the speech and the gestural prosody) together creates the impression conveyed.

Extract 3 (Scale15a)
The participant in extract 3, Scale15a, is similarly describing his memories following his stroke.In line 24 he uses 'finger counting' while stating "no, no, yes".
The intended import in this case is best explained using the bi-dimensional continuum, which is able to capture the less determinate elements conveyed. Unlike in a code-model approach, in which finger counting would be treated as a straightforward code, in this instance there is clearly a more nuanced impression created. What is communicated is an impression of three elements, experiences, instances or places (later it emerges that these are indeed three health centres), the first two being distinguished from the third, which was more acceptable or correct. As is evident from this gloss, the utterance is difficult to paraphrase without loss, making it a case of semi-determinate meaning/showing (in the vicinity of 5). The clarification provided in line 31, comprising a pointing gesture above head height along with the verbalisation "that one was a good one", further constrains the hearer's inferential process towards identification of a place, related to stroke (given the conversational context), which is identified as being some distance away. Again, the meaning is semi-determinate, but shifts towards the vicinity of 4, depending on the degree to which knowledge of the local health institutions in the area is mutually manifest.

Extract 4 (Scale01a -non-protocol tasks)
Extract 4, from Scale01a, includes a discussion about surgery. In line 30 the participant indicates the nature of the surgery, pointing to his eye and the bridge of his nose. While doing so he uses the word 'here'. However, it is the gesture which conveys the location of where the surgeon operated. This utterance is one of determinate showing (in the vicinity of 7).

Summary of analysis
The four extracts, representing gesture use across four participants, provide evidence for the use of gesture across the bi-dimensional continuum. A visual mapping of the gesture use in these extracts is presented in Fig. 3. We suggest that such a mapping could be a relevant tool for investigation of how people with aphasia use gesture (and multimodal utterances in general).
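The mapping in Fig. 3 can also be represented as a simple table. The sketch below is our own reconstruction from the zone numbers given in the analysis above; the extract labels and gesture glosses are shorthand of ours, not transcription conventions.

```python
# Zones assigned in the analysis of Extracts 1-4, using the numbering
# assumed throughout this paper: 1-3 = meaning, 4-6 = meaning/showing,
# 7-9 = showing, with determinacy decreasing within each band.
fig3_mapping = [
    ("Scale04a",  "sequence of gestures about talking",   8),
    ("Tucson13a", "'licence' card gesture (line 184)",    4),
    ("Tucson13a", "gesture with sighed 'oh!' (line 186)", 9),
    ("Tucson13a", "arc gesture (lines 190-193)",          6),
    ("Scale15a",  "finger counting (line 24)",            5),
    ("Scale01a",  "pointing to eye and nose (line 30)",   7),
]

zones_used = sorted({z for (_, _, z) in fig3_mapping})
print(zones_used)  # the extracts span positions 4-9 of the continuum
```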

Discussion
The participants with aphasia used gesture across the bi-dimensional continuum. Gesture types which have traditionally been considered 'semantically rich' or 'meaning-laden' were used both determinately and indeterminately. A focus on precise, or determinate, 'meaning' in the gesture use of people with aphasia belies the productive use of vague communication, which forms an important part of human communication according to relevance theory.
What of 'unintentionally vague' gesture? People with aphasia have a communication disability, and what we have not discussed in this paper is how the intended import may be imperfectly manifest in the gesture used as a result of the aphasia. It is beyond the scope of this paper to discuss this inevitability in any detail. However, within a relevance theory account, failure of communication is to be expected: communication in this account is not based on a failsafe heuristic (Sperber and Wilson, 1986/1995). Therefore, no specific ad hoc mechanism is required to account for these instances, beyond noting that the utterance failed to achieve optimal relevance for the hearer. Further research should examine how hearers respond to less determinate cases of showing or meaning/showing and, in turn, how a hearer's response impacts on the next 'non-verbal' turn. The nature of the AphasiaBank protocol means that the focus of the interaction is to elicit verbal output. This focus means that at times the clinician does not respond or seek clarification in the way that they might in a more conversational context. It may be the case that indeterminate import is considered by the hearer, in some instances, to be a function of the aphasia rather than a communicative choice. Perhaps because gesture typically takes a secondary role, communicators are less comfortable relying on their inferences, and seek verification through their more typical modality of speech. However, it is also possible that, despite ostensive use, gesture itself provides fewer constraints, or perhaps fewer clues, to guide utterance interpretation.
The question arises as to what the bi-dimensional continuum can offer from the perspective of clinical research. Current approaches to analyzing the function of gesture in people with aphasia (e.g. Kong et al., 2015) provide clinically relevant information. However, such accounts do not distinguish between overlapping functions, nor do they tell us how the listener comes to decide what to attend to (for example, where the two modalities contradict each other). We suggest that the application of the bi-dimensional continuum loses none of the detail of these functions, but instead provides a consistent and coherent theoretical account which treats gesture as an inherent element of ostensive-inferential communication. In this account, gesture can instead be understood as an ostensive stimulus which provides a clue to the 'intended import'. For example, the function of 'providing additional information to speech' can be captured in a more nuanced way using the bi-dimensional continuum. Situated in the vicinity of the meaning/showing zone, the gesture can then be further analysed in terms of determinacy. The hearer's interpretation is driven by expectations of optimal relevance, in the context of knowledge about ability and preferences (e.g. the reliability of the speaker's gesture). Indeed, this model could allow for description of situations in which the gesture directly contradicts the spoken output.
What the account described here does not do is explain how gesture might be used to facilitate word retrieval, an intrapersonal process. What is of interest to us is the interpersonal communicative function of gesture and, as a cognitive-pragmatic theory, relevance theory allows for a nuanced explanation of the ostensive-inferential processes involved in gesture use. However, it may be possible to use the bi-dimensional continuum to map the nature of gesture which appears to facilitate lexical retrieval for a specific individual or group of participants. Mapping across time could reveal an association between an individual's use of gesture in a particular vicinity of the bi-dimensional continuum and subsequent lexical retrieval (which would be mapped in blocks 1–3 on the continuum). In this way, the association and latency between the gesture and the spoken output may signal that the gesture facilitates lexical retrieval, while at the same time allowing for mapping of how determinate the gesture needs to be for successful lexical retrieval.
Conversational data from people with communication disabilities such as aphasia supports the bi-dimensional continuum as a framework which brings together dimensions of meaning/showing as well as graded considerations of determinacy. In turn, as argued in this paper, relevance theory offers a conceptually unified approach with which to explore and explain ostensive-inferential communication, including gesture, in people with aphasia. The analysis of the exemplars presented demonstrates how people with aphasia may use gesture to communicate an impression. Critically, as discussed in the introduction, vagueness or indeterminacy is not pathological, but rather a feature of human communication. Moving beyond the code model may also provide a framework in which to address the intentional use of 'vague' gesture, recognizing that human communication is not consistently determinate.
Fig. 3. Visual mapping of exemplars onto the 'neglected corners' of the bi-dimensional continuum.