Abstract

This article presents an experiential model for community-engaged research that understands communities as living meshworks of embodied human beings, material circumstances, and affective environments. We first trace how community organizations and academics must increasingly respond to a push for hard data. Using an analysis of a national research study on hunger as an example, we then show how this “data imperative” can lead to collecting more and more measurable data on community members without addressing their human-based concerns. The meshworks approach that we suggest emphasizes recognizing participants’ most immediate needs as articulated by participants. As meshworks-inspired research has to be contingent and contextual within the meshworks of the community in which it takes place, we offer examples of what such research can look like in various community settings. Finally, we present a heuristic that community agencies and researchers can use to evaluate their own projects as meshworks while also gathering hard data.

Introduction

John: Once upon a time you used to have people who did surveys, right, and they sat down and did the surveys and they chatted with people. . . . But nobody does that any more, it’s all statistically driven, so everybody is sitting at tablets and computers entering data into forms. I don’t know who still does that. I don’t know if people still think that’s valuable. But it would have been nice if you could pick out of 20 people, two to have a conversation with . . . [because] it would give some, some texture, some flesh to whatever the statistics were. If Dorothy or Alfredo talked for 20 minutes about their life in [location cannot be specified] that might be helpful at least in providing a context for the data.

Many community-focused academic research methods, such as participatory action research and community-based participatory research, importantly emphasize the need to include community members more reciprocally in academic research (Baum, MacDougall, & Smith, 2006; Chevalier & Buckles, 2013; Hacker, 2013; Hammond, Hicks, Kalman, & Miller, 2005; Kemmis & McTaggart, 2005; Minkler & Baden, 2008; Minkler & Wallerstein, 2008; Pinto, Wall, & Spector, 2014). However, complicating this good work is the emergence of what we call the “data imperative,” a push toward collecting measurable quantitative or qualitative data to assess communities’ needs and address their problems. As John, a volunteer data collector and middle-aged preacher, describes above, nonprofit research in communities has become “statistically driven,” pushing agencies and organizations to gather specific measurable data yet losing the value of direct human interaction in the process. Nonprofits need such data to make arguments on grant applications, garner greater public/private support, and demonstrate their community “impact” (Benjamin, Voida, & Bopp, 2018; Bopp, Harmon, & Voida, 2017; Erete, Ryou, Smith, Fassett, & Duda, 2016). Community needs assessments are becoming more widespread as nonprofits are frequently required by grant-funding partners to collect data on unmet needs in their service areas (Becker, 2015; Fischer et al., 2018). Researchers in university settings also participate in the data imperative, as they are increasingly asked to provide evidence of the larger impact of their teaching, research, and community engagement (Driscoll & Sandmann, 2016; Eubanks, 2017; Fear & Sandmann, 2016; Franz, 2014). Given the data imperative across these contexts, vulnerable populations often experience “research fatigue” in which they are repeatedly asked to participate in studies to improve the social services that they use; however, these same people do not always experience direct, tangible benefits from that research, an imbalance that is a social justice concern (Clark, 2008). Missing from these data-driven approaches is attention to the material, affective, and embodied realities of research participants’ lives, which cannot easily be quantified as numerical data or even captured by traditional qualitative methods.[1]

This article presents a more experiential model for community-engaged research that understands communities as living meshworks of embodied human beings, material circumstances, and affective environments. As John describes above, data is richer and more vibrant when it is understood within the “texture” or “flesh” of people’s lived experiences. We rely on meshworks theory (Ingold, 2007, 2011, 2015) to imagine a different way to approach and analyze the “textured mesh” of participants, researchers, and the communities in which they live and work. This theory asks us to see the world as meshworks, or tangled lines, in which our lives intersect with other animate beings: “to live, every being must put out a line, and in life these lines tangle with one another . . . when everything tangles with everything else, the result is what I call a meshwork” (Ingold, 2015, p. 3). In any given location, our “lifelines” are entwined with those of others:

Proceeding along a path, every inhabitant [of the world] lays a trail. Where inhabitants meet, trails are entwined, as the life of each becomes bound up with each other. Every entwining is a knot, and the more lifelines are entwined, the greater the density of the knot. Places, then, are like knots, and the threads from which they are tied are lines of wayfaring. A house, for example, is a place where the lines of its residents are tightly knotted together. But these lines are no more contained within the house than are threads within a knot. Rather, they trail beyond it, only to be caught up with other lines in other places, as are threads in other knots. Together they make up what I have called the meshwork. (Ingold, 2011, p. 149)

Meshworks allow us to visualize communities not in terms of isolated places or people but as interwoven life trails. The “knots” of the meshwork, where the lives of animate beings intersect, are tangled and messy, and “this tangle is the texture of the world” (Ingold, 2011, p. 66). Thinking of communities as having a multi-dimensional, complex “texture” draws attention to the dynamic ways in which we relate to one another, other living beings, and our environments. As part of a given community’s meshworks, a researcher brings her own lifeline into a research site and thus contributes to that site’s “continual coming into being” and to “its weave and texture” (Ingold, 2007, p. 81). Community-engaged researchers must be aware of how their own lifelines entangle within, contribute to, and alter the people and places that they study.

Community-engaged academic research often becomes enmeshed with other forms of research conducted in nonprofits. We have frequently collaborated with nonprofit agencies to conduct research that helps the agency and simultaneously contributes to academic scholarship. For example, Kathryn helped a senior center study how its food insecure clients obtain healthy foods; the center used the results in a grant application that would allow it to continue its food pantry. This same data became part of Kathryn’s larger academic research on how seniors navigate health-based advice amid material constraints such as the inability to afford fresh vegetables. Similarly, Jenny has facilitated numerous service-learning projects to develop methods for assessing self-sufficiency among food pantry visitors, which have resulted in larger academic studies that theorize such work as well as ways to help community members enact research and data justice approaches in their communities (Bay, 2019).

Ensuring that such academic research impacts human lives does not necessarily mean producing more “objective” data. Likewise, if nonprofits want to learn more about the vulnerable populations they serve, they must complement the more empirical data typically required for grant applications, government support, and impact reports with attention to those populations’ lived experiences. By working together and recognizing research sites as meshworks, academics and community agencies can develop more innovative ways to gather data humanely. By analyzing interviews with data collectors for a national community-based study, the Food Insecurity Study (FIS), we highlight the challenges of carrying out academic research methods in community settings. We then present examples of what a meshworks-inspired approach to such research can look like, urging researchers to focus on a research landscape’s knots and textures in order to make visible human experiences not typically reported. Community agencies and academics can use the concept of meshworks presented here to resist the data imperative through a more human-centered approach while also practically responding to the social pressure for hard data. Because we want to create a more human-centered narrative here, each section begins with a quote from one of the FIS data collectors we interviewed.

How the Data Imperative Affects Community-Engaged Academic Research

John: So, they do not have to hire people from the deco to sit at a table with a computer and type those paper forms into something. So, I can see how that saves them money. I can see how that generates the information they did more quickly.

We ground our argument in an interview-based study[2] that we conducted with local data collectors, who helped to gather data for the FIS, a national research initiative. The FIS was implemented by a large nonprofit dedicated to fighting food insecurity[3] in the United States and designed by a statistical company. Initiated over 20 years ago, the FIS was conducted six times, every four years, until 2014. The FIS provides a national picture of hunger as well as local and regional data on food insecurity. Although carried out by nonprofits, studies like the FIS are informed by academic methods in that they go through the institutional review board (IRB) process, and academic researchers often act as consultants in their design.

There has been a gradual shift in methodology for the FIS. The first five instantiations of the study relied on orally administered surveys of food insecure clients about their experiences with hunger. In 2014, the FIS moved to a tablet computer-based survey in order to control for human error and to protect participants’ confidentiality. Participants’ responses were uploaded to a central database, eliminating the need for data entry personnel, as John implies above. We focus on the tablet-based FIS here. Currently, the FIS is no longer conducted every four years. Rather, the national nonprofit uses data analytics from other large-scale studies to develop annual reports on the state of hunger in the United States. These data analytics combine data from the Current Population Survey (CPS) and from the Bureau of Labor Statistics (BLS) along with Nielsen’s food price variation, food budget shortfall, and national average meal costs. We see this trajectory as a disturbing trend away from interacting directly with food insecure individuals to gather information that is purportedly being used to improve their lives.
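To give a concrete sense of what such analytics-based reporting involves, the sketch below shows the kind of arithmetic these estimates can rest on. The figures, variable names, and the formula itself are our illustrative assumptions, not the nonprofit’s actual methodology.

```python
# Hypothetical county-level inputs of the kind named above; all values
# are placeholders, not figures from any actual report.
weekly_food_budget_shortfall = 17.50  # USD per food-insecure person (CPS/BLS-style estimate)
national_avg_meal_cost = 3.02         # USD per meal, national average
local_price_index = 1.08              # Nielsen-style local food price variation

# Localize the meal cost, then convert the budget shortfall into meals.
local_meal_cost = national_avg_meal_cost * local_price_index
meals_missed_per_week = weekly_food_budget_shortfall / local_meal_cost

print(f"Estimated meals missed per person per week: {meals_missed_per_week:.1f}")
```

However tidy such an estimate looks, no food insecure person is spoken to in producing it, which is precisely what troubles us about this trajectory.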

This trend, the data imperative, can be seen across community-based research, academic scholarship, and nonprofit work. We see the data imperative perhaps most explicitly in higher education, where the crushing need to demonstrate one’s value in light of waning public support reigns supreme. The scholarship of engagement (SoE) movement, for instance, encourages faculty to measure, quantify, and assess their service-learning or engagement work in communities (Driscoll & Sandmann, 2016; Fear & Sandmann, 2016). Yet, as Eubanks (2017) highlights, even community-minded academics tend to see their research as “on” or “in” communities rather than “with” communities. Although community-engaged researchers tend to involve community members more frequently in the research process, that process itself still follows a traditionally academic, linear progression motivated by the data imperative: the research starts with specific questions, and researchers collect qualitative or quantitative data through interviews, surveys, focus groups, or clinical trials to answer those questions and to yield publishable results. In order to show the impact of their scholarship in measurable ways that are academically legible, researchers find themselves collecting more and more data on community members rather than participating with them to address shared concerns.

At the same time, the data imperative has independently reached into the nonprofit world as mission-driven organizations are now required by many granting agencies to collect, manage, and analyze data about their services and stakeholders, research which is often conducted in collaboration with academic researchers, as noted above (Erete et al., 2016). As Bopp et al. (2017) argue, rather than being liberating, the forced collection of data becomes stifling: “Instead of leading to productive and empowering data-driven decision making, monitoring and evaluation work is characterized by the erosion of autonomy, data drift, and data fragmentation” (p. 3608). The additional requirements of collecting and reporting data do not just encompass traditional outputs, such as pounds of food distributed or units of housing provided, but also include outcomes, or impacts of provided services. For instance, rather than reporting how many families were reached or how much food was distributed through a school backpack program, many funders now require backpack programs to demonstrate their educational impact, which shifts “metrics and data collection foci in response to externally re-framed missions and priorities, moving the organization towards a mission that is both undefined and unknowable” (Bopp et al., 2017, p. 3616). This push toward the collection of data that may not be in line with an organization’s mission erodes autonomy as data collection is dictated by a variety of outside influences such as boards, granting agencies, government, and the like. Bopp et al. note that “This array of actors exerts influence, impinging upon organizations’ autonomy in making decisions about choosing metrics, compiling and using data, and prioritizing data work” (p. 3611).

Indeed, rigorously collected data can be critical for uncovering social problems, and nonprofits do need such data. For example, when we shared our critiques about the FIS with the food bank director, she told us that she still found the FIS data useful for grant applications and stakeholder information. In short, from the food bank’s perspective, any data, however flawed, was better than no data at all. Recognizing the various purposes of data, we are not arguing against the collection of such data per se. Rather, we examine here how the process of data collection can impact a community setting and the people living their lives within that setting. Specifically, we are concerned about the impact that the data imperative has on vulnerable populations who are asked to provide measurable information about their lives but who do not feel that their lived experiences are being truly heard or understood through such research. As Clark (2008) found, people receiving social services can be so frequently asked to complete a survey or sit for an interview that they experience “research fatigue” in which “they are tired of participating and no longer value the experience or any of the associated outcomes” of participating in research (p. 956). However, as our study shows, the problem runs deeper than weariness from participating in too many research projects: community members feel that conventional research methods render invisible the complexities of their lives and their shared humanity. That is, purely data-driven projects can present a flattened view of the human “knots” in a community’s meshworks by ignoring certain aspects of human experience, even as such research is always already implicated within that meshwork.

The FIS illustrates how the data imperative abstracts the human experience of a social problem, as the national nonprofit sponsoring the study had very strict measures about how the data was to be collected even though those methods did not account for people’s lived experiences—that is, the textured knots of their lives—at emergency food outlets. For example, the strict protocols assumed that participants would be standing in rigid lines outside of food pantries while waiting for food and would be compliant with requests to participate according to the prescribed method. This approach assumed that data collection was a matter of extracting information from people, much like we might collect blood samples for a clinical trial. What we found through our interviews, however, was that human beings could not be isolated into “lines” of data; rather, the research sites were meshworks of intersecting lifelines and pathways, in which we, too, were implicated by our mere presence at the sites.

How Our Own Lifelines Became Knotted With the FIS

MacKenzie: [The FIS training] was nice to know, giving me the guidelines, but I feel like until you are actually thrown into the situation, that’s the best way to do the training . . . I mean that was my first time working in the community type thing like that so just being there was like just responding to the different questions that were asked and responding like that it was more or less just play it by ear . . . day-to-day basis.

Our lifelines led us to participate in the FIS in 2014. Like many community-engaged researchers, we became interested in the FIS as a research project because we were already part of the community that it affected. As long-time volunteers at a local food bank, we volunteered to collect data for the FIS when that food bank decided to participate in the study. As a participating partner, our local food bank was tasked with recruiting and training data collectors to administer the FIS survey at emergency food outlets throughout the 16 counties that the food bank services.[4] Jenny participated in the FIS as a data collector, while Kathryn participated as both a data collector and the data collector coordinator. As such, Kathryn was briefed on the FIS data collection protocols by food bank staff who had attended a day-long training session. Based on this briefing, and a 200-page FIS data collection handbook, Kathryn trained volunteer data collectors on how to sample participants, obtain participant consent, administer tablet-based surveys, and then upload the survey data to a central server. The data collectors ranged from college students seeking summer internships, like MacKenzie above, to graduate students with ample experience conducting field-based research to community members with long-term experience with food assistance, such as John in the previous two sections. After facilitating a two-hour training at the local library, Kathryn scheduled the newly trained data collectors to visit emergency food outlets, where they were tasked with inviting food pantry visitors[5] to complete the over 50-question survey on a tablet computer.[6] The survey asked questions aimed at quantifying what hunger looks like in the United States, including questions on household income, educational level, and frequency of visits to food assistance locations. As MacKenzie describes above, the official guidelines for administering the survey did not prepare data collectors for the contingencies of carrying out the FIS in actual community sites, with real human beings asking “different questions” on a “day-to-day basis.”

Methods

Sacha: Some people were willing [to use the tablet] but others were just completely turned off—and they couldn’t read it, they didn’t have their glasses, it was too small . . . some people just wanted me to like say it to them—they were like “it would be so much faster if you could just read these to me and just check my answers.” So, a lot of people had issues with the tablet . . . on a couple of cases when we used the Wi-Fi units, they just refreshed, and people lost their whole work, which was really embarrassing . . . they would be 20 minutes in and almost like done, and they would lose it. Sometimes we could recover it, and sometimes it was gone.

As data collectors for the FIS, we had an uneasy feeling throughout the study that its methods, and the prescribed way in which data collectors were trained to carry them out, were not taking into account participants’ lived experiences. For example, as Sacha implies, participants frequently asked data collectors to press their answers for them if they were uncomfortable with the tablet technology or had a disability that made manipulating the tablet difficult. However, when we told participants that the study protocol did not allow us to help them manipulate the tablet, many of these individuals would stop the survey in frustration, telling us something like Sacha recounts: “it would be so much faster if you could just read these to me and just check my answers.” As data collectors, we shared their frustration as we recognized that, because of the tablet technology, these individuals were effectively silenced from sharing their experiences with food insecurity.

Meshworks theory recognizes that human beings essentially want to connect with other living beings: human lifelines “are always in the midst of things, while their ends are on the loose, rooting for other lines to tangle with” (Ingold, 2015, p. 22). When visitors wanted the human connection of telling their stories to the data collectors, or “other lines to tangle with,” what they got instead was the frustrating experience of learning to use a new technology, the tablet, when they were already in the vulnerable position of using a social service. Data collectors also experienced the tablet negatively (in Sacha’s case, as embarrassment) when problems with the tablet technology caused visitors to lose their work on the survey.

Based on this unease, we decided to interview the other data collectors after the conclusion of the FIS to see if their experiences matched ours. Specifically, we sought to understand how data collectors had dealt with the study’s methodological limitations while administering the survey on the ground. At the time, we were less interested in interrogating the FIS method and more interested in discovering how the people collecting the data interacted with the people providing that data. With these goals in mind, our original research questions were:

  • R1: What challenges did FIS data collectors face in facilitating a tablet-based survey across diverse community settings?
  • R2: What strategies did data collectors use to meet these challenges and gain client participation?

To answer these questions, we conducted qualitative, semi-structured interviews (Marshall & Rossman, 2006) with eight of the 13 data collectors who had volunteered for our local segment of the FIS. We recruited the data collectors by emailing them to describe our study and request their participation. We conducted these interviews several weeks after FIS data collection concluded. Semi-structured interviews allowed us to ask participants follow-up questions and to provide participants with the opportunity to reflect on their experiences beyond our 11 scripted questions (see Appendix A for interview questions). We also tried to reduce bias by having Jenny conduct the interviews, since Kathryn had worked more closely with participants during the FIS.

We initially approached the interviews using grounded theory, which emphasizes inductively analyzing qualitative information by looking for themes and patterns during and after data collection rather than analyzing data according to predetermined categories (Patton, 2002; Strauss & Corbin, 1990; Teddlie & Tashakkori, 2009). Grounded theory is an iterative process in which qualitative data is continually coded for emergent themes and broken down into comparable categories (Dey, 1993; Patton, 2002; Taylor & Bogdan, 1998). Following this approach, we examined the interview transcripts multiple times throughout our study while coding for patterns. We then compared our individual analyses to identify common themes. As participant observers, we then triangulated the interview transcripts with the field notes that we had taken from food assistance sites we had visited as FIS data collectors, coding these notes according to the same grounded theory approach (Teddlie & Tashakkori, 2009).
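As a rough illustration of the mechanics of this coding process, the sketch below shows how two coding passes over transcript excerpts might be merged and compared; the excerpt IDs and code labels are invented examples, not our actual codebook.

```python
from collections import defaultdict

# Hypothetical codes attached to transcript excerpts during two separate
# coding passes; the IDs and labels are invented for illustration.
coder_a = {"sacha_03": {"tablet frustration", "lost responses"},
           "jane_12": {"rigid answer options", "visible distress"}}
coder_b = {"sacha_03": {"tablet frustration"},
           "jane_12": {"visible distress", "desire to be heard"}}

def compare_passes(a, b):
    """Merge two coding passes, separating agreed codes from codes to discuss."""
    merged = defaultdict(dict)
    for excerpt in set(a) | set(b):
        codes_a, codes_b = a.get(excerpt, set()), b.get(excerpt, set())
        merged[excerpt]["agreed"] = codes_a & codes_b       # applied by both coders
        merged[excerpt]["to_discuss"] = codes_a ^ codes_b   # applied by only one coder
    return merged

for excerpt, result in compare_passes(coder_a, coder_b).items():
    print(excerpt, "agreed:", sorted(result["agreed"]),
          "| to discuss:", sorted(result["to_discuss"]))
```

Codes applied by only one of us flagged excerpts to re-read together, which is how we compared our individual analyses to identify common themes.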

As our analysis progressed, it became difficult to separate the interview data into neat themes. Thus, we started to look for a theory that could help us to more holistically understand the complexities we had found at community sites throughout the FIS, in our interviews with data collectors, and in our own experiences, complexities that were not captured by the FIS’s strict methods. We found that the theory of meshworks—that is, lines of people’s life experiences as entwined in knots in any given place, yet also trailing beyond that place—provided a theoretical framework for understanding the messy, dynamic, and embodied human interactions that we saw and the data collectors described at the community sites (Ingold, 2011, p. 149). We realized, in a sense, that we had been reproducing the data imperative ourselves by looking for separate strands of human experience that we could measure or isolate into neat categories, when those strands were interconnected and enmeshed within the research site. Meshworks theory moved us past independent data points to see the richer lifelines that make up our emplaced, embodied, and intersectional world.

Results

We organize our results around three descriptors for the meshwork of FIS data collection: affective, environmental, and embodied. We present these as intertwined knots rather than as stand-alone themes because they interconnect in meaningful ways, showing where the FIS flattened the human experiences at research sites as well as how data collectors adapted its methods in more textured, human-centered ways.

Affective Knots: “Sometimes you know I don’t eat at all in a day [even] with these services”

Jane: I had several people say to me, they were getting upset, visibly upset because they would try to answer these questions, and they are doing the best they can to answer them honestly because they really want [to], they’re hoping this survey will make a difference in the services that are provided to them and they are not able to answer the question, because it might say, “how many times this week did you have to go without a meal” and there’s not enough options there for them to answer it. It assumes a certain cap, and I’m talking to people who say, “sometimes you know I don’t eat at all in a day [even] with these services,” and there just weren’t the right options for what they wanted to answer, and that’s something that again with a pen and paper you can write it in. With that tablet you’re restricted to the options that they gave you for answers, and the answer isn’t always there because it’s almost impossible to anticipate millions of answers and think of every scenario.

A social science graduate student, Jane highlights here how the lifelines that visitors carried into the community site—the “trailing threads” of their lives beyond the particular meshwork of that site—were not accounted for by the prescriptive FIS survey. There was a disconnect between the reality that the FIS assumed visitors to be living, as indicated by the options that the survey presented, and the dynamic meshwork of people’s lives as they experienced them, as indicated here by the visitors who sometimes skipped more meals than the survey allowed them to report. Her comment shows the affective implications of such a disconnect: visitors became “visibly upset” when they realized that the tablet-based survey would not allow them to share their stories in ways that they felt truly represented what they were experiencing, and some literally walked away.

The tablet technology further distanced the FIS from visitors’ lived experiences, as Jane continued:

I’m sure, [the tablet] streamlines all of the number-crunching that’s going on because they’re able to send it all online, and it’s immediately into this larger nationwide database and not having to go through and read people’s writings, read different people’s writing and all of that, it does I think make it easier to maintain anonymity and privacy but […] you’re going into a community essentially where people are socioeconomically disadvantaged, and they may have never held a tablet before and that’s very intimidating—if you never held a tablet before, to take a survey on a tablet? You know, you might be afraid you’re gonna break it or that you’re going to look silly because you don’t know what you’re doing. There’s an embarrassment factor there. I think there should have at least been the option if somebody really really did not want to use the tablet an option to have somebody administer the survey as it’s been done in the past with paper. I think response rates would have been much higher, I really do, because that was the number one reason when people did give a reason why they didn’t want to, the number one reason they gave was that “I don’t use that thing and I don’t even want to try.”

Here Jane highlights the gap between the FIS’s push to collect data easily and quickly—that is, the data imperative—and the affective, material situations of visitors at emergency food outlets. Similar to Sacha above, Jane recognizes how this push for efficiency became problematic on the ground, as the tablets created the conditions in which potential participants responded negatively to the study. Jane suspects that this response stemmed from participants’ embarrassment or intimidation at not wanting to “break” the tablet or “look silly” using it. Although Jane is careful to mention that she doesn’t want to “overgeneralize or pigeonhole anybody” by assuming that “socioeconomically disadvantaged” individuals lack access to tablet technology, the fact remains that participants often did not want to use it. That is, the tablet created an affective barrier for many visitors, which eliminated a chance for those participants to express their individuality, as they could through a paper survey or in person during an oral interview.

This example shows how the affective knot twists and intertwines into a technology knot, which complicates the ability for participants to be heard and treated with dignity and respect. For participants, it was often disturbing to be asked to use a new technology when they were in the already vulnerable position of seeking social services. Their affective response impacted how they were able (or not able) to have their voices recorded in the FIS.

Because the technological choice to employ the tablets valued efficiency for the researchers over participants’ comfort levels and preferences, the tablets became a barrier to the types of conversations and human connections that data collectors and participants could share, as explained by Mary:

By us talking to them it shows that we care. I think that might be helpful for the people we are serving as well. Versus, you know, oh come participate in the survey. But then we just go and push them to the tablet—you’re on your own type of thing. I don’t know. Just human interaction, I think is very important.

Mary’s comment highlights that the tablets were an affective problem for data collectors as well as participants. Mary expressed frustration that the tablet existed as a physical barrier to the types of “human interaction” that she herself was able to have with participants. Because she was obligated by the FIS protocol to ask participants to use the tablet, she was not able to talk directly to participants as much as she would have liked to show that, as data collectors, “we care.” Ironically, in the push to “save time” by using tablets, the FIS reduced the more valuable “affective time” that participants might share with data collectors, time that might have resulted in more textured data to emplace and embody the issue of food insecurity.

Environmental Knots: Shifting Lines Shape the Story

Betty: A lot of people just did not have the time. A lot of people were with rides and they had to go when the other person was going to go or they had kids or they had to go to the next food pantry before it closed because they needed to do all that while they were downtown or whatever. So, the time was the biggest problem that people would be willing to do it but did not have time.

The study protocol instructed data collectors to tell participants that the survey would take 20 to 30 minutes. However, in our experience, and as the other data collectors corroborated, it took most visitors well beyond 30 minutes to complete the survey, depending on factors such as familiarity with the technology, disruptions with the technology, and distractions at the site. For some visitors, it took over an hour to complete the survey. As Betty’s comments show, the visitors at each site came to it with other “trailing threads” of their lives that they had to rush off to, such as kids, carpools, or other obligations. Betty, a long-time community organizer with ample experience collecting such data, further explained that more visitors might have taken the survey had it been offered in other formats:

But I definitely think more people would have done it if it was written out, if they could even take it with them and mail it back. I know that’s tough to get. If they could have taken it another time or if they could have gone online either at home or at the library and done it that they may have been willing to do it. All those options. Because of the time.

The survey design lacked attention to the material lives of participants in a way that limited their participation. It also assumed that participants were willing to share their experiences via a machine rather than with a human being.

While the survey limited the types of affective experiences that could be shared, participants did alter the study to their own desires, bringing it into their own living meshworks. The FIS protocol assumed food emergency sites to be fairly static environments, in which visitors would be lined up waiting to enter the pantry or receive a hot meal. Based on that assumption, data collectors were trained to count visitors and to ask only a certain number of visitors to take the survey according to a sampling protocol determined by the number of visitors expected at that site. For example, at a site where over 100 visitors were expected, data collectors might be asked to sample every fourth visitor in line (we sketch this sampling rule below, after Betty’s comment). Data collectors were instructed not to allow visitors who were not sampled to take the survey. However, on the ground, data collectors reported that it became difficult to adhere to this strict sampling procedure. For example, Betty recounted visitors wanting to switch places with other visitors so that they would (or would not) be sampled to take the survey:

The frustrating thing was that more people would have done it than we could have by the constraints of the surveying sample. By only being able to ask the certain number of people and in the certain order that way, we couldn’t get some people [who] really would have liked to have done it and they couldn’t because they weren’t, we didn’t ask them, we couldn’t ask them. So, I would have loved to have modified that but really tried to stay away from that; maybe a few, maybe a time or two if they sort of insisted, well “I will switch places with her and she can do it,” you know.

Unsurprisingly, Betty’s comments highlight that these community spaces were not the static environments that the protocol assumed they would be because they were inhabited by real human beings with affective desires. Rather than adhere to the literal lines—bodies simply standing next to one another—that the FIS protocol expected, individuals brought their affective and embodied “lines” to bear on the study and adapted it according to their own human needs: when some visitors wanted to take the survey and others didn’t, visitors would “insist” that they switch places with one another. We can think of this adaptation as visitors incorporating the FIS into the living meshwork of the community site. Muir and McGrath (2018) explain the meshwork as “the process of living with others, at once entangled together, located in particular, embodied, material locations, and yet still not wholly defined by that location, due to the unique paths which each person has forged through the world” (p. 165). Although visitors and data collectors occupied the same material environment—a physical space that it might be tempting to assume that researchers could define in advance—each individual brought their own “unique paths” to that location. The FIS’s protocol assumed that participants would comply with its methods, but these visitors forged their own lines, literally by switching spots in a physical line, thus molding the FIS’s methods in a manner more attuned to their circumstances and preferences.
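To make the protocol’s rigidity concrete, here is a minimal sketch of the kind of fixed-interval sampling rule described above; the parameter values are our illustrative assumptions, not the FIS’s actual figures. The rule presupposes an orderly, stationary queue, which is exactly what these sites did not provide.

```python
import random

def systematic_sample(line, expected_count, target_surveys=25):
    """Select every k-th visitor from a queue, in the style of the FIS protocol.

    The skip interval k is fixed in advance from the site's expected
    attendance, so the rule cannot absorb visitors who swap places,
    leave early, or insist on participating out of turn.
    """
    k = max(1, expected_count // target_surveys)
    start = random.randrange(k)  # random start within the first interval
    return line[start::k]

# At a site expecting about 100 visitors, k = 4: every fourth visitor in line.
queue = [f"visitor_{i}" for i in range(1, 101)]
print(systematic_sample(queue, expected_count=100))
```

When visitors switched spots, the neat indexing above no longer corresponded to any real order of bodies, which is one way the living meshwork overran the protocol.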

Embodied Knots: Building Trust in the Quiet Moments

Sacha: I remember I think it was the first one [site visit] I went on where they [food site staff] held us back . . . they were just really confused about what we were doing, they were “why are you doing this, are you with the government?” . . . the person that we had initially contacted to come wasn’t there; the pastor wasn’t there, so it was the second pastor, and he was really suspicious so we missed out on some clients that way. Yeah, every once in a while we had the suspicion and, you know, “who is this and why are you here and what are you doing?” I remember basically kind of really protective of their clients and of their own procedures like “we’re doing this right.”

As Sacha describes, there was sometimes a level of distrust surrounding outsiders coming into a community asking to gather data. Data collectors had to first gain the trust of food site staff before even coming into contact with visitors. Sacha’s comment implies that some staff suspected that the FIS might be surveilling how they ran that site, leading to a defensive stance. This suspicion could be a result of the research fatigue we noted earlier, in which community sites participate in research studies without experiencing any direct benefits. Or such distrust could stem from the reality that many data collectors did not have long lifelines, or prior experiences, knotted in the community sites they visited. Amanda, a college student with limited experience in social service settings, described how food pantry staff had more success in getting visitors to take the survey because they already knew their stories:

There was this one that I went to in [location cannot be specified]. And there was a lot of people there that weren’t mentally able to take it. And so the lady [on staff] knew pretty much the gist of everybody so she actually went up to them and said “do you want to take this survey?” and she pretty much didn’t let us go and interact with them and she was “I’ll send them to you.” Because I guess she knew who would be able to take it and who would be able to understand what they were asking in the survey.

For data collectors without much experience in the community, it was difficult to become enmeshed immediately in the needs of a particular site and its people. The data collectors themselves were literally marked on their bodies as outsiders by the FIS T-shirts that they wore and the briefcase, filled with the tablet computers, that they rolled into each site. As MacKenzie noted, “before we got T-shirts that said we were with [FIS] I feel like people they would see us coming in with the briefcase and they would be kinda hesitant.”

In order to build trust with visitors, then, data collectors had to become part of the site’s meshwork. Becoming enmeshed within a site often meant forgetting about data collection, if only for a moment. For example, Luke described researchers and participants joining together in a very embodied, material task, when no data was being collected: “Yeah, they just needed some extra hands getting some food inside. It was a quiet moment for us, so we helped out, we helped them out a little bit just lifting things and getting them in the freezer.” Seemingly simple acts, like helping to stock a food pantry freezer, shift the focus to the needs of a community in a particular moment over the expedient collection of data. Sometimes, visitors seemed to need somebody to simply recognize their more immediate embodied experience rather than to treat them as a source of information. For example, Sacha describes how some visitors just needed a way to keep their children occupied while they completed the lengthy survey:

Especially in a place [where] we, where they had to wait, the kind of tactic we would use is like “oh you can do this while you wait, your kid can color.” We had little sheets of paper and crayons and stuff . . . a sucker or something for their kid if they wanted one.

In one of the interviews with data collectors, Jenny recounted her own experience relating to participants on a more human level:

We went way up to [location cannot be specified], which is way up [a different county]. And there was a woman with a little baby, like an 11-month old baby. She had agreed to take the survey, and she was struggling, and I was like “do you want me to hold her?” And so I held the baby the whole time while she was doing it. The other data collectors had told me later, they were younger, they said “oh yeah, I thought about doing something like that but I didn’t feel like I could” […] I have held many babies, and I understood that woman was not going to do the survey if I didn’t hold her baby.

The other data collectors’ hesitance to hold a participant’s baby—stemming from what they understood the FIS protocol to allow—shows how the data imperative can prevent researchers from becoming enmeshed in a living, embodied community and recognizing basic human needs. In a sense, the other data collectors felt that they couldn’t be a human being and a researcher at the same time and see that this woman needed something very simple—just someone to hold her baby.

Discussion: Examples of Enmeshed Research in Practice

If community research sites are enmeshed sites where the world comes into sharp focus, then the data collection methods followed in the FIS, and in the larger data imperative, fail to capture the intricacies of that world. Reciprocal and socially responsive community engagement is embodied, enmeshed engagement. It requires addressing participants’ most immediate material, emotional, and affective needs, as articulated by participants, just as much as, if not more than, collecting data.

There is no one way to conduct research in an enmeshed, embodied way. Such research has to be contingent and contextual within the meshworks of the community in which it is being conducted. We offer a heuristic (see Appendix B) that can help nonprofit and academic researchers assess how their research might affect the living meshwork of a particular local context and consider affective and embodied ways of being that traditional research methods might miss. This heuristic is not intended to be prescriptive but rather would emerge as its own meshwork within a wide variety of unique community research sites. In fact, we developed this heuristic from our own reflections on the meshwork of our past collaborative experiences, which allowed us to initiate a more enmeshed research practice in our own community-based settings. The examples that follow present some ways to collect data that prioritize affective relationships over pure data, as well as ways that academics and community members can collaborate on human-centered research methods.

Example 1: Photovoice projects that account for how participants want to share their stories

When working with food insecure seniors to better understand their approach to everyday health, Kathryn initially employed a photovoice method, in which she asked the senior research participants to take photos of what health means to them before discussing those images in oral interviews. She decided on this method in consultation with members of the senior center where the study was located and from her own observations, which indicated that seniors often preferred more visual forms of communication. She also based this methodological decision on scholarship that shows photovoice can capture material barriers to health not often made visible through oral interviews (e.g., Genoe & Dupuis, 2013; Neill, Leipert, Garcia, & Kloseck, 2011; Novek & Menec, 2014). However, when only one participant brought pre-prepared images to her interview, Kathryn thought that the study had failed because it did not produce the type of data anticipated by its method. Yet further analysis showed that participants frequently referred to visuals around them during the interviews, such as the food trays they were eating from or their own artwork on the walls behind them. By referring to such visuals in the context where they were important to them and as they used them, participants threaded the photovoice method within the meshworks of their lives. Although this participant-driven adaptation did not generate the types of data anticipated, it offered a richer sense of how participants wanted to connect with Kathryn to share their experiences. That deeper connection between researcher and participants led them to collaboratively create several resources important to their enmeshed lives, such as a community cookbook (Swacha, 2018).

Example 2: Research that seeks to strengthen relationships among employees through non-representational ethnographic interviews

Emerging from a long-term partnership, Jenny is engaged in a research project with a growing food bank that attempts both to capture the rich, interwoven nature of nonprofit work and to strengthen relationships among staff members. Rather than use traditional ethnographic or qualitative approaches, Jenny is employing non-representational ethnographic methods (Dowling, Lloyd, & Suchet-Pearson, 2017; Vannini, 2015) that seek to capture the affective comportment and feeling of the research site rather than an objective depiction of the work happening there. An important part of both this approach and the goal of the project is to strengthen communication patterns and relationships among employees. As the food bank has grown, it has diversified its programs and moved its location; whereas the food bank was once located in an industrial park and was purely a food distribution site, it has now moved to a low-income neighborhood where it provides direct service to clients through a food pantry, educational programs, and referrals to community resources. The additional staff, new location, and shift in mission have caused some friction. Through her non-representational approach, Jenny seeks to understand the feelings that have surfaced from this shift as well as the affective comportments emerging from a new and unintended hierarchical building structure. Thus, her interviews are a form of data collection but also a sort of “motivational interviewing” (Miller & Rollnick, 2012), in which a researcher attempts to encourage and support participants to reflect on their feelings and responses to situations in order to produce change. This meshworks approach does not view the research site with pure data collection or solving a community problem in mind as much as it acknowledges the messy, emotional, and embodied work of community nonprofits and sees data collection as an opportunity for reflection on that work.

Example 3: Anonymous writing projects that allow participants to share what they feel

Another approach to the emergency food outlet meshwork is the notion of the “secret box” (Lyndon, 2018; Punch, 2002). Jenny and her students are piloting a “secret box” at a local food pantry to provide visitors with an opportunity to share their feelings and situations through writing. The secret box method attempts to provide participants with an anonymous way to write/draw/depict the meshwork of their lives. A locked box with paper/pencil is placed in a central location with the invitation to share experiences, feelings, ideas, or suggestions. In our initial pilot, we asked visitors to reply to the question, “What’s your biggest struggle today?” This method addresses multiple issues: it allows visitors to share experiences, feelings, or problems that might not be quantified via a survey or questionnaire; it allows the food pantry to better understand the complexities of poverty that cannot be addressed by the mere distribution of food; and it uses writing/sharing as an outlet that allows visitors to feel respected and valued for what they are experiencing. We are already seeing some important results that can help us connect and collaborate with visitors to address their affective and emotional experiences.

Future Trajectories

This article has presented a model for community-engaged research that approaches communities as living meshworks of embodied human beings, material circumstances, and affective environments. As such, the lifelines of researchers and participants intersect within those meshworks, affecting the ways we collect and understand data. In order to combat the data imperative that has permeated our world, we suggest approaches to data collection that highlight the “knottedness” of human beings. Thus, if we want to capture the texture of those knots, we must be open to approaches that make visible the affective and material complexities of human experience. Doing so may mean that we don’t end up with measurable, quantifiable data. It may mean that our data collection methods result in relationships rather than generalizable theories. We have not presented a formal method for acknowledging the texture of the meshwork because such acknowledgment can take many forms, depending on local environments. Thus, we prompt readers to consider how we, as both academic researchers and those working with communities, can collaboratively develop approaches that attend to the human dignity of research participants and see their lives not as sites to be mined for information but as rich, vibrant intersections that provide openings for moments of connection and social justice.

References

  • Baum, F., MacDougall, C., & Smith, D. (2006). Participatory action research. Journal of Epidemiology & Community Health, 60(10), 854–857.
  • Bay, J. (2019). Research justice as reciprocity: Homegrown research methodologies. Community Literacy Journal, 14(1), 7-25.
  • Becker, K. L. (2015). Conducting community health needs assessments in rural communities: Lessons learned. Health Promotion Practice, 16(1), 15–19.
  • Benjamin, L. M., Voida, A., & Bopp, C. (2018). Policy fields, data systems, and the performance of nonprofit human service organizations. Human Service Organizations: Management, Leadership & Governance, 42(2), 185–204.
  • Bennett, J. (2010). Vibrant matter: A political ecology of things. Durham, NC: Duke University Press.
  • Bopp, C., Harmon, E., & Voida, A. (2017, May). Disempowered by data: Nonprofits, social enterprises, and the consequences of data-driven work. In Proceedings of the 2017 CHI conference on human factors in computing systems (pp. 3608–3619). New York: ACM.
  • Chevalier, J. M., & Buckles, D. J. (2013). Participatory action research: Theory and methods for engaged inquiry. Abingdon, UK: Routledge.
  • Clark, T. (2008). “We’re over-researched here!”: Exploring accounts of research fatigue within qualitative research engagements. Sociology, 42(5), 953–970. doi:10.1177/0038038508094573
  • Coole, D., & Frost, S. (Eds.). (2010). New materialisms: Ontology, agency, and politics. Durham, NC: Duke University Press.
  • Dey, I. (1993). Qualitative data analysis: A user-friendly guide for social scientists. London: Routledge.
  • Dowling, R., Lloyd, K., & Suchet-Pearson, S. (2017). Qualitative methods II: “More-than-human” methodologies and/in praxis. Progress in Human Geography, 41(6), 823–831.
  • Driscoll, A., & Sandmann, L. R. (2016). From maverick to mainstream: The scholarship of engagement. Journal of Higher Education Outreach and Engagement, 20(1), 83–94.
  • Economic Policy Institute. (n.d.). Missing workers. Retrieved from https://www.epi.org/publication/missing-workers/
  • Erete, S., Ryou, E., Smith, G., Fassett, K. M., & Duda, S. (2016, February). Storytelling with data: Examining the use of data by non-profit organizations. In Proceedings of the 19th ACM conference on computer-supported cooperative work & social computing (pp. 1273–1283). New York: ACM.
  • Eubanks, V. (2017). Automating inequality: How high-tech tools profile, police, and punish the poor. New York, NY: St. Martin’s Press.
  • Fear, F. A., & Sandmann, L. R. (2016). The “new” scholarship: Implications for engagement and extension. Journal of Higher Education Outreach and Engagement, 20(1), 101–112.
  • Fischer, K. R., Schwimmer, H., Purtle, J., Roman, D., Cosgrove, S., Current, J. J., & Greene, M. B. (2018). A content analysis of hospitals’ community health needs assessments in the most violent U.S. cities. Journal of Community Health, 43(2), 259–262.
  • Franz, N. K. (2014). Measuring and articulating the value of community engagement: Lessons learned from 100 years of Cooperative Extension work. Journal of Higher Education Outreach and Engagement, 18(2), 5–16.
  • Genoe, R., & Dupuis, S. (2013). Picturing leisure: Using photovoice to understand the experience of leisure and dementia. The Qualitative Report, 18(11), 1–21.
  • Gregg, M., & Seigworth, G. J. (Eds). (2010). The affect theory reader. Durham, NC: Duke University Press.
  • Hacker, K. (2013). Community-based participatory research. Thousand Oaks, CA: Sage.
  • Hammond, J. D., Hicks, M., Kalman, R., & Miller, J. (2005). PAR for the course: A congruent pedagogical approach for a PAR methods class. Michigan Journal of Community Service Learning, 12(1), 52–66.
  • Ingold, T. (2007). Lines: A brief history. Abingdon, UK: Routledge.
  • Ingold, T. (2011). Being alive: Essays on movement, knowledge and description. Abingdon, UK: Routledge.
  • Ingold, T. (2015). The life of lines. Abingdon, UK: Routledge.
  • Kemmis, S., & McTaggart, R. (2005). Participatory action research: Communicative action and the public sphere. In N. K. Denzin & Y. S. Lincoln (Eds.), The Sage handbook of qualitative research (pp. 559–603). Thousand Oaks, CA: Sage.
  • Lyndon, S. (2018). Analyzing focus groups about poverty in the early years using a narrative approach. SAGE Research Methods Cases. doi:10.4135/9781526445322
  • Marshall, C., & Rossman, G. B. (2006). Designing qualitative research (4th ed.). Thousand Oaks, CA: Sage.
  • Miller, W. R., & Rollnick, S. (2012). Motivational interviewing: Helping people change (3rd ed.). New York: Guilford Press.
  • Minkler, M., & Baden, A. C. (2008). Impacts of CBPR on academic researchers, research quality and methodology, and power relations. In M. Minkler & N. Wallerstein (Eds.), Community-based participatory research for health: From process to outcomes (2nd ed., pp. 243–262). San Francisco, CA: Jossey-Bass.
  • Minkler, M., & Wallerstein, N. (Eds.). (2008). Community-based participatory research for health: From process to outcomes. San Francisco, CA: Jossey-Bass.
  • Muir, J., & McGrath, L. (2018). Life lines: Loss, loneliness and expanding meshworks with an urban Walk and Talk group. Health & Place, 53, 164–172.
  • Neill, C., Leipert, B. D., Garcia, A. C., & Kloseck, M. (2011). Using photovoice methodology to investigate facilitators and barriers to food acquisition and preparation by rural older women. Journal of Nutrition in Gerontology and Geriatrics, 30(3), 225–247.
  • Novek, S., & Menec, V. H. (2014). Older adults’ perceptions of age-friendly communities in Canada: A photovoice study. Ageing and Society, 34(6), 1052–1072.
  • Owens, K. H. (2009). Confronting rhetorical disability: A critical analysis of women’s birth plans. Written Communication, 26(3), 247–272.
  • Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.). Thousand Oaks, CA: Sage.
  • Pinto, R. M., Wall, M. M., & Spector, A. Y. (2014). Modeling the structure of partnership between researchers and front-line service providers: Strengthening collaborative public health research. Journal of Mixed Methods Research, 8(1), 83–106.
  • Punch, S. (2002). Interviewing strategies with young people: The “secret box,” stimulus material and task‐based activities. Children & Society, 16(1), 45–56.
  • Strauss, A. L., & Corbin, J. (1990). Basics of qualitative research: Grounded theory procedures and techniques. Newbury Park, CA: Sage.
  • Swacha, K. Y. (2018). “Bridging the gap between food pantries and the kitchen table”: Teaching embodied literacy in the technical communication classroom. Technical Communication Quarterly, 27(3), 261–282.
  • Taylor, S. J., & Bogdan, R. (1998). Introduction to qualitative research methods: A guidebook and resource (3rd ed.). New York: John Wiley & Sons.
  • Teddlie, C., & Tashakkori, A. (2009). Foundations of mixed methods research: Integrating quantitative and qualitative approaches in the social and behavioral sciences. Thousand Oaks, CA: Sage.
  • Vannini, P. (2015). Non-representational ethnography: New ways of animating lifeworlds. Cultural Geographies, 22(2), 317–327.

Notes

    1. We use the terms affective, embodied, and material purposefully here. We draw from Owens (2009), who describes embodied knowledge as “an intuitive, tacit understanding of what one’s body is doing and can or must do” (p. 269). Affect refers to the “visceral forces beneath, alongside, or generally other than conscious knowing that can serve to drive us toward movement, thought and ever-changing forms of relation” (Gregg & Seigworth, 2010). Differing from emotion, affect is often a bodily experience outside of what can be understood and named; thus, affect refers to pre-personal intensities, as opposed to feelings, which are affect understood through labels and personal histories, and emotion, which is the social sharing of feelings. Our understanding and orientation toward this constellation of terms is informed by new materialist theory, an extensive body of research that grants new vitalities to our conceptions of the material world (Bennett, 2010; Coole & Frost, 2010).

    2. Our study was approved by Purdue University’s IRB, Protocol #1309013950.

    3. Food insecurity refers to not knowing where one’s next meal will come from or whether one has enough food to feed one’s family.

    4. Emergency food outlets include food pantries, where visitors obtain grocery staples, and congregate meal sites, where visitors receive a hot meal.

    5. In an effort to provide a sense of dignity and respect to these community members, we refer to them as visitors rather than clients, the language used by many nonprofits. Clients implies that people receive services on a sustained basis, which contradicts much of the research demonstrating that visits to food pantries are sporadic, need-based, and changeable over time. The term visitor is our small attempt to ensure the human dignity of food-insecure individuals.

    6. The exact number of questions varied based on participant responses to questions concerning age, gender, work status, household size, and income.

    Appendix A: Data Collector Interview Questions

    1. How long did you work as a Hunger Study data collector? Approximately how many site visits did you attend?
    2. Is this the first time you’ve ever facilitated a survey? Do you have previous experience in a community-based context?
    3. Please briefly describe how you were trained as a data collector. How long was your training? What types of information and training materials did you receive?
    4. How well did the training prepare you for the site visits?
    5. Did you ever have to modify the procedures outlined during your training when you were on a visit? If so, when and how?
    6. Did you ever have difficulty getting clients to agree to take the survey?
    7. If you had difficulty getting clients to take the survey, did you use any techniques to gain client cooperation? If so, what were these techniques and were they effective? Why or why not?
    8. Did the type of space and/or program influence the procedures you used to gain client cooperation?
    9. Did the other data collectors on your team influence how you carried out your work? If so, how?
    10. How often did you work with pantry staff to help you carry out data collection? In what ways did pantry staff help you and/or hold you back from effective data collection?
    11. Describe your experiences with facilitating a survey administered on a tablet computer.

    Appendix B: MAEE Heuristic for Researchers

    M—Meshwork

    The following questions can help researchers in community settings to engage with their research site as a living meshwork, where intersecting lifelines might affect the experiences of research participants, the types of data that emerge, and the larger impact on the community:

    • Who are the multiple stakeholders in this project?
    • How are these stakeholders intertwined in various ways? Are there any subtle ways in which stakeholders are intertwined that will affect the research?
    • Who benefits the most from the collection of this data? In what ways will this data be used, and how will that use affect the meshworks of this community?
    • Which lifelines are we pulling on when we conduct this research?
    • What trailing lifelines do we need to consider when inviting participants to take part?

    A—Affect

    Researchers can use these questions to think about the ways in which participants and researchers might interact affectively with a given project and what the ethical stakes of such affective interactions may be:

    • Which emotional lifelines are we pulling on by asking participants to engage in this research?
    • How are we as researchers affectively involved and implicated in this project?
    • What kinds of affective responses do we need to notice in participants, responses that may not be captured in traditional research methods?
    • What methods could best capture those affective and emotional lifelines in ethical ways?
    • In what ways can researchers and community members connect interpersonally throughout this project? How does the research design encourage or discourage such connection?

    EE—Embodied and Environmental

    These questions urge researchers to consider how their research affects, and is affected by, participants’ physical, environmental, and material contexts and constraints:

    • What are the ways in which participants’ and researchers’ bodies will be implicated in this research project?
    • How will the environment in which the research takes place affect how the research unfolds and how it affects community members and researchers?
    • How can this research (and the research process itself) make a positive impact on the embodied, environmental contexts of participants? Are there any ways in which the research might make a negative impact? If so, how might that negative impact be mitigated?

    Authors

    JENNIFER L. BAY is an Associate Professor of English and Director of Professional Writing at Purdue University, where she teaches courses in rhetorical theory, professional writing, feminist rhetorics, and community engagement. Her work has appeared in journals such as Technical Communication Quarterly, College English, and Community Literacy Journal as well as in edited collections. 

    KATHRYN YANKURA SWACHA is an Assistant Professor of English at the University of Maine. She works alongside community partners to study how people make health-based decisions in their daily lives. Her research has appeared in journals such as Partnerships, Technical Communication Quarterly, TESOL Journal, and Rhetoric Review.