Effects of Prepaid Postage Stamps and Postcard Incentives in a Web Survey Experiment

Even small monetary incentives, e.g., a one-dollar bill in a postal invitation letter, can increase the response rate in a web survey. However, in the euro currency area, the smallest banknote that can be enclosed in a postal invitation is a five-euro bill, which is costly. We therefore conducted a randomized experiment with prepaid stamp and postcard incentives as affordable alternatives. We compare our experimental groups with a control group in terms of response rates, response rates in a subsequent wave, data linkage consent, and data collection costs. Compared with the control group, the postcard incentive has no effect on our outcomes except overall costs. A prepaid stamp incentive increases the overall response rate, but with different effect sizes across subgroups. We find no effect of stamp incentives on response rates in a subsequent wave or on data linkage consent.


Introduction
Empirical findings consistently reveal that incentives increase response rates. Research provides solid empirical evidence that prepaid incentives are more effective than incentives that are conditional on completing a survey while not necessarily being more cost efficient. Furthermore, various experiments highlight the fact that prepaid monetary incentives work better than conditional monetary, gift, or lottery incentives (e.g., Dykema et al. 2015; Edwards et al. 2009; Mercer et al. 2015; Singer and Ye 2013). However, higher incentives do not increase response rates linearly but with a declining rate (Mercer et al. 2015; Singer and Ye 2013). Nevertheless, even the smallest amount, for instance, a one-dollar bill, can yield a substantial increase in the response rate compared with no incentives or a gift incentive (e.g., Sun et al. 2020).
For web surveys in which respondents are invited by postal mail, using the smallest bank note amount for a prepaid monetary incentive to increase response rates may be a cost-effective solution for many survey projects with small budgets. In the euro currency area, however, the smallest amount of monetary incentive feasible for a postal invitation is a five-euro bill. Choosing a smaller amount of money would require sending coins, which would be inconvenient. As a five-euro prepaid incentive probably exceeds the budget of many studies, we strive to find more affordable alternatives that can be enclosed in a standard invitation letter.
This article reports on an experimental study from Germany that invites individuals to respond to a web survey using two small prepaid noncash incentive alternatives and one control group with no incentive. While the first alternative included a postage stamp set consisting of 10 different stamps with a total value of €1, the second alternative enclosed a blank seasonal postcard specially designed for the survey. We randomly assigned our sample (N = 7,500) to either one of the two incentive groups or the control group. We investigate the effect of our incentives on (1) response rates; (2) response rates in a subsequent wave; (3) data linkage consent; and (4) data collection cost.
The article is structured as follows: We briefly review the literature on the effectiveness of gift incentives, especially stamp incentives, and develop our hypotheses. Subsequently, we explain our study design and present the results. The last section summarizes and discusses the results.

Effects of Nonmonetary Prepaid Incentives
In general, two theories explain how prepaid incentives work: social exchange and utility theory. Social exchange theory refers to reciprocity as a norm underlying social exchange. Sociologists argue that a "gift" creates an obligation to reciprocate with a gift (Gouldner 1960; Mauss 1990 [1954]). Providing prepaid incentives to potential respondents thus creates an obligation to give something in return, and respondents fulfill this obligation by participating in the survey.
However, not every kind of prepaid incentive leads to survey participation. Utility theories argue that the benefit of a prepaid incentive must outweigh the cost of survey participation to be effective. Groves et al. (2000) postulate in their discussion of leverage-salience theory that the decision of whether to participate in a survey depends on the subjective weight given to various factors for and against participation. Therefore, an incentive creates a positive leverage that may be salient to a person.
Based on utility theory, different kinds of incentives provide different amounts of utility. As money can be spent freely on goods and services, monetary incentives usually yield the greatest utility for an individual. Nonmonetary incentives, such as vouchers or gifts, offer fewer possibilities for use, which may explain why monetary incentives have a larger effect than nonmonetary ones (Becker et al. 2019; Lipps et al. 2019).
Searching for an affordable alternative to monetary incentives that can be included in an invitation letter, Gendall and Healey (2008) list the following criteria: practical, small, durable, flat, and likely to appeal to individuals. Accordingly, researchers have experimented with cash-like incentives or small gifts. For instance, luggage tags enclosed in the invitation letter to a web survey increased response rates significantly, by approximately 7.5 percentage points (Cobanoglu and Cobanoglu 2018). Incentive experiments in New Zealand with tea bags, chocolate, postage stamps, and charity donations yielded less clear-cut results: adding stamps or chocolate to the invitation letter of a mail survey significantly increased response rates only in some surveys, while tea bags did not affect response rates (Fairweather 2010; Gendall and Healey 2008).
Compared with monetary incentives, the exact utility of a particular gift incentive varies more strongly across consumers. On average, gift incentives have a lower utility than a monetary incentive of equal value. Assuming that the perceived utility is the leverage for individuals to respond to a survey, the effect of the gift incentive depends on the survey operators' ability to choose an affordable gift with a common and wide-ranging utility to optimize the cost-benefit of using an incentive.
For our web survey, we use a set of postage stamps or a plain postcard showing a winter theme (see Figure 1, appendix) as affordable alternatives to monetary incentives and test their effect on survey outcomes compared with a control group.

Developing Hypotheses: Effect of Stamps
While a considerable number of studies have evaluated the effect of low-cost gift incentives on survey outcomes (e.g., Becker et al. 2019; Cobanoglu and Cobanoglu 2018), only a few studies have evaluated the effect of using stamps as a monetary incentive (Gendall and Healey 2008; Harkness et al. 1998; McConaghy and Beerten 2003; Wetzels et al. 2008). Furthermore, none of these studies evaluated survey outcomes for a web survey with a postal invitation.
Below, we review the findings of previous studies evaluating the usage of stamp incentives and formulate hypotheses on how using stamps may affect survey outcomes and costs.

Response Rates
Most studies using stamps as an incentive find a moderate positive effect on response rates compared with no incentive, i.e., 3-6 percentage points (Gendall and Healey 2008; Harkness et al. 1998; McConaghy and Beerten 2003; Wetzels et al. 2008). Based on these findings, we hypothesize that the stamp incentive increases the response rate.
The effect of stamps on response rates may correlate with respondent characteristics. Findings indicate that gender matters with regard to incentives: Women are more likely than men to react positively to small prepaid incentives and to prefer nonmonetary conditional incentives (Boulianne 2013; Sittenthaler and Mohnen 2020).
Studies using stamp incentives find mixed results regarding response rates within subgroups. For instance, Gendall and Healey (2008) find that stamps increase the response rate in older age groups, while Wetzels et al. (2008) find a decrease in the response rate for the 45-54-year-old age group compared with other age groups. Harkness et al. (1998) find an increase in response rates with age only among women. In a survey conducted in the Netherlands, the effect of stamp incentives on the response rate increased with household size and income and was stronger for respondents of Dutch nationality (Wetzels et al. 2008), while another study reported a larger increase in regions with a low initial response rate than in regions with an already high one (McConaghy and Beerten 2003). Given these mixed results from four surveys with different populations, we refrain from formulating specific hypotheses. However, the empirical findings suggest that incentives affect response rates in subgroups differently. Therefore, we conduct an exploratory analysis.

Effect on Responses in Subsequent Waves
To our knowledge, no study has evaluated the effect of a one-time stamp incentive in a first wave on response rates in a subsequent wave. Studies that used an incentive in the first wave but not in subsequent waves found either a positive effect or no effect on subsequent-wave response rates and cumulative response rates (e.g., Göritz 2008; Göritz and Wolff 2007; Scherpenzeel et al. 2002). Göritz and Wolff (2007) note that incentives may affect response in a subsequent wave directly or indirectly. A direct effect would increase the response rate in the subsequent wave. An indirect effect would increase the number of cases in the subsequent wave by increasing the response rate in the first wave: because more individuals respond in the first wave, the cumulative response rate, i.e., the share of respondents who responded to both waves, increases.
Using a stamp incentive in the first wave, we hypothesize the response rate to increase in the subsequent wave (direct effect) or to increase the cumulative response rate (indirect effect) compared with our control group with no incentive.

Effect on Data Linkage Consent
Given the respondents' consent, our survey data can be combined with administrative data to increase its analytic potential. We are not aware of any studies evaluating the effect of incentives on data linkage consent. Against the background of social exchange theory, gift incentives may create an obligation to reciprocate by consenting to data linkage. Therefore, we expect that the stamp group would yield a higher linkage consent rate than the control group.

Data Collection Cost
Managing the trade-off between data quality and data collection costs when conducting a survey is one of the main tasks of survey operators. We report the data collection cost for each group as important information for researchers when planning a survey.
Well-chosen incentives may reduce data collection costs if they achieve a higher response rate without many reminders (Brennan et al. 1993) or increase the overall response rate in such a way that the cost per respondent decreases. For instance, Scherpenzeel and Toepoel (2012) report that a €10 prepaid incentive reduced the cost per registered household from €72.84 to €70.20 compared with no incentive. However, incentives may only reduce the cost per respondent if the initial data collection cost per respondent is already high while the incentive is comparably small (Saunders et al. 2006), e.g., in a face-to-face survey with a €10 prepaid incentive. As we conduct a web survey with a postal invitation, the cost per respondent derives mainly from postage charges (€0.53 per invitation). Therefore, we do not expect our incentives to reduce costs.

Effect of Postcards
We are not aware of any study that uses postcards as incentives. Following social exchange theory, even though the utility of receiving a postcard incentive may be very low, we may see some positive effects based on the principle of reciprocity. As such, we expect positive effects on response rates, response rates in subsequent waves, and the propensity to consent to data linkage. Since the perceived value of a postcard should be smaller than that of stamps, we expect the effects of postcards on our outcomes to be weaker than those of stamps. For the effects on response rates within subgroups, response bias, and cost, we use an exploratory approach.

Data
Our experiment was implemented in a web survey assessing public fairness perceptions using factorial survey experiments. The factorial survey consisted of several vignettes, i.e., short scenarios with randomly varied key characteristics (e.g., Auspurg and Hinz 2015). The sample for the survey was drawn from a 2% sample of the "Integrated Employment Biographies" (IEB V13.01.00-181010, Nuremberg 2018) of the Institute for Employment Research (IAB). The IEB covers all registered spells of employment subject to social security contributions as well as unemployment benefit receipt, job search, and participation in labor market programs (see Antoni et al. 2019). The sample was restricted to German citizens who were at least 18 years old at the time of data collection (for more details, see Osiander et al. 2020). The first wave of the survey was fielded between November 2019 and January 2020, and the second wave was fielded during May 2020. The data used in this article are available at the IAB; up-to-date access information can be found at https://iab.de/en/topics/research-data-and-methods/.
For our study, we invited 7,500 individuals with a postal invitation letter. Using Stata, we randomly assigned cases to three experimental conditions (postage stamps, postcard, no incentive) with 2,500 cases each after conducting a power analysis (Cohen's d = 0.07, β = 0.8, α = 0.05). Respondents were not aware of the experimental setting.
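For readers planning a similar design, the required group size can be approximated with the usual normal-approximation formula for a two-group comparison with a standardized effect size. The sketch below uses only the Python standard library; since the text does not state whether the power analysis was one- or two-sided, both variants are shown (the one-sided figure comes close to the 2,500 cases per group used here).

```python
from statistics import NormalDist


def n_per_group(effect_size, alpha=0.05, power=0.8, two_sided=True):
    """Normal-approximation sample size per group for comparing two
    groups with a standardized effect size (e.g., Cohen's d)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2) if two_sided else z.inv_cdf(1 - alpha)
    z_beta = z.inv_cdf(power)
    return 2 * (z_alpha + z_beta) ** 2 / effect_size ** 2


# Parameters reported in the text: d = 0.07, power = 0.8, alpha = 0.05
print(round(n_per_group(0.07)))                    # two-sided variant
print(round(n_per_group(0.07, two_sided=False)))   # one-sided variant
```

The same function also reproduces the in-text figures for other effect sizes, e.g., the roughly 550 cases per subgroup quoted for d = 0.14.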
The postage stamp incentive consisted of a set of 10 different stamps with a total value of €1 (see Figure 1, appendix, left panel). A few months before the survey was fielded, Deutsche Post AG increased the postage fee for a standard letter from €0.70 to €0.80. Respondents could therefore use the low-value stamps of the set to supplement stamps bought before this price increase.
The postcard incentive was a postcard showing a winter theme (see Figure 1, appendix, right panel). Since the survey was fielded in the pre-Christmas period (November-December), respondents could use the card as a greeting during the Christmas or winter holiday season. The printing cost for the 2,500 postcards was €217.76-i.e., €0.09 per invited individual.
We prepared two different versions of the questionnaire with vignettes on fairness perceptions regarding two different social policy programs in Germany. All 7,500 invited individuals were randomly assigned to one of the two versions. The "unemployment benefits version" dealt with the question of what maximum duration of unemployment benefit receipt the respondents considered to be fair under different circumstances. The "training subsidies version" focused on the perceived fairness of financial support from the unemployment insurance system for training measures within firms. In addition to the short hypothetical scenarios, we collected sociodemographic information, political party preferences, and social policy relevant attitudes.
Invitation letters for both versions were identical; they informed individuals that researchers from the IAB and the University of Bamberg were investigating fairness perceptions regarding social and labor market policies. Since the recruitment strategy was identical, our analysis does not differentiate between topics. The invitation letter contained a short link, a QR code, and a password to access the web survey. The letters for respondents who received an unconditional incentive included the additional sentence "To thank you for your support, we enclose stamps/a card for your personal mail." Respondents who completed the unemployment benefits version (n = 285) were invited in May 2020, during the COVID-19 pandemic, to a subsequent wave and asked to repeat the survey. In the subsequent wave, we offered no incentives.

Analysis Plan
We analyze the effect of incentives on four dimensions: response rates, responding to a subsequent wave, consent to data linkage and data collection cost.
To compare differences between our experimental groups in terms of response rates, response rates to a subsequent wave, and consent rate to data linkage, we use tests of proportion. We report differences both in the access rate (AR) (i.e., every individual who at least accessed the web survey), and the response rate according to the AAPOR RR1 definition (see AAPOR 2016) (i.e., respondents who finished the survey). To assess whether incentives positively influence response rates in a subsequent wave, we compare the response rate (AAPOR RR1) in the second wave and the cumulative response rate.
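A test of proportions of the kind used throughout the analysis can be sketched with the standard library. The counts below are illustrative: the respondent totals are taken from the cost section of this article, and the denominators are assumed to be the 2,500 invitations per group, whereas the paper's reported rates rest on AAPOR-adjusted denominators and therefore differ slightly.

```python
from math import sqrt
from statistics import NormalDist


def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided z-test for the difference of two proportions,
    using the pooled estimate for the standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value


# Illustrative: stamp group (242 respondents) vs. control (146),
# assuming 2,500 invitations in each group
z, p = two_proportion_ztest(242, 2500, 146, 2500)
print(f"z = {z:.2f}, p = {p:.2g}")
```

With these illustrative counts, the stamp-control difference is clearly significant, mirroring the pattern reported in the results section.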
Furthermore, we compare the response rate (AAPOR RR1) between subgroups. For this purpose, we use a subset of the variables used in Osiander et al. (2020): gender, region (East/West Germany), age, education level, complexity of activities in the last job. All variables are collected by the employer at the start of an individual's employment and reported to the Federal Employment Agency. Due to an insufficient number of cases, we did not conduct a similar analysis for response rates in a subsequent wave and data linkage consent rate.
For data collection cost, we compare four measures for each experimental group: (1) cost per invitation; (2) cost per survey access; (3) cost per respondent; and (4) cost per additional respondent recruited using an incentive. Except for the cost per additional respondent, we report for each measure the price increase of the incentive groups relative to the control group in percent. The cost per invitation sums postage costs and the cost of the incentive; we do not include packaging costs in the calculation. For the cost per survey access, we divide the overall cost (the number of invited cases multiplied by the cost per invitation) by the number of individuals who at least accessed the web survey; it indicates how much money we had to spend to recruit one case. Similarly, the cost per respondent (the quotient of overall costs and the number of respondents) indicates how much money was required to recruit one complete case.
The cost per additional respondent reveals how much we have to pay to recruit one additional case compared with the control group. For instance, if the control group recruits 100 respondents and the incentive group recruits 150 respondents, the incentive helps recruit 50 additional cases. However, by using a prepaid incentive, we pay not only for the 50 additional cases but also for the 100 respondents we would have recruited without using any incentive plus all nonrespondents. Against this background, using a prepaid incentive to recruit more respondents drives data collection costs upward.
cost_additional = (cost per invitation_Incentive − cost per invitation_Control) / ((N_Respondents,Incentive / N_Invited,Incentive) − (N_Respondents,Control / N_Invited,Control))
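The cost per additional respondent defined above can be computed directly. The sketch below uses the invitation costs and respondent counts reported in the cost section, with the 2,500 invitations per group as denominators; the paper's reported €25.64 differs slightly, presumably because it rests on adjusted denominators.

```python
def cost_per_additional_respondent(cost_inv_incentive, cost_inv_control,
                                   n_resp_incentive, n_invited_incentive,
                                   n_resp_control, n_invited_control):
    """Extra invitation cost divided by the gain in the response rate:
    the price of each respondent recruited because of the incentive."""
    rate_gain = (n_resp_incentive / n_invited_incentive
                 - n_resp_control / n_invited_control)
    return (cost_inv_incentive - cost_inv_control) / rate_gain


# Stamp group: EUR 1.53 per invitation vs. EUR 0.53 in the control group,
# 242 vs. 146 respondents out of 2,500 invitations each
print(round(cost_per_additional_respondent(1.53, 0.53,
                                           242, 2500, 146, 2500), 2))
```

The formula makes explicit why incentives drive this measure upward: the extra per-invitation cost is paid for every invited case, respondent or not, while only the response-rate gain appears in the denominator.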

Response Rates
Compared with the control group, the stamp group yields a statistically significantly higher access rate and response rate (see Table 1). We thus find support for our hypothesis that prepaid stamp incentives increase the response rate. The postcard incentive also leads to a slightly higher access rate and AAPOR RR1 response rate, but the differences are not statistically significant (p > 0.05). We therefore find no support for our hypothesis that a prepaid postcard incentive increases the response rate.

To detect effect sizes within subgroups equal to the main effect between the stamp and control groups (Cohen's d = 0.14, β = 0.8, α = 0.05), each subgroup must include at least 550 cases in the invited sample. Below this number of cases, effect sizes need to be larger than 0.14 to reach statistical significance.

Table 2 displays response rates by subgroup. The effect of the stamp incentive is positive for all subgroups, although not all effects are statistically significant. Stamp incentives significantly increased the response rates of men (6.2% vs. 11.5%, p < 0.001) and women (5.5% vs. 7.8%, p = 0.023). Additionally, we find an increase in response rates of 4.1 percentage points (p.p.) in West Germany (p < 0.001), while the increase in East Germany is not statistically significant (2.5 p.p., p = 0.249).
Response rates in the stamp group are around four p.p. higher across all age groups except the youngest (18-30 years).
For educational subgroups, we find no effect of the stamp incentive among individuals with a matriculation-level qualification or a university degree. For individuals with no degree, we find a strong increase of 6.4 p.p. (p = 0.014); in the control group, not a single individual without a degree participated. For individuals with occupational training as the highest educational level, we find a positive effect of 4.3 p.p. (p < 0.001). Regarding the complexity of activities in the last job, we find no effect among helper activities (0.3 p.p., p = 1), while stamp incentives moderately increase response rates for people with professional activities (4.0 p.p., p < 0.001) and strongly for people with complex activities (7.6 p.p., p = 0.001).
For the postcard incentive, we find no effects on response rates that are statistically significant at the 0.05 level.

Responding to a Subsequent Wave
In contrast to wave 1, we offered no incentive in the subsequent wave. Table 3 shows response rates in the subsequent wave by experimental group. As nearly all respondents who accessed the subsequent survey also completed it, we only consider completed interviews. The control group yields the highest response rate at 51.3%, while the response rate in the stamp group is 29.5%; this difference is statistically significant (p = 0.003). We observe a similar pattern for the postcard group, whose response rate in the subsequent wave is 36.4%. This difference, however, is not statistically significant (p = 0.089), which stems from the small number of cases available for analysis (136 cases per group would be needed to detect Cohen's d = 0.3 at β = 0.8 and α = 0.05). Our results do not support our hypothesis that gift incentives in the first wave increase response rates in a subsequent wave. On the contrary, response rates in both incentive groups are smaller in the subsequent wave than in the control group. Nevertheless, the share of invited individuals who participated in both waves is similar across groups (1.1% for the postcard group and 1.6% each for the control and stamp groups). That is, respondents who participate because of an incentive in the first wave are less likely to participate in a subsequent wave in which incentives are no longer offered.

Data Linkage Consent
The stamp and postcard incentive groups have a slightly lower linkage consent rate than the control group (see Table 4). However, those differences are not statistically significant. We thus find no support for our hypothesis that stamp or postcard incentives have a positive effect on the propensity to consent to linking survey and administrative data.

Data Collection Cost

Table 5 summarizes the data collection cost per (1) invitation; (2) survey access; (3) respondent; and (4) additional respondent recruited with an incentive, by experimental group. Compared with the control group, the cost per invitation increases in both incentive groups. As each invited individual received an upfront incentive, we see an increase from €0.53 to €1.53 (189%) for the stamp incentive and from €0.53 to €0.62 (20%) for the postcard incentive. While the cost per survey access tells us how much we must spend to recruit one case that at least accessed the survey, the cost per respondent tells us how much we must spend to recruit one complete case for analysis. Stamp incentives increase the cost per survey access from €8.08 to €14.22 and the cost per respondent from €9.08 to €15.81. The costs for the control and postcard groups are similar: with postcards, the cost per survey access slightly decreases from €8.08 to €7.99, while the cost per respondent slightly increases from €9.08 to €9.23. Without incentives, we recruited 146 respondents; with the stamp incentive, we recruited 242 respondents, i.e., 96 additional respondents. Using a prepaid incentive, we incentivize not only the 96 additional cases but also the 146 cases that we would have recruited without an incentive. For the stamp group, each additional respondent costs €25.64; for the postcard group, €10.00.

Conclusion
In this article, we investigated the effects of gift incentives (postage stamps and postcards) on participation in a web survey with a postal invitation using an experimental design. Individuals were sampled from the IEB. We analyzed the effect of prepaid incentives on four dimensions: (1) response rates; (2) response rates in a subsequent wave; (3) data linkage consent; and (4) data collection costs.
In line with previous studies using gift incentives, prepaid stamp incentives positively affect response rates compared with no incentive (10.8% vs. 6.6%). In contrast, prepaid postcard incentives did not result in a statistically significant increase in the response rate (7.8% vs. 6.6%).
We used German administrative data (IEB) to identify differential effects of our incentives on response rates within subgroups. Stamp incentives positively affect response rates in most subgroups, while some differences were too small to identify statistically significant effects. Future research is needed to draw a reliable conclusion of stamp incentives within subgroups. Postcard incentives do not affect subgroup response rates differentially.
Regarding the subsequent wave, the stamp and control groups yielded the exact same cumulative response rate (i.e., the same number of respondents participated in both waves in both groups). Therefore, using a stamp incentive in the first wave does not affect the cumulative response rate. Using a postcard incentive even decreases the response rate in the subsequent wave and the cumulative response rate.
Data privacy laws in Europe require researchers to ask respondents for data linkage consent when combining several data sources. Any means that increase the linkage consent rate increases the analysis potential of the data. As incentives initiate reciprocal behavior, we expected our incentive groups to have a higher linkage consent rate. However, we find no such effects.
Finally, we explored the effect of our incentives on data collection costs. Using a stamp incentive increases the cost per respondent by 74% and the postcard incentive by 2% compared with no incentive. From a cost perspective, it would have been more beneficial to draw a larger sample to recruit the same number of respondents. However, in a setting where the size of the invited sample is limited, the cost increase using stamp incentives may be justified to maximize the response rate.
Our study contributes to the existing literature on gift incentives for web surveys. Especially for small-budget surveys, stamps can be a valuable alternative to a €5 bill or a voucher, as they are available worldwide, inexpensive, and have no requirements for use. However, the perceived utility of stamps may vary between countries, depending, for example, on the level of digitization and the social value attributed to a letter. Cross-national studies could shed more light on the generalizability of our findings. Also, the comparison between stamps and postcards reveals the importance of carefully reflecting on which incentive to choose. The effectiveness of the stamp incentive used in our experiment derives, in our opinion, from its special features: a stamp stands out through its monetary character, while its small monetary value is consistent with its use in everyday life. Moreover, a nicely designed set of stamps creates an inviting impression.