Research Article

‘We are looking for people like you’ – new technique of social influence as a tool of improving response rate in surveys

Article: 2316348 | Received 17 Apr 2023, Accepted 31 Jan 2024, Published online: 29 Feb 2024

ABSTRACT

A low response rate in surveys makes the research more expensive and time consuming, but it also, or even more importantly, constitutes a major methodological problem. Therefore, researchers use all kinds of measures in order to increase the response rate. This article describes four experiments (three field studies and one on-line experiment) designed to test a specific technique intended for this purpose. The technique consists in suggesting to a respondent that the study requires individuals with a certain rare quality that they have. In the first study, which was conducted during the pandemic, respondents were told that the study required people wearing exactly the same type of face mask as theirs (or people without a face mask). In the second study, tall people were asked to complete a survey. Some of them were told right away that only tall individuals were considered suitable candidates for the study. In both cases, this technique significantly improved the response rate. However, the employed technique proved ineffective in both Study 3 and Study 4. In these instances, participants likely perceived that their refusal would not pose any inconvenience to the researchers. Consequently, we can posit that the efficacy of the ‘we are looking for people like you’ social influence technique hinges on respondents feeling compelled to respond affirmatively to the research invitation. The assumption is that declining may potentially create challenges for the researchers, prompting a sense of obligation among participants.

Statement of significance

A low response rate is a serious problem – not only methodological but also practical. Survey researchers try to counteract low response rates in various ways. In this article, we propose a solution for PAPI researchers. We analyze a social influence technique that makes it easy to increase the percentage of people willing to fill in questionnaires. This technique (which we named ‘we are looking for people like you’) turns out to be an effective tool for increasing the response rate; at the same time, it is cheap, requires no additional tools and preserves the structure of the results obtained.

Introduction

Most probably, the first survey in history was conducted in 1788. Sir John Sinclair sent out a questionnaire to the ministers of all the parishes of the Church of Scotland. It took 23 reminders, but he managed to achieve a 100% response rate (De Leeuw, Citation2005, p. 233). Gradually, survey-based studies have become more widely used. For researchers, they have become the main source of knowledge of attitudes, values and beliefs held by various people, whereas practitioners have started to apply surveys in order to assess the popularity of particular political parties, brand awareness or people’s purchasing preferences. However, achieving a 100% response rate, as John Sinclair did, has been very rare. In fact, in the case of most studies, the rate was usually significantly lower (e.g., Baruch & Holtom, Citation2008; Booker et al., Citation2021; Wu et al., Citation2022).

Obviously, the achieved response rate is far from 100%, as researchers have to rely on the motivation of the respondents they ask to complete their surveys. Needless to say, not everyone feels like doing so, not everyone has the time to spare, some people forget that they had agreed to do it, etc. Groves et al. (Citation1992) presented an interesting overview of respondents’ reasons for active participation in a study. These are (1) societal factors related to the context in which the participation is requested, (2) survey design quality, (3) socio-economic characteristics of the sample drawn from the target population, (4) sociodemographic characteristics of the interviewer, and (5) the interaction between the respondent and the interviewer. They also pointed out a number of psychological factors, such as the respondent’s willingness to act on request, tendency to provide help, and propensity to alter opinion on some subjects. Researchers want to achieve a high response rate not only because this way they can obtain larger samples, greater statistical power and smaller confidence intervals around sample statistics, but predominantly because individuals who refuse to take part in a study may differ, in various respects, from those who agree. This type of nonrandom nonresponse can skew sample data and lead to under-representation of certain groups. Regarding mail surveys with low response rates, Fowler (Citation2002) stated: “In such instances, the final sample has little relationship to the original sampling process; those responding are essentially self-selected. It is very unlikely that such procedures will provide any credible statistics about the characteristics of the population as a whole” (pp. 41–42). It is thus unquestionable that nonresponse bias is a deadly blow to both the reliability and validity of survey study findings.

Consequently, researchers conducting survey studies undertake various (more or less effective) measures aimed at increasing the response rate. Researchers have tested the role of such contact-related factors as preceding the dispatch of the questionnaire with an e-mail invitation to participate in the study (e.g., Kaplowitz et al., Citation2004; Keusch, Citation2015) or following the dispatch of the questionnaire with a kind reminder about completing it (e.g., Harrison et al., Citation2019; Keusch, Citation2015; Saleh & Morgan, Citation2017). Numerous studies were devoted to providing subjects with additional motivation by offering money or other prizes for completing the questionnaire (e.g., Butler et al., Citation2016; Dillman et al., Citation2014; Szelényi et al., Citation2005), although in the case of specific groups highly interested in the problems investigated in the questionnaire, an appeal for altruistic involvement in the study may be more effective than material prizes (Conn et al., Citation2019). It was also found that various research methods may be more or less convenient for particular respondents. Some are more willing to complete an online questionnaire, while others prefer traditional mail. A mixed-mode approach, combining various methods of reaching the respondents and offering them the ways of participation that are most convenient from their perspective, significantly increases the response rate (see Fincham, Citation2008 for a review).

Moreover, the subject of the study may have an impact on the number of completed surveys. Through a series of experiments, Groves et al. (Citation2006) demonstrated that there are three major factors which significantly affect the response rate: (1) the respondent’s interest in the subject of the survey study, (2) his/her response to the researcher or institution conducting the study, and (3) the use of remuneration (e.g., financial or non-financial) for the time and inconvenience of participation in the study. However, it should be noted that in most of the studies conducted by Groves et al., surveys were administered via traditional mail (a rather uncommon practice today) or over the telephone. Both of these methods differ from face-to-face survey administration based on a structured or semi-structured interview in that they lack direct contact with the respondent.

It also appears that knowledge of the psychology of social influence may be helpful when it comes to increasing the response rate. Social psychologists have been studying methods of increasing compliance without pressure for many years now. The application of particular techniques increases the probability of people complying with the requests they receive. In such studies, the requests usually pertain to donating small amounts of money collected during various charity events, becoming a volunteer or signing a petition (see Cialdini, Citation2021; Dolinski, Citation2016; Dolinski & Grzyb, Citation2023 for reviews). However, sometimes people are asked to take part in a survey-based study. Gueguen and Pascual (Citation2005) assumed that freedom and agency are among the most important human needs. People want to feel that they are the masters of their decisions: they like it better than being persuaded into doing something by others. When we ask someone to take part in a survey study, they often feel as if they are being persuaded into doing it. The researchers demonstrated that when passersby were asked to participate in a short survey regarding local salesmen and craftsmen, 75.6% of subjects agreed to answer the questions in standard conditions. However, when the request was followed by ‘obviously, the decision is yours: you can agree or decline,’ which stressed the agency of the respondent, as many as 90.1% of the subjects agreed to comply with the request. Carpenter (Citation2014), on the other hand, decided to capitalize on the fact that in most cases, people feel more motivated if the goal is close at hand. He asked passersby on a university campus for 10 minutes of their time to complete a survey. As it turned out, 60% of the individuals agreed to comply with his request. However, when the request was followed by ‘I only need one more person for my studies,’ as many as 80% of the subjects agreed to participate in the survey.
Another study focused on testing the ‘dialogue involvement’ technique and demonstrated that passersby are more willing to complete a survey if the request is preceded with a dialogue on a trivial subject matter (small talk). The effect of the dialogue involvement technique was even stronger if the interviewer started the conversation by touching on topics that were very important to his interlocutor, instead of the small talk, and explicitly expressed opinions consistent with those of the subject. On the other hand, if the interviewer voiced opinions contrary to those of the subject, their willingness to complete the survey fell below the level of the standard conditions (Dolinski et al., Citation2001).

In this article, we would like to present a completely new social influence technique focused on increasing the response rate in survey studies. It should be pointed out that whenever a subject is asked to do something, they have reasons to comply with the request as well as reasons to decline it. When the request pertains to taking part in a survey study, the subject may decline, for example, because they may be convinced that the interviewer can just as well ask someone else to do it (‘Why do I have to waste my time?’). However, the situation becomes different if the interviewer tells the subject that he is conducting a study on a rather specific and small population. In such a case, the subject may think that their refusal could put the interviewer in a difficult position, as finding suitable individuals, such as themselves, is not an easy task. As a result, the subject may feel, in a way, obligated to agree, because the ‘let someone else do it, the interviewer will find another person without a problem’ line of reasoning is no longer available. The hypothesis that the response rate in survey studies increases when the interviewer suggests that they are specifically looking for individuals such as the person they are addressing was initially tested in two field studies.

Study 1

Method

The study was conducted during the COVID-19 pandemic, when wearing face masks in public buildings (stores, cinemas and offices) was mandatory. Since ignoring this obligation entailed no repercussions in Poland, a certain part of the population – specifically, individuals who questioned the very existence of COVID-19 or its severe consequences – refused to wear face masks. Those who did wear face masks could either go with a standard, disposable kind (white or blue and white) or a colored mask (e.g., red or yellow). There were also those who wanted to be original – their face masks were colorful or featured a pattern (e.g., checkered or flowered).

The study was conducted in a large shopping center. Randomly-selected individuals of both sexes were approached by an experimenter’s assistant, who asked (in the control group) the following question:

Excuse me, I’m a psychology student and I need to conduct a survey regarding wearing face masks, would you agree to answer a few questions?

In the experimental group, the question was phrased differently and it was modified according to the type (or lack) of the face mask worn by the subject.

Excuse me, I’m looking for people with a colorful/patterned/disposable face mask (or without a face mask), just like you. I’m a psychology student and I need to conduct a survey regarding wearing face masks, would you agree to answer a few questions?

Therefore, even though we approached various individuals – without a face mask, with a standard or an original face mask, etc. – in each case, the subject was meant to feel that they belonged to a small group targeted by the study.

The sample size was calculated with G*Power software: assuming effect size = 0.25, alpha error probability = 0.05 and power = 0.90, the required sample was 203 individuals.
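The reported a-priori calculation can be approximately reproduced in Python. Note that the paper does not state the degrees of freedom of the planned test; N = 203 is consistent with a chi-square test with df = 2 (three categories), which is an assumption on our part:

```python
# Reproducing the a-priori sample-size calculation reported for Study 1.
# Assumption: a chi-square test with df = 2 (i.e., 3 bins), since that is
# what yields the reported N = 203; the paper does not state the df.
import math
from statsmodels.stats.power import GofChisquarePower

n = GofChisquarePower().solve_power(
    effect_size=0.25,  # medium effect (Cohen's w)
    alpha=0.05,        # alpha error probability
    power=0.90,        # desired statistical power
    n_bins=3,          # 3 categories -> df = 2 (our assumption)
)
print(math.ceil(n))  # -> 203
```

With df = 1 (a plain 2 × 2 design) the same inputs would require only about 169 participants, so the larger figure suggests a multi-category design was assumed at the planning stage.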

Results

Ultimately, 242 individuals participated in the study (121 individuals in each group, i.e., the experimental and the control group). An identical number of men and women were tested. The results are presented in .

The relation proved to be statistically significant (χ2 = 8.044, p < .05, φ = .182).
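The reported test statistics can be reconstructed from the response rates stated later in the article (44.6% vs. 62.8% of 121 participants per group, i.e., 54 vs. 76 agreements); the 2 × 2 counts below are therefore an inference, not the authors' raw data:

```python
# Reconstructing Study 1's chi-square test from the reported response
# rates (44.6% vs. 62.8% of n = 121 per group); counts are inferred.
import math
from scipy.stats import chi2_contingency

table = [[54, 67],   # control: agreed, refused (44.6% of 121)
         [76, 45]]   # experimental: agreed, refused (62.8% of 121)

chi2, p, dof, _ = chi2_contingency(table, correction=False)
phi = math.sqrt(chi2 / 242)  # phi coefficient for a 2x2 table
print(round(chi2, 2), round(p, 3), round(phi, 3))  # -> 8.04 0.005 0.182
```

Yates' continuity correction is disabled (`correction=False`) because the uncorrected statistic is what matches the reported χ2 = 8.044 and φ = .182.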

Discussion

Our findings demonstrated the effectiveness of the applied technique – in the group where individuals were informed about their unique quality, more subjects agreed to take part in the proposed survey. However, we must not forget about the specific nature of the request they were approached with. The question of wearing a face mask to cover one’s mouth and nose or refusing to wear it was, in many countries around the world, a sticking point stirring up emotions and causing conflicts. Therefore, we decided to check if the effectiveness of the technique described herein would remain unchanged in the case of a request referring to a quality that does not evoke such high emotions.

We also wanted to check if the answers provided by the subjects in response to a few standard questions would vary in the control conditions and in the conditions where the ‘I am looking for someone like you’ technique is used. Due to the lack of sufficient theoretical presumptions, we did not formulate any predictions.

Study 2

Method

We decided to conduct the survey among tall individuals. According to official data (Raport Gisplay, Citation2018), the average height of women and men in Poland is, respectively, 164.6 cm and 177.3 cm. Under our criteria, people at least ½ SD above the average were considered tall. Therefore, we used the following threshold heights: 169 cm for women and 182 cm for men.

Assuming, prior to the study, an effect size of 0.3, an alpha level of 0.05 and power of 0.90, we identified a required sample size of 162 individuals. As the study was conducted independently by two different interviewers and we wanted each of them to test the same number of men and women, we set the final sample size at 164 individuals.

The study was conducted independently by two young (20-year-old) undergraduates – a woman and a man. The task of each interviewer was to approach every fifth person meeting the applicable height criteria (as judged by the interviewer) who was walking alone in a large shopping center in one of the largest cities in Poland (Wrocław). Each interviewer examined 82 individuals (41 women and 41 men). In the control conditions, the subjects were addressed in the following manner:

‘Excuse me, I’m an undergraduate conducting a survey study. Would you agree to answer a few questions for the survey?’ In the experimental conditions, the request was phrased as follows: ‘Excuse me, I’m an undergraduate conducting a survey study and I’m specifically looking for tall individuals like you. Would you agree to answer a few questions for the survey?’ If the subject agreed, they were asked the following three questions: ‘How much do you spend on food each month?’, ‘How much time do you spend watching TV every day?’, and ‘How often do you check the balance of your bank account in a month?’

Results

The initial analyses demonstrate that both interviewers varied in terms of their effectiveness. The female interviewer was more successful in terms of persuading the subjects into participating in a short survey study (87.1%) than the male one (64.6%) – χ2 = 6.09, p < .015, φ = .19. Nevertheless, in both cases the subjects were more willing to participate in the study if they were addressed with the ‘I’m specifically looking for tall individuals like you’ phrase. When it was the woman who asked the subjects to participate in the study, the use of the phrase caused the percentage of individuals who agreed to comply with the request to increase from 70.7% to 95.1% (χ2 = 6.61, p < .011, φ = .28) and when the request was formulated by the male interviewer, the numbers were respectively: 53.7% and 75.6% (χ2 = 4.32, p < .038, φ =.23). Generally, using the above-mentioned phrase resulted in an increase in the response rate from 62.2 to 84.1%. (χ2 = 10.06, p < .002, φ = .25).

The individuals who agreed to take part in the survey study were asked three questions. In none of the cases were differences observed in terms of the answers between the groups recruited in the standard manner and those with the use of ‘I’m looking for someone like you’ technique. Average and standard deviation values for answers to these questions can be found in .

Discussion

In this case, similar to Study 1, the use of the ‘we are looking for people like you’ technique also contributed to a significant increase in response rate. Here, too, the quality highlighted in the message the potential subject was addressed with pertained to a certain element of one’s appearance. However, while in Study 1 this element was under the control of the given individual (it was their previous decision whether or not to wear a face mask and if so, what kind), this time around we referred, in our message, to something that was completely beyond the control of the individual (height). This demonstrates that the ‘we are looking for people like you’ technique is, at least to some extent, universal in its use (but see: General Discussion section). In the case of Study 2, we asked the participants three different questions to see if the structure of their answers was the same in both the standard and experimental conditions (i.e., where the tested social influence technique was used). We found that the structure of their answers to all three questions did not differ regardless of the conditions. Therefore, it can be concluded that, with reference to our study, we only obtained an increase in the response rate but we found no grounds to claim that the relatively low response rate (62.2%) in the standard conditions was caused by a research artifact in the form of selection of a specific sample.

Nevertheless, the psychological mechanism that underlies the efficacy of the ‘we are looking for people like you’ technique remains unclear. Our hypothesis posits that the technique operates on the premise that participants sense an obligation to engage, given the perceived difficulty the interviewer might face in readily identifying an alternative candidate. In other words, participants may think that their refusal to participate in the survey could put the interviewer in a difficult position. An alternative explanation, however, is that the participants’ sense of self-importance is increased by the reference to something that is unique to them.

To ascertain which of these mechanisms truly underlies the effectiveness of the technique under examination in this article, we devised an experiment. We designed the study to minimize the perception that declining participation would inconvenience the researcher significantly. This scenario arises when the survey participant readily observes numerous others in their vicinity who also meet the interviewer’s selection criteria. If the approach proves consistently effective under these conditions, it suggests that the technique operates without necessitating the interviewee to believe that the researcher would encounter difficulties in finding an alternative participant if they were to decline. Consequently, it implies a higher likelihood that the efficacy of this technique stems from the interviewee feeling a sense of uniqueness or specialness.

Study 3

Method

The research was undertaken by a 23-year-old woman over three consecutive days (Monday, Tuesday, and Wednesday) at the Victoria Shopping Center in Wałbrzych, Poland. The interviewer’s task involved approaching every fifth individual walking alone with the inquiry: ‘Excuse me, I’m an undergraduate conducting a survey study. Would you be willing to answer a few questions for the survey?’

In the experimental conditions, the request took a specific form: ‘Excuse me, I’m an undergraduate conducting a survey study and I’m specifically seeking individuals who are engaged in shopping on [mentioning the respective day of the week – Monday, Tuesday, or Wednesday]. Would you be willing to answer a few questions for the survey?’ If the participant agreed, the interviewer clarified that the aim was to assess the percentage of people agreeing to respond to survey questions in such a context and expressed gratitude to the respondent for their cooperation.

In both conditions, 75 people were approached (N = 150); most of them were women (n = 92).

Results

The results revealed a nearly identical willingness among participants to partake in the survey in both conditions of Experiment 3. In the control conditions, 64 individuals (85.33%) agreed, while in the experimental conditions, 62 people (82.67%) agreed (χ2 = .02; p > .65, φ = .04).

Discussion

The observed results suggest that the ‘we are looking for people like you’ technique is ineffective in scenarios where the interviewed individual perceives that refusing would cause the interviewer no inconvenience, since other participants can easily be recruited. In the bustling environment of the shopping center, both the interviewer and the participant remained surrounded by numerous other people engaged in shopping on that particular day. This implies that the efficacy of the analyzed technique lies in instilling a belief in the respondent that their refusal would significantly inconvenience the interviewer; in the absence of this crucial element, the technique proved entirely ineffective. In other words, the technique’s success might depend more on the respondent’s perception that declining would significantly inconvenience the interviewer than on the sense of uniqueness evoked by the phrase ‘we are looking for people like you.’ Nonetheless, it might be premature to definitively assert that the mechanism contributing to the effectiveness of the analyzed technique does not stem from the individual’s perception of being distinguished in some manner. First of all, there exists serious uncertainty regarding whether the experimental context we established truly elevated the participants’ sense of self-importance through something genuinely unique to them. While respondents may have felt distinguished for shopping on the specific day the interviewer targeted, this might not have been sufficient to evoke a profound sense of specialness.

Therefore, we have chosen to conduct an additional empirical study to thoroughly explore the reasons underlying the technique’s efficacy in a different context. This time, the experiment departed from a field study format and instead unfolded on the Internet.

We opted to engage individuals actively using a thematic internet portal. Half of this group received a conventional survey participation invitation, while the other half received an invitation using the ‘we are looking for people like you’ approach. Our initial two experiments revealed that employing the ‘we are looking for people like you’ approach significantly enhances the likelihood of recruiting respondents in face-to-face interactions. Transitioning to an online survey format, our objective was to assess whether a similar effect could be observed. The second, and more crucial, aim of our experiment was to understand the motivations behind participants’ agreement to respond to our survey invitation in both conditions.

Study 4

Method

Participants in our experiment were subscribers to the electronic newsletter of a local university in Poland. The sample consisted of people who had participated in various events for the general public and followers of the university’s design-related social media. All participants were Polish-speaking and presumably current Polish residents. Participants were recruited through an e-mail invitation, sent only once to each participant.

Subscribers were randomly split: one subset (N = 11,019) received a standard invitation, stating, ‘As a team of researchers from the University of [XXX], we kindly request your participation in a brief survey.’ The remaining subscribers (N = 11,018) received an invitation with the following content: ‘We are looking for individuals like you – inviting those with an interest in design to participate in a study conducted by researchers from the University of [XXX].’ All participants were assured that the survey would require no more than 3 minutes and would be entirely anonymous.

Upon agreement to participate, respondents clicked on the provided website link. They then proceeded to answer five questions related to design, followed by an inquiry into the motivations behind their decision to participate in the study.

Participants provided feedback on the validity of the following statements:

  1. I derive enjoyment from participating in surveys.

  2. I consented because the researchers were seeking individuals with characteristics similar to mine.

  3. I lacked alternative activities.

  4. I consented due to the researchers’ potential challenges in acquiring an adequate number of respondents for result analysis.

  5. I agreed because I felt honored by the request extended to me.

  6. I believed that I was the suitable individual to undertake this survey.

The respondents were asked to assess the truthfulness of these statements on an eleven-point scale (0–10) anchored at the extremes (0 = completely untrue, 10 = definitely true).

Results

In the standard request condition, 479 out of 11,019 individuals (4.35%) agreed to participate in the study. In the condition where the request included the phrase ‘we are looking for people like you,’ this rate was nearly identical: 461 out of 11,018 (4.18%) agreed. Notably, the difference in respondents’ willingness to complete the survey between these conditions was statistically insignificant (χ2 = .3598, p = .549).
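This null comparison can be checked with a two-proportion z-test, whose squared statistic equals the uncorrected chi-square for the 2 × 2 table. The counts come straight from this section; small rounding differences from the reported .3598 are expected:

```python
# Two-proportion z-test on Study 4's response rates; z**2 equals the
# uncorrected chi-square statistic for the underlying 2x2 table.
from statsmodels.stats.proportion import proportions_ztest

agreed = [479, 461]        # standard vs. 'people like you' invitation
invited = [11019, 11018]   # number of invitations sent per condition

z, p = proportions_ztest(agreed, invited)
print(round(z**2, 2), round(p, 3))  # -> 0.36 0.549
```

At these sample sizes the test had ample power, so the null result is unlikely to reflect an underpowered comparison.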

Despite the ‘we are looking for people like you’ technique proving ineffective in recruiting respondents for survey research, we chose to delve into the responses of both respondent groups regarding the reasons for their consent to participate in the research.

The outcomes of these comparisons are detailed in . Notably, individuals tested in control conditions exhibit a stronger conviction than those in the experimental group that their decision was influenced by their intrinsic liking for participation in research (Q1). Conversely, individuals invited using the ‘we are looking for people like you’ formula attributed a more substantial role than those in the control group to reasons such as researchers actively seeking respondents like them for research (Q2), feeling distinguished by the researchers’ request to participate in the research (Q5), and believing they were the suitable candidates for the research (Q6).

Remarkably, the two groups displayed no significant difference in their agreement that the reason for their consent was the potential difficulty researchers might face in recruiting enough participants for their research (Q4), .

Discussion

Regrettably, the social influence technique, ‘we are looking for people like you,’ did not yield the expected results in this experiment. This outcome may be attributed to respondents’ awareness of a substantial participation pool (over 20,000 individuals) engaged in various online activities related to design, organized by the university. Consequently, there might have been a diminished sense that declining research participation would pose challenges for researchers in assembling an adequate participant cohort.

This interpretation finds indirect support in the absence of discernible differences between the two groups in their responses to the question about the researchers’ potential recruitment difficulties (Q4). However, it is noteworthy that respondents from the experimental group more frequently cited feeling honored as a primary reason for agreeing to complete the survey.

It’s crucial to underscore that while the motivation of feeling distinguished is notably present in the group invited using the phrase ‘we are looking for people like you,’ it does not increase the tendency of surveyed individuals in these conditions to participate in the survey. Our interpretation is that this pattern of results suggests the foundation for the effectiveness of the analyzed technique lies in the person being asked feeling that refusal might create inconvenience for the requester. As observed in our Study 3, when this sense was absent, the technique proved ineffective.

General discussion

We obtained very consistent results in Study 1 and Study 2, where the dependent variable was participants’ consent to take part in a survey study: the response rate was significantly increased by including the ‘we are looking for people like you’ phrase in the formulated request. In the first experiment, the response rate increased from 44.6% to 62.8%; in the second, from 62.2% to 84.1%. However, we must emphasize very clearly that the results of Experiment 3 and Experiment 4 are not consistent with the outcomes achieved in Experiments 1 and 2. This raises serious concerns about the generalizability of the effect across different contexts and manipulations. Let us examine the reasons that may, at least potentially, account for this state of affairs.

Experiment 3 was designed in such a way that the participant was aware of being surrounded by a significant number of individuals meeting the criteria for participation in the survey. It turned out that using the phrase ‘we are looking for people like you’ was entirely ineffective in this context. Consequently, one might infer that the underlying mechanism for its effectiveness lies in the participant’s belief that refusing to complete the survey would cause inconvenience to the researcher. In accordance with this rationale, it is not surprising that under conditions where participants perceive that their refusal will not pose issues for the researcher, as an alternative person can be readily approached, the technique loses its efficacy entirely.

The hypothesis positing the effectiveness of the ‘we are looking for people like you’ technique due to the participant’s belief that refusing to engage in the research would inconvenience the researcher was tested in an online experiment (Study 4). However, we have been unable to establish the efficacy of this technique in conditions where it is not employed in direct contact with the subject but is instead applied in online research. This difficulty may be attributed to the challenge of inducing sympathy in participants toward the researcher in online conditions, where it is more arduous to convey the researcher’s struggle in finding suitable individuals meeting specific criteria.

It is noteworthy that, although we were unable to demonstrate the effectiveness of the ‘we are looking for people like you’ technique in Experiment 4, individuals subjected to this technique, when asked about the motives behind their participation in the study, frequently expressed reluctance to cause problems for the researcher. This pattern of results in Study 4 therefore indirectly supports our thesis that a key factor in the success of the ‘we are looking for people like you’ technique is the participants’ reluctance to pose challenges for the researcher.

The results of the first two studies, indicating that in certain cases, this technique can significantly increase attendance, have important practical implications. Its application can both shorten the time needed for conducting survey research and reduce its costs. However, it is crucial to be aware of significant limitations regarding the feasibility of utilizing this technique in practice.

First and foremost, it is essential to recognize that the characteristic emphasized through the statement ‘we are looking for people like you’ was either neutral (Study 1) or positive (Study 2). It is uncertain what the outcome would be for a quality that is not widely considered favorable (e.g., being short in the case of men or being obese in the case of both sexes); for ethical reasons (i.e., we did not want to expose our subjects to discomfort) we did not test this. The usability of the proposed technique is therefore limited to certain situations and is of little practical value in the case of studies on the general population (e.g., adults in a given country). Nevertheless, if our objective is to test a specific population and this specificity is connected with a quality that is at least relatively rare and not considered negative, the use of this very simple technique, which entails no additional costs, may be very helpful.

Furthermore, our Study 3 underscores that the technique ceases to be effective when respondents lack a genuine reason to believe that researchers would have difficulty locating individuals similar to them. Hence, the utility of the technique is constrained to situations where there is a bona fide interest in studying individuals possessing an at least relatively rare characteristic. The inefficacy of the ‘we are looking for people like you’ technique in our Study 4 could be attributed to at least two possible reasons. First, respondents may not have perceived themselves as belonging to a specific group that is challenging for the researcher to reach. Second, it is plausible that, in the context of online research, it is difficult to evoke the subjects’ compassion toward a researcher who may encounter difficulties if invited individuals decline participation.

Nevertheless, it cannot be ruled out that the ineffectiveness of the ‘we are looking for people like you’ technique in online conditions stems from entirely different factors. Prior research indicates that individuals find it easier to decline requests not presented in direct physical, real-life contact (Dolinski, 2016). It is conceivable, then, that even if online respondents perceive that their refusal could inconvenience researchers, this might not suffice to motivate them to consent to participate in survey research. In such a scenario, the efficacy of the technique under examination appears to be confined to studies involving direct contact between the interviewer and the participant. Consequently, future research should prioritize investigating the applicability of the ‘we are looking for people like you’ technique in online research.

In summary, we have empirically demonstrated the effectiveness of the ‘we are looking for people like you’ social influence technique in offline conditions, particularly when the interviewee perceives that their refusal to participate would pose challenges for the researcher. It should be noted, however, that our results only partially support the hypothesized mechanism: they can be considered as indirectly supporting the notion that the participant’s concern for the researcher’s well-being underlies the effectiveness of our technique.

To comprehensively address questions about the effectiveness conditions of the ‘we are looking for people like you’ technique, the psychological mechanism driving its effectiveness, and the reasons for its ineffectiveness in online research, additional extensive research is warranted.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

The data that support the findings of this study are available from the corresponding author, TG, upon reasonable request.

References

  • Baruch, Y., & Holtom, B. C. (2008). Survey response rate levels and trends in organizational research. Human Relations, 61(8), 1139–1160. https://doi.org/10.1177/0018726708094863
  • Booker, Q. S., Austin, J. D., & Balasubramanian, B. A. (2021). Survey strategies to increase participant response rates in primary care research studies. Family Practice, 38(5), 699–702. https://doi.org/10.1093/fampra/cmab070
  • Butler, B. J., Hewes, J. H., Tyrrell, M. L., & Butler, S. M. (2016). Methods for increasing cooperation rates for surveys of family forest owners. Small-Scale Forestry, 16(2), 169–177. https://doi.org/10.1007/s11842-016-9349-7
  • Carpenter, C. J. (2014). Making compliance seem more important: The “just-one-more” technique of gaining compliance. Communication Research Reports, 31(2), 163–170. https://doi.org/10.1080/08824096.2014.907144
  • Cialdini, R. B. (2021). Influence, new and expanded. The psychology of persuasion. Harper Business.
  • Conn, K. M., Mo, C. H., & Sellers, L. M. (2019). When less is more in boosting survey response rates. Social Science Quarterly, 100(4), 1445–1458. https://doi.org/10.1111/ssqu.12625
  • De Leeuw, E. (2005). To mix or not to mix data collection modes in surveys. Journal of Official Statistics, 21(2), 233–255.
  • Demaio, T. J. (1980). Refusals: Who, where and why. Public Opinion Quarterly, 44(2), 223–233. https://doi.org/10.1086/268586
  • Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, phone, mail, and mixed-mode surveys the tailored design method (4th ed.). Wiley.
  • Dolinski, D. (2016). Techniques of social influence. The psychology of gaining compliance. Routledge.
  • Dolinski, D., & Grzyb, T. (2023). 100 effective techniques of social influence. When and why people comply. Routledge.
  • Dolinski, D., Nawrat, M., & Rudak, I. (2001). Dialogue involvement as a social influence technique. Personality and Social Psychology Bulletin, 27(11), 1395–1406. https://doi.org/10.1177/01461672012711001
  • Fincham, J. E. (2008). Response rates and responsiveness for surveys, standards, and the journal. American Journal of Pharmaceutical Education, 72(2), 43. https://doi.org/10.5688/aj720243
  • Fowler, F. J. (2002). Survey research methods (3rd ed.). Sage Publications.
  • Groves, R. M., Cialdini, R. B., & Couper, M. P. (1992). Understanding the decision to participate in a survey. Public Opinion Quarterly, 56(4), 475–495. https://doi.org/10.1086/269338
  • Groves, R. M., Couper, M. P., Presser, S., Singer, E., Tourangeau, R., Acosta, G. P., & Nelson, L. (2006). Experiments in producing nonresponse bias. Public Opinion Quarterly, 70(5), 720–736. https://doi.org/10.1093/poq/nfl036
  • Gueguen, N., & Pascual, A. (2005). Improving the response rate to a street survey: An evaluation of the “but you are free to accept or to refuse” technique. Psychological Record, 55(2), 297–303. https://doi.org/10.1007/BF03395511
  • Harrison, S., Henderson, J., Alderdice, F., & Quigley, M. A. (2019). Methods to increase response rates to a population-based maternity survey: A comparison of two pilot studies. BMC Medical Research Methodology, 19(1), 65. https://doi.org/10.1186/s12874-019-0702-3
  • Kaplowitz, M. D., Hadlock, T. D., & Levine, R. (2004). A comparison of web and mail survey response rates. Public Opinion Quarterly, 68(1), 94–101. https://doi.org/10.1093/poq/nfh006
  • Keusch, F. (2015). Why do people participate in web surveys? Applying survey participation theory to internet survey data collection. Management Review Quarterly, 65(3), 183–216. https://doi.org/10.1007/s11301-014-0111-y
  • Raport Gisplay. (2018). Mapa wzrostu [Height map]. https://gisplay.pl/gis/6235-mapa-wzrostu.html
  • Saleh, A., & Morgan, K. B. (2017). Examining factors impacting online survey response rates in educational research: Perceptions of graduate students. Journal of MultiDisciplinary Evaluation, 13(29), 63–74. https://doi.org/10.56645/jmde.v13i29.487
  • Szelényi, K., Bryant, A. B., & Lindholm, J. A. (2005). What money can buy: Examining the effects of prepaid monetary incentives on survey response rates among college students. Educational Research & Evaluation, 11(4), 385–404. https://doi.org/10.1080/13803610500110174
  • Wu, M.-J., Zhao, K., & Fils-Aime, F. (2022). Response rates of online surveys in published research: A meta-analysis. Computers in Human Behavior Reports, 7, 100206. https://doi.org/10.1016/j.chbr.2022.100206