MRA teamed with Moskowitz-Jacobs in 2008 to test components of email survey invitations and learn what survey introductions were most likely to elicit respondent cooperation.

(Published 01/28/08 // Revised 03/24/09)

INTRODUCTION
The survey introduction is considered a critical piece of the data collection process. It is often the first contact that a potential respondent has with the research organization, the point at which the individual learns the particular details of a research study, the decisive moment at which the decision to participate (or decline) is made, and the place where the trust relationship between respondent and researcher begins. This being the case, why have there been so few attempts to study survey introductions scientifically?

Much of the most compelling research on respondent cooperation takes the form of published experiments by organizations seeking to maximize in-house efficiency: one organization staggers incentive amounts for a 'regular' survey to understand the cost-benefit trade-off, another tests different caller-ID labels to observe which combination elicits the greatest participation, and so forth.

These efforts usually occur when opportunity presents itself and when the organization can perform the research at little or no cost. This means that the survey research profession often lies in wait for useful data and has little control over the focus and direction of basic research into survey methods. Fortunately, the field continues to grow and professionalize, and there are numerous academics and, since 1992, at least one association (CMOR/MRA) that dedicate resources to conducting basic research to promote survey methods.

BACKGROUND
Respondent cooperation presents a challenge for research. Survey research findings may be biased by incomplete participation (e.g., Babbie 2002; Groves 2006), that is, when selected persons refuse, are not reached and invited to participate, or otherwise do not complete the study, resulting in a response rate of less than 100%.

Concern about biased findings has led to the development of, and experimentation with, numerous techniques to minimize survey non-response. For example, the effects of telephone survey call-backs, refusal conversions, messages left on telephone answering machines, mail advance letters, survey incentives, and many other techniques have been studied to determine the factors that impact respondent cooperation (Sangster 2003).

Two main categories of circumstances that result in incomplete participation are situations where the sampled unit is not contacted and situations where the sampled unit is contacted but opts to refuse to participate (AAPOR 2006). Our research focuses on the latter of these situations: survey refusals.

Oftentimes, the respondent decides to accept or decline participation at the point of the survey introduction. Our research explores methods of optimizing messages to determine whether response rates for online panels can be improved by systematically designing survey introductions. Specifically, we employed an experimental research design that varied the messages and concepts shown to respondents in order to develop a ratings index describing the effect of individual phrases on respondent intent to participate.

METHODS
In this study design, we randomly generated survey introductions that were then delivered to respondents through the "Ideamap" software program developed by Moskowitz-Jacobs, Inc. The survey introductions were created by combining interchangeable phrases.

Via an online survey, respondents were instructed to rate each survey introduction on a 9-point scale, based upon how likely they would be to participate in the research study if given that introduction. Respondents could select any number from 1 to 9: a response of "1" indicated that the respondent was "not at all likely" to participate given the introduction, and a "9" indicated that they were "very likely." In total, each respondent rated 48 randomly generated survey introductions.

Each survey introduction rating was then transformed into a dummy variable and analyzed in an ordinary least-squares (OLS) regression model. The effect of each phrase within the survey introduction was isolated through the regression model, and the phrases were then labeled with their coefficient values and listed in the ratings index.

The data reported here come from 2 separate iterations of the same study design.  The first was fielded in 2006 (Martin and Moskowitz 2006) and the second was fielded in August 2007.

DEVELOPMENT OF SURVEY INTRODUCTIONS & PHRASES
To form fluid and logical survey introductions, we developed 6 categories, each representing a different type of information communicated to respondents during survey introductions. We selected the categories of information for this study based upon their relevance and applicability to a wide range of surveys.

For the latest iteration of the study (2007), the 6 categories were:

  1. introduction/basic appeal to participate,
  2. importance of participation,
  3. survey subject matter,
  4. flexibility/time burden of participation,
  5. incentive offered for participation, and
  6. assurances of privacy and data confidentiality. (Note: The 2006 iteration of the study included the category that communicated the simplicity/ease of participation to the respondent. In 2007, this category was replaced by a category which communicated privacy assurances to the respondent. This practice of rotating categories allows the research to expand into additional areas, and explore the impact of other information that might be included in survey introductions.)

Each of the six categories was then populated with 6 phrases, each representing a different facet of the category concept. For example, 6 phrases (messages) were developed that each represent a different level of time burden, to explore category #4.

At this stage of the research, both the importance of the broad ideas/concepts to the respondent and the perceived differences between the 6 messages in each category are unknown. The exploratory nature of the research is designed to allow the respondent to provide insight into the performance of phrases that might be typical of survey introductions. In total, 36 phrases were developed. Examples of phrases from the 6 categories include the following:

Intro/Basic Appeal

Your Opinion Counts!

You've been selected for a research study!

Importance of Participation

Your participation in this survey will help improve products and services

Your participation in this survey will help make improvements to a government program

Survey Subject Matter

We'd like to know your thoughts about gasoline prices

We'd like to know your thoughts about dairy products

Flexibility/Time Burden

We only need to ask you about 10 questions

This survey will only take a few moments of your time

Incentive Offered for Participation

Participate and $5 will be credited to the credit card of your choice

Get a free copy of the results after you participate!

Privacy/Data Confidentiality

As member of the Better Business Bureau we take your privacy seriously and will respect your confidentiality

Your privacy is important to us, your answers will be combined with others, and will never be linked with you personally

Each survey introduction drew 1 phrase at random from each of 3 or 4 of the 6 categories (themselves selected at random). Only 3 or 4 of the possible 6 categories were combined in each survey introduction in order to maintain statistical independence.

The categories function to ensure that each survey introduction makes sense intuitively, represents different ideas, and does not conflict with itself. If, for example, phrases were not pulled from separate categories, a respondent might encounter a survey introduction that included 2 or more conflicting phrases (e.g., 'we only need to ask you about 10 questions' and 'we only need to ask you about 25 questions').
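To make this combination logic concrete, the sketch below shows one way it could be implemented in Python. It is an illustration only, not the Ideamap implementation; the category names and abbreviated phrase lists are stand-ins for the study's 6 categories of 6 phrases each.

```python
import random

# Hypothetical stand-ins for the study's 6 categories x 6 phrases (36 total);
# only a few example phrases from the paper are shown per category.
CATEGORIES = {
    "intro": ["Your Opinion Counts!", "We need your help!"],
    "importance": ["Your participation in this survey will help improve products and services"],
    "subject": ["We'd like to know your thoughts about gasoline prices"],
    "burden": ["We only need to ask you about 10 questions"],
    "incentive": ["Participate and win one of 3 cash prizes"],
    "privacy": ["Your answers will remain completely confidential"],
}

def generate_introduction(rng: random.Random) -> list[str]:
    """Pick 3 or 4 of the 6 categories at random, then draw 1 phrase from
    each, so no introduction can contain 2 conflicting phrases from the
    same category."""
    n_categories = rng.choice([3, 4])
    chosen = rng.sample(sorted(CATEGORIES), n_categories)
    return [rng.choice(CATEGORIES[cat]) for cat in chosen]

rng = random.Random(0)
# Each respondent rated 48 such randomly generated introductions
# on a 9-point likelihood-to-participate scale.
introductions = [generate_introduction(rng) for _ in range(48)]
```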

CODING AND ANALYSIS OF DATA
Dummy variables were developed for regression analysis of the components of the survey introductions. Any survey introduction that yielded a rating of 1 to 6 was coded as a "0," meaning that the introduction would not result in interest in survey participation, and any survey introduction that yielded a rating of 7 to 9 was coded as a "1," meaning that the introduction would result in interest in survey participation.

The coding of responses from the bottom 2/3 (1 through 6) of the rating scale follows convention in survey research and assumes that these numbers encompass negative and neutral attitudes toward the survey introduction. Likewise, the top 1/3 of the scale (7 through 9) represents positive attitudes toward the survey introduction.
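As a minimal sketch, this top-box coding amounts to the following (an illustration; the study's actual processing pipeline is not described in this detail):

```python
def code_rating(rating: int) -> int:
    """Collapse the 9-point scale: 7-9 (top third, positive) -> 1;
    1-6 (bottom two-thirds, neutral/negative) -> 0."""
    if not 1 <= rating <= 9:
        raise ValueError("rating must be between 1 and 9")
    return 1 if rating >= 7 else 0
```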

The mathematical formula used to analyze the data is listed below:

$$\text{Rating} = k_0 + k_1(\text{Element 1}) + k_2(\text{Element 2}) + \cdots + k_{36}(\text{Element 36})$$

Each element in the formula represents a phrase used in the survey introductions. The coefficient values (k_1 through k_36) indicate the impact of those specific phrases/messages on respondent intent to participate. Thus, an element with a coefficient of X indicates that an additional X% of respondents (added to the constant k_0) expressed an interest in participating in the study when that message/phrase was included in the survey introduction.
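A minimal sketch of this estimation follows, run on simulated stand-in data (the real design matrix would come from the Ideamap output; the variable names and the built-in effect size are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-in data: each row is one rated introduction; each of
# the 36 columns is 1 if that phrase appeared in the introduction, else 0.
n_obs, n_phrases = 5000, 36
X = rng.integers(0, 2, size=(n_obs, n_phrases)).astype(float)
# 0/1 interest coding, with phrase 1 given a built-in +5 point effect.
y = (rng.random(n_obs) < 0.40 + 0.05 * X[:, 0]).astype(float)

# OLS with an intercept: Rating = k0 + k1*x1 + ... + k36*x36
design = np.column_stack([np.ones(n_obs), X])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)

k0, k = coef[0], coef[1:]
# k0 approximates the constant (baseline share interested); each k[i]
# is the percentage-point change in that share when phrase i appears.
print(f"constant = {100 * k0:.0f}; phrase 1 effect = {100 * k[0]:+.1f} points")
```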

SAMPLING RESPONDENTS
The respondents included in the 2 iterations of this study were sampled from multiple Internet panels and were not randomly selected from a greater population (e.g., residents of the United States). Respondents were selected at random from within the panels, from all available panel members aged 18+ residing within the United States.

Notably, there is some evidence that Internet panelists may be unique with respect to respondent cooperation. The MRA Research Profession Image Study (2007) found substantial differences in motivations to participate in research between Internet panelists and average US residents. This is fairly logical, as panelists have unique experiences with research and have (perhaps) developed certain expectations toward participating in research, such as respondent incentives. In fact, there were substantial differences between the panel sample used in the Image Study (2007) and the RDD and intercept samples across the concept ideas tested in this study (particularly among the concepts of incentives, confidentiality, and time burden).

Respondents were invited to complete the survey experiment through a bland introduction, created to offer as little information about the project as possible. In this way, the experiment minimizes any possible bias that might result from including respondents who are inclined to participate because of the concepts offered in this project's own survey introduction. Since the project is based upon testing the effect of different messages/concepts on respondents' participation, all efforts were taken to ensure that participating respondents were not attracted by particular messaging.

Notably, this study is designed to explore respondent cooperation in Internet panels, and it tests various elements/phrases of survey introductions with numerous panels. Although elements are changed between study iterations, several are retained and included in all versions of the study for the sake of comparison. In this manner, the study aims to expand over time and to compare across different volunteer Internet panels.

VALIDATION OF RESULTS
The various phrases that respondents received were randomized and appeared in numerous unique combinations within the survey introductions. Respondents were thus unable to feign attentive participation by answering in rote patterns that might deceive the research team. Results are further validated through an individual-level analysis of the R-squared statistic, the mode of which is approximately 0.8 in both studies.
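The paper does not detail how the individual-level R-squared analysis was computed. One plausible reading, sketched below, is that the model is refit to each respondent's own 48 ratings and the distribution of fit statistics inspected, with a mode near 0.8 suggesting that most respondents answered consistently rather than at random:

```python
import numpy as np

def respondent_r_squared(X_i: np.ndarray, y_i: np.ndarray) -> float:
    """Fit the linear model to a single respondent's 48 rated
    introductions and return R^2 as a within-respondent consistency
    check (assumed procedure, not confirmed by the paper)."""
    design = np.column_stack([np.ones(len(y_i)), X_i])
    coef, *_ = np.linalg.lstsq(design, y_i, rcond=None)
    resid = y_i - design @ coef
    ss_tot = np.sum((y_i - y_i.mean()) ** 2)
    return 1.0 - np.sum(resid ** 2) / ss_tot
```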

An additional validation step was completed with the 2006 iteration of the study. The best performing and worst performing elements were combined and delivered in the context of survey introductions to an independent sample. The results generally supported the validity of the data, although there were some mixed results indicating the need for further study (Martin and Moskowitz 2006).

FINDINGS
The following tables list the results from the 2 iterations of the study conducted in the USA. The first column lists the results from the 2006 study, performed with combined samples from GMI, Lightspeed Research, and Luth Research (n = 1366). The second column lists the results from the 2007 study, performed with combined samples from Greenfield Online and Survey Sampling International, Inc. (n = 710).

The constant for both iterations is given in the 3rd row of the first table. The constant represents the theoretical level of interest in participating in the absence of any elements/messages. Notably, the constant for the first iteration of the study (2006) is 11 points lower than that of the second iteration (2007), indicating that respondents in the 1st iteration demonstrated a lower baseline likelihood of participating in surveys than those in the 2nd iteration.

(In the tables below, an "x" indicates that the phrase was not included in that iteration of the study.)

Phrase | Iteration 1 | Iteration 2
Base Size | 1366 | 710
Constant | 39 | 50

Introduction
Help us design a new product! | 1 | x
Answer our survey and be eligible to enter our prize drawing | 0 | x
Help us improve our product! | 1 | x
Tell us what you think and enter a prize drawing! | 2 | x
Win one of 5 grand prizes by answering a simple survey! | 5 | x
Your opinion matters! Tell us what you think! | 2 | x
Your Opinion Counts! | x | 2
You've been selected for a research study! | x | 1
Want to make a difference? | x | 1
We need your help! | x | 0
Let your voice be heard! | x | 0
Take part in our important study! | x | 0

Importance
In today's ever-changing world, manufacturers are constantly introducing new products | 0 | x
How many times have you seen ideas for new products come and go? | 0 | x
With so many products out there, it's hard to know which ones are right for you! | 1 | x
Be a part of the discovery by telling us what features are most important to you | 1 | x
Help us shape the future of this new product | 1 | x
Here's your chance to share your input for this new idea! | 0 | x
Your participation in this survey will help improve products and services | x | 2
Your participation in this survey will help make improvements to a government program | x | 0
The results from this survey may affect your life personally | x | 0
Participation in this survey is crucial, your opinion represents the opinions of many others | x | 0
You can make an impact by participating in this survey | x | 0
No one else can take your place in this survey | x | 0

Subject Matter
We need YOU, the consumer, to help us design a new totally HEALTHFUL and FUN snack! | 3 | x
An independent research firm needs to know what features YOU, the consumer look for in a laundry detergent product | 2 | x
Introducing a new Carbonated Coffee Beverage…designed to boost your energy | -4 | x
Introducing a new Credit Card that offers shopping rewards | -7 | x
We need YOU, the consumer, to help us design a new DVD PLAYER that will revolutionize the way you watch movies in the future! | 3 | x
We'd like you the consumer to help us design a new fragrance idea | -2 | x
We'd like to know your thoughts about gasoline prices | x | 5
We'd like to know your thoughts about dairy products | x | 3
We'd like to know your opinion about prescription medications | x | 2
We'd like to know your opinion about the Internet | x | 2
We'd like to ask you questions about home improvement | x | -1
We'd like to ask you questions about space tourism | x | -4

Flexibility
Simply click on the link below (if your email does not support hotlinks, cut and paste the link into your browser) and complete the short, easy-to-answer survey | -1 | x
Just click on the link below to get started | 1 | x
It's easy to participate, simply click on the link below to proceed | 1 | x
To participate in the survey, just click on this link or cut and paste it into your browser | 0 | x
By a simple click of your mouse you can instantly participate in our study and share your opinions | 2 | x
Participating is as easy as 1, 2, 3…you're just one click away | 3 | x

Burden
Depending on your connection speed, the survey should take 10 minutes to complete | 3 | x
Requires only 15 minutes of your time | 1 | x
Complete the survey in 20 minutes | -1 | x
You can stop at any point during the survey and return to complete it at a later time | 4 | x
Participate at any location with Internet connection anytime of the day | -1 | x
Shorter than the typical survey! Takes only 10 minutes of your time | 7 | 4
This survey is super-short! (we promise) | x | 5
We only need to ask you about 10 questions | x | 4
This survey will only take a few moments of your time | x | 3
Your time is very important to us, this survey should only take a moment to complete | x | 2
We only need to ask you about 25 questions | x | 1

Incentive/Benefit
Participate and win one of 3 cash prizes | 11 | 18
Participate and you'll automatically be entered in a drawing | 2 | x
By participating in this survey you'll have a chance to win a trip for two to the Caribbean | 6 | x
Participate and have a chance to win the latest I-pod | 3 | x
A chance to win a palm pilot | 1 | x
Earn frequent flyer miles on the credit card of your choice | -4 | -12
Participate and earn a free gift card from Home Depot! | x | 22
Participate and $5 will be credited to the credit card of your choice | x | 17
Get a free copy of the results after you participate! | x | -5
The results of the survey will be published in the New York Times! | x | -6

Security/Privacy
Your answers will remain completely confidential | x | 3
We take your privacy seriously. We will not ask you to buy anything, nor share your personal information with any outside parties | x | 2
Your answers are secure, any information sent over this website cannot be seen by outside parties | x | 1
As member of the Better Business Bureau we take your privacy seriously and will respect your confidentiality | x | 1
Your privacy is important to us, your answers will be combined with others, and will never be linked with you personally | x | 0
We safeguard your confidentiality, click below to read our privacy policy | x | -1

DISCUSSION AND CONCLUSION
Overall, the categories of information that had the most impact on individuals' decisions to participate were subject matter, incentive, and time/burden of participation. This finding holds in both the 2006 and 2007 iterations of the study.

The results from the ratings index offer multiple levels of insight into the effects of different messages and concepts on respondent cooperation. First, when applied to the 'average' panelist, the performance of the concepts illustrates the overall importance of several categories of information (i.e., incentive, subject matter, and burden) and the relative unimportance of others (e.g., importance of participation, flexibility). Of course, with enough information about the individual panelist, phrases/elements could be selectively chosen to appeal to the individual's interest in participation.

In such cases, it is likely that the concept categories not rated as important by the general sample would have a greater influence on specific groups. For example, a researcher could emphasize those particular aspects of the survey invitation that assure privacy to a respondent who is sensitive to data confidentiality issues. In this manner, each sample could be segmented and offered a survey introduction tailored to the concepts that are most important to it. The broad concept categories would be emphasized, deemphasized, or excluded in the survey introduction to elicit the greatest participation possible.

Second, the study offers insight into specific messages within broader concept categories. For example, an additional 9% of respondents are interested in participating in research when the subject matter is gasoline prices rather than space tourism. This is likely due to the relative salience of gasoline prices (highly salient) versus space tourism (not very salient).
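Using the Iteration 2 coefficients from the findings table (gasoline prices +5, space tourism -4), the 9-point gap follows directly:

$$\underbrace{(k_0 + 5)}_{\text{gasoline prices}} - \underbrace{(k_0 - 4)}_{\text{space tourism}} = 9\ \text{percentage points}$$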

Moreover, the subject matter "space tourism" not only fails to promote interest from respondents, it actually detracts from the percentage of respondents who are interested in participating. Thus, a researcher interested in actually performing a study regarding space tourism might attempt to deemphasize the subject matter or communicate it in a broader sense (e.g., a 'study about tourism'). Messages may be carefully crafted based upon the effect that specific wording has on respondent cooperation.

CROSS-STUDY COMPARISON
This particular line of research aims to expand over time, continually adding and testing new sets of messages. We deliberately included several messages from the 2006 iteration in the 2007 iteration in order to help explore whether messages found to have a particular effect in one panel have a similar effect in others.

We selected 3 of the messages that demonstrated an effect on the likelihood of participation in the 2006 combined sample in order to examine whether a similar effect would occur in the 2007 combined sample. The combined samples from both iterations tended to react similarly toward the messages:

Phrase | Iteration 1 | Iteration 2
Base Size | 1366 | 710
Constant | 39 | 50

Burden Concept Category
Shorter than the typical survey! Takes only 10 minutes of your time | 7 | 4

Incentive Concept Category
Participate and win one of 3 cash prizes | 11 | 18
Earn frequent flyer miles on the credit card of your choice | -4 | -12

The two messages that displayed substantial positive effects on participation both followed similar trends. First, the message from the "Burden Concept Category" indicating the survey would take only 10 minutes to complete displayed an above-average improvement in interest in participation in both study iterations. In the first iteration of the study (2006), this message was the most powerful phrase in the "burden" category in terms of boosting interest in participation. In the second iteration of the study (2007), this phrase was the second most powerful, following a phrase that promised the survey was 'super-short.'

Similarly, within the "Incentive Concept Category," the best performing message was one that promised the respondent "one of 3 cash prizes" for participating. Again, this was the second best performing message in the second iteration of the study, this time following the incentive of a gift card from Home Depot. In both cases, the best performers from the 2006 study repeated with substantial effects in the 2007 study.

The worst performing message from the "Incentive Concept Category" was an offer for frequent flyer miles to be added to the respondent's credit card. This incentive was included in the 2007 study and produced a similar reaction from respondents, who again rated this message the least desirable of the incentive offers.

The results from these 3 phrase comparisons do not, by themselves, demonstrate that all (or most) of the messages used with one of the combined samples would create a similar effect with the other. They do, however, offer support for the idea that individuals participating in panels may be similar in their motivations for participation.

ADDITIONAL ANALYSIS/FUTURE RESEARCH
Additional iterations of the study will expand understanding of the sensitivity of respondents to the different survey introduction concepts and specific messages. Basic patterns are already emerging from the first 2 iterations of the study. Now that these patterns are uncovered, they can be studied.

For example, in the second iteration of the study, one aspect of messaging that may cause some difference in participation is the manner in which the actual appeal is worded. Phrases that began with a comparatively strong request (e.g., 'we'd like to know…') performed slightly better in gaining cooperation than those with a weaker appeal (e.g., 'we'd like to ask…'). Is this a result of the strength of the wording, or simply of the subject matter attached to the request? This question will be directly isolated and tested in a subsequent version of this study.

Additionally, numerous other concept categories will eventually be tested. The MRA Research Profession Image Study (2007) confirms the results indicated by this study: subject matter, incentives, and survey (time) burden are important factors to respondents. But there are also other concepts that enter into the survey participation decision. One such area is the effect that including the name of the sponsoring organization has on participation.

Ultimately, this study offers the greatest promise in giving organizations an efficient means of tailoring survey introductions, whether from the standpoint of the panel provider, the research provider, or another entity. Conventional survey introductions have generally been crafted with a 'one-size-fits-all' mentality. Survey introductions that are tailored to the respondent may come to offer a new source of efficiency in survey research.

For instance, panel organizations may seek to replicate this design to understand the motivations of their members. In-depth segmentation and analysis of important characteristics of these panelists can further help to refine survey introduction messaging.

Research suppliers, on the other hand, can focus on the type of research they perform. An organization that seeks to boost cooperation rates in healthcare research may replicate this study, focusing entirely on healthcare subject-matter messaging, incentives that are practical for the organization to offer, and the effects that burden, flexibility, and privacy messages have on the participatory behaviors of its 'typical' respondents.

References
Babbie, Earl. 2002. The Basics of Social Research. 2nd ed. Belmont, CA: Wadsworth.

Glaser, Patrick. 2007. "The Research Profession Image Study." Published by CMOR.

Groves, Robert M. 2006. "Nonresponse Rates and Nonresponse Bias in Household Surveys." Public Opinion Quarterly 70: 646-675.

Martin, Birgi, and Howard Moskowitz. 2006. "Optimizing the Language of E-Mail Survey Invitations." http://www.mji-designlab.com/fileadmin/user_upload/articole/General/Optimizing_the_Language_of_e-mail_Survey_Invitations.pdf

Sangster, R. L. 2003. "Do Current Methods Used To Improve Response To Telephone Surveys Reduce Nonresponse Bias?" Published online at the Bureau of Labor Statistics website: http://www.bls.gov/ore/pdf/st030290.pdf

The American Association for Public Opinion Research. 2006. Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys. 4th edition. Lenexa, Kansas: AAPOR.