Lana Rucks

Principal Consultant, The Rucks Group

Lana Rucks, Ph.D., is Principal Consultant of The Rucks Group, a research and evaluation firm that gathers, analyzes, and interprets data to enable its clients to measure the impact of their work.


Blog: A Rose Isn’t as Sweet by Any Other Name: Lessons on Subject Lines for Web Surveys

Posted on February 25, 2015 by Lana Rucks in Blog


This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Survey developers typically spend a great deal of time on the content of questionnaires. We struggle with which items to include, how to word each question, and whether an item should be closed-ended or open-ended; the list of considerations goes on. After all that effort, we generally spend far less time on a small element that is incredibly important to web surveys: the subject line.

I have come to appreciate the extent to which the subject line acts as a “frame” for a survey. In simple terms, a frame is how a concept is categorized. Framing is the difference between calling an unwanted situation a challenge versus a problem, and a substantial body of literature suggests that the nature of a frame will produce particular types of behaviors. For instance, my firm recently disseminated a questionnaire to gain feedback on the services that EvaluATE provides. As shown in the chart below, we initially received about 100 responses. For that questionnaire invitation, we used the subject line EvaluATE Services Survey. Based on past experience, we would have expected the next dissemination to garner about 50 responses, but we got closer to 90. So what happened? We had started playing with the subject line.

[Chart 1: Number of responses received across the survey disseminations]

EvaluATE’s Director, Lori Wingate, sent out a reminder email with the subject line, What do you think of EvaluATE? When we sent out the actual questionnaire, we used the subject line, Tell us what you think. For the next two iterations of dissemination, we had slightly higher than expected response rates.

For the third dissemination, Lori conducted an experiment. She sent out reminder notices but manipulated the subject lines. There were seven different subject lines in total, each sent to about 100 different individuals. The actual questionnaire disseminated had a constant subject line of Would you share your thoughts today? As you see below, the greatest response rate occurred when the subject line of the reminder was How is EvaluATE doing?, while the lowest response rate was when Just a few days was used.

[Chart 2: Response rates for each of the seven reminder subject lines]
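For readers who want to run a similar experiment, the assignment step Lori used can be sketched in a few lines: shuffle the recipient list and deal it round-robin into one group per subject line so each condition gets a roughly equal share. The code below is a minimal illustration, not the actual study procedure; the recipient addresses and two of the subject lines are placeholders, and only "How is EvaluATE doing?" and "Just a few days" come from the post.

```python
import random

# Subject-line conditions. Only the first two are from the blog post;
# the rest are hypothetical stand-ins for the remaining conditions.
subject_lines = [
    "How is EvaluATE doing?",
    "Just a few days",
    "Subject line C",  # placeholder
    "Subject line D",  # placeholder
]

# Hypothetical recipient list (the real study used ~100 people per line).
recipients = [f"person{i}@example.org" for i in range(400)]

random.seed(0)           # fixed seed so the sketch is reproducible
random.shuffle(recipients)

# Deal shuffled recipients round-robin into one group per subject line,
# giving each condition an equal share (100 each here).
groups = {
    line: recipients[i::len(subject_lines)]
    for i, line in enumerate(subject_lines)
}

for line, members in groups.items():
    print(f"{line}: {len(members)} recipients")
```

Because the shuffle happens before the split, each group is a simple random sample of the list, which is what lets differences in response rates be attributed to the subject line rather than to who received it.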

These results aren’t completely surprising. In the 2012 presidential election, the Obama campaign devoted considerable effort to identifying email subject lines that produced the highest response rates and found that those presenting a “gap in information” were the most effective. By that logic, one might ask why the subject line Just a few days garnered the lowest response rate, since it also presents a gap in information. The reason is unclear. One possibility is that the incongruity between the urgency implied by the subject line and the actual importance of the email’s topic made respondents feel tricked, so they opted not to complete the survey.

Taken together, these findings suggest that a “rose by any other name would not smell as sweet”: what something is called does make a difference. So when you are designing your next web survey, make sure crafting the subject line is part of the design process.

Webinar: Reducing the Outcomes Angst: A Step-by-Step Approach to Identify What to Measure

Posted on May 16, 2012 in Webinars

Presenter(s): Jason Burkhardt, Lalitha Locker, Lana Rucks
Date(s): March 21, 2012
Recording: https://vimeo.com/38991661

Deciding what to measure (and what not to measure) when gathering evidence of impact can be a daunting task, but it doesn’t need to be. In this webinar, Lana Rucks, an ATE external evaluator, provides a step-by-step approach for deciding what should be measured as an indication of impact. Using an actual ATE project as a framework, attendees will learn how the various aspects of evaluation (e.g., logic modeling, operationalizing variables, and triangulation) come together in the real world. Whether you are in the planning phase or have already started implementing your project, you’ll walk away knowing how to better communicate the story of your project’s or center’s impact.

Resources:
Slide PDF
Handout PDF
Question and Answer Transcript