
Ed Eval Week: Shelly Engelman and Brandi Campbell on The Advantages of Using Retrospective Surveys

Hi! This is Shelly Engelman and Brandi Campbell with The Findings Group, LLC, a private evaluation firm in Atlanta, GA.

Evaluators typically implement pre/post surveys to assess programmatic impact on participants.  However, pre/post surveys are plagued by challenges:

1. Participants have difficulty responding to the “pre” survey items because they have little knowledge of the program content and choose to leave many items blank.

2. Participants feel overburdened by the “post” survey because they answered similar items on the “pre” survey, and so they do not fill out the “post” survey.

3. Participants are absent for either the “pre” or the “post” survey, resulting in incomplete data sets for those individuals.

4. Participants gain insight into the program content and see it differently than they did at the beginning. Known as the Response Shift Bias, this effect means participants may overestimate their initial attitudes because of their limited knowledge at baseline; after the program, their deeper understanding shapes their responses on the “post” survey.

Lesson Learned: Retrospective Results – Complete and Stable

Retrospective surveys ask participants to compare their attitudes before the program with their attitudes after it.  Because a participant completes a retrospective survey in one sitting, responses are more complete.  Not only does this method yield a higher completion rate, but it has also been found to reduce the Response Shift Bias in participants.

Lesson Learned: The Utility of Retrospective Results

In several of our projects, the retrospective survey had advantages over the pre/post survey: it yielded more complete datasets and higher response rates.  On the other hand, because students complete the survey after the program, they may not accurately remember their attitudes before the program.  This is especially problematic if the program occurs over several months.  Additionally, younger participants may have trouble navigating the retrospective survey format and may require additional assistance.

Contribute to the Practice of Retrospective Surveying

We appreciate that the evaluation community has more to learn about appropriate uses for retrospective surveys. To more fully understand the differences between true pre/post and retrospective pre/post approaches, The Findings Group is conducting traditional “pre” surveys followed by retrospective pre/post surveys on a handful of programs.  We expect to measure the differences, if any, between the two “pre” response sets.  We invite you to do the same and share your results.  We could put together a panel presentation at AEA 2014!
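If you would like to quantify this comparison yourself, here is a minimal sketch of one possible analysis. It assumes the paired responses have been exported to a CSV with hypothetical columns traditional_pre and retro_pre (one row per participant, on the same Likert scale); the file and column names are illustrative, not The Findings Group's actual analysis plan.

```python
# Sketch: comparing traditional "pre" responses with retrospective "pre" responses
# for the same participants. File name and column names are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("pre_comparison.csv")  # hypothetical export, one row per participant

# Keep only participants who answered both versions of the item.
paired = df.dropna(subset=["traditional_pre", "retro_pre"])

# Paired t-test: do retrospective "pre" ratings differ from the traditional ones?
t_stat, t_p = stats.ttest_rel(paired["traditional_pre"], paired["retro_pre"])

# A Wilcoxon signed-rank test is often a safer choice for ordinal Likert data.
w_stat, w_p = stats.wilcoxon(paired["traditional_pre"], paired["retro_pre"])

mean_shift = (paired["retro_pre"] - paired["traditional_pre"]).mean()
print(f"n = {len(paired)}, mean shift (retro - traditional) = {mean_shift:.2f}")
print(f"Paired t-test: t = {t_stat:.2f}, p = {t_p:.3f}")
print(f"Wilcoxon signed-rank: W = {w_stat:.1f}, p = {w_p:.3f}")
```

If the Response Shift Bias described above is at work, one would expect the retrospective “pre” ratings to come out lower than the traditional “pre” ratings on knowledge- or skill-related items.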

Hot Tips: Implementing a Retrospective Survey

It is simple to rewrite pre/post survey items for a retrospective survey, as the example below shows.

Pre/post survey: I am confident in my ability to solve computer science problems.

Retrospective pre-survey: Before this workshop, I was confident in my ability to solve computer science problems.

Retrospective post-survey: After this workshop, I am confident in my ability to solve computer science problems.
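If you administer items electronically and want parallel wording across many items, a small helper like this sketch can generate the matched retrospective pair from a pre/post stem. The function name, event label, and stems are illustrative, not a prescribed format.

```python
# Sketch: generating a matched retrospective pre/post item pair from survey stems.
# Function name, event label, and stems are illustrative.

def retrospective_items(stem_past: str, stem_present: str, event: str = "this workshop"):
    """Return the (retrospective pre-item, retrospective post-item) for one stem."""
    pre_item = f"Before {event}, {stem_past}"
    post_item = f"After {event}, {stem_present}"
    return pre_item, post_item

pre, post = retrospective_items(
    stem_past="I was confident in my ability to solve computer science problems.",
    stem_present="I am confident in my ability to solve computer science problems.",
)
print(pre)   # Before this workshop, I was confident in my ability to solve ...
print(post)  # After this workshop, I am confident in my ability to solve ...
```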


4 thoughts on “Ed Eval Week: Shelly Engelman and Brandi Campbell on The Advantages of Using Retrospective Surveys”

  1. Thank you for the article, Shelly and Brandi. It was very insightful, as I have been researching the optimal types of surveys to conduct for a youth cooking program. In my experience, a post-program survey is what is usually used in youth programs, since memory ability in children varies greatly.

    I originally thought a post survey would be sufficient, but upon reading your article I have changed my mind and will be conducting a retrospective survey, as I am looking for differences in attitude and skill before and after completion of the program. I appreciated your tip on how to reword questions from a pre/post survey to a retrospective survey.

    I am curious whether you think there should be a deadline for conducting the retrospective post-survey. I am thinking of sending out the survey 5 years after completion; do you think that may be waiting too long to gather accurate results? Any tips on how I can ensure accuracy of recall from participants, perhaps through the sequence/order of the questions or their wording?

    Thanks again.

  2. Hi, would you say that retrospective surveys should focus only on attitudes, or could they also focus on behaviors? For example, “Before the training, I read to my child.”

  3. This is very timely, as I have been wondering whether to continue, and how to improve, retrospective questions in a survey for youth participants in an out-of-school-time program.

    Do you think we should consider something like an ‘expiration date’ or ‘half-life’ for people’s memories when writing retrospective questions? If so, do you have any tips for clearly phrasing the instructions?

    For example, one of our questions was “Think back to what you were like before you started this program. … ‘Before I started this program, most of the time I enjoyed math’ … agree a lot / agree a little / etc”

    My concern is that some youth might have joined the program a month ago, while some might have attended the same program for many years. Does that undermine the findings? Should the instructions be more specific, for example “Think back to what you were like at this time last year”?

    Thanks for your thoughts!

  4. Davis Patterson

    Shelly and Brandi, good summary. I have had some experience using both types of questions for an evaluation and would be interested in participating if you decide to set up a panel for AEA 2014.

    Davis
