Hi! This is Shelly Engelman and Brandi Campbell with The Findings Group, LLC, a private evaluation firm in Atlanta, GA.
Evaluators typically implement pre/post surveys to assess programmatic impact on participants. However, pre/post surveys are plagued by challenges:
1. Participants have difficulty responding to the “pre” survey items because they have little knowledge of the program content and choose to leave many items blank.
2. Participants feel overburdened with the “post” survey because they answered similar items on the “pre” survey, and so they do not fill out the “post” survey.
3. Participants who miss either the “pre” or the “post” survey leave an incomplete data set for that individual.
4. Participants gain insight into program content and see it differently than at the beginning. This phenomenon, known as Response Shift Bias, means participants may overestimate their initial attitudes due to lack of knowledge at baseline; after the program, their deeper understanding affects their responses on the “post” survey.
Lesson Learned: Retrospective Results – Complete and Stable
Retrospective surveys ask participants to compare their attitudes before the program to their attitudes after it. Because a participant completes a retrospective survey in one sitting, responses are more complete. Not only does this method yield higher completion rates, it has also been found to reduce Response Shift Bias in participants.
Lesson Learned: The Utility of Retrospective Results
In several of our projects, the retrospective survey had advantages over the pre/post survey, yielding more complete datasets and higher response rates. On the other hand, because students complete the survey after the program, they may not accurately remember their attitudes before it. This risk grows when the program spans several months. Additionally, younger participants may have trouble navigating the retrospective survey format and may require additional assistance.
Contribute to the Practice of Retrospective Surveying
We appreciate that the evaluation community has more to learn about appropriate uses for retrospective surveys. To more fully understand the differences in true pre/post vs. retrospective pre/post approaches, The Findings Group is conducting pre surveys followed by retrospective pre/post surveys on a handful of programs. We expect to measure the differences, if any, between the two “pre” response sets. We invite you to do the same and share your results. We could put together a panel presentation at AEA 2014!
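For readers who want to try this comparison themselves, here is a minimal sketch of how the two “pre” response sets might be examined. The data and the 5-point Likert scale are hypothetical, purely for illustration; a real analysis would use your own matched survey data and an appropriate significance test.

```python
from statistics import mean, stdev

# Hypothetical 5-point Likert responses from the same participants:
# the traditional "pre" collected before the program, and the
# retrospective "pre" collected after it.
traditional_pre = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]
retrospective_pre = [2, 3, 2, 4, 3, 3, 2, 2, 3, 3]

# Paired differences: positive values suggest participants rated
# themselves higher at baseline than in hindsight, consistent with
# Response Shift Bias.
diffs = [t - r for t, r in zip(traditional_pre, retrospective_pre)]

print(f"Mean shift: {mean(diffs):.2f} (SD {stdev(diffs):.2f})")
```

Because the two “pre” sets come from the same participants, a paired comparison like this (or a paired t-test / Wilcoxon signed-rank test, for those with access to a statistics package) is the natural way to quantify any shift.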
Hot Tips: Implementing a Retrospective Survey
It is simple to rewrite pre/post survey items for a retrospective survey.
Pre/post survey: I am confident in my ability to solve computer science problems.
Retrospective pre-survey: Before this workshop, I was confident in my ability to solve computer science problems.
Retrospective post-survey: After this workshop, I am confident in my ability to solve computer science problems.
The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PK12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.