We are Christopher Hall, Brianna Hooks Singletary, Charity Odetola, and Sara Stephenson, doctoral and master's students in the UNC Greensboro Educational Research Methodology department specializing in Program Evaluation. We are members of the STEM Program Evaluation Lab (SPEL) under the tutelage of Drs. Ayesha Boyce, Aileen Reid, and Tiffany Tovey.

Reflective evaluation practice involves intentional engagement with and consideration of experiences and insights (Smith & Skolits, in press). Reflective practice can be focused internally or externally; when practitioners are attuned to both, there are opportunities to deepen the value of an evaluation by paying attention to how people learn, process information, and develop perspectives.
Active engagement in an individual reflective process can encourage increased self-awareness for evaluators. By intentionally analyzing thoughts and emotions during an evaluation, evaluators have the opportunity to improve, augment, or refocus their work in ways that benefit both those who are the focus of the evaluation and the evaluation itself. Utilizing reflective practices while planning and conducting an evaluation can be similarly advantageous. Planning sessions with stakeholders, focus groups, or individual interviews conducted in a reflective stance can inform evaluation plans and generate data that will often be richer and more meaningful than data collected from a traditional stance.
After gathering data, evaluators should reflect upon it, considering themes that might best be operationalized for the empowerment of marginalized stakeholders, the improvement of program processes, and/or the development of opportunities to engage in challenging but important discussions. This reflective practice should be – to the greatest extent possible – conducted in tandem with evaluation stakeholders and participants. Intentionality can assist in incorporating reflective practice throughout the evaluation process. Structures can be developed as a natural process by leveraging human social patterns that favor talking about experiences and sharing opinions. Using reflective practice models can offer specific strategies for categorizing a series of questions, driving discussion and brainstorming sessions, and ultimately increasing engagement and interest in the evaluation itself (Bolton & Delderfield, 2018).
Self-Reflection: Make a habit of taking time after interviews, focus groups, data collection, or analysis to think through what went well, what didn’t, and what you felt and experienced as you processed the data (either alone or with other team members!). Think about what this means to you. Write down summaries of this process and refer to them regularly to notice patterns in your own thoughts and behaviors. Partnering with another trusted person – evaluator or not – to process your reflections can make this especially useful.
SWOT – Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis can be a tool for evaluators to partner with stakeholders to reflect on a group or organization’s ever-changing external and internal environment. This tool can shed light on potentially obscured effects, provide space for reflection on positive and harmful influences to the program’s context, and present recommendations that are based on collaborative reflection.
DATA Model – The DATA (Describing, Analyzing, Theorizing, and Acting) Model is a framework for individual or group reflection. Evaluators can use the model to modify how they conduct evaluation practice after thinking through their current practice. Use of the model can build greater rapport by asking participants to establish a foundation for their perspectives by describing the construct under consideration. Evaluators can then use those responses to ask more probing questions about individuals’ analysis of their work within the program.
- Bolton, G., & Delderfield, R. (2018). Reflective practice: Writing and professional development. SAGE Publications.
- Smith, T. L., & Skolits, G. J. (in press). Conceptualizing and engaging in reflective practice: Experienced evaluators’ perspectives. American Journal of Evaluation.
- Smith, T. L., Barlow, P. B., Skolits, G. J., & Peters, J. M. (2015). Demystifying reflective practice: Using the DATA model to enhance evaluators’ professional activities. Evaluation and Program Planning, 52, 142–147.
The American Evaluation Association is hosting STEM Program Evaluation Lab Week. All contributions this week to aea365 come from evaluators working in this lab. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.