My name is Susan Wolfe and I am the owner of Susan Wolfe and Associates, LLC, a consulting firm that applies Community Psychology principles to strengthening organizations and communities.
I serve as the local evaluator for four programs that are funded by Federal grants for a five-year cycle. Each program was required to set objectives for each of the five years of funding for a pre-set group of performance indicators. At the end of each year, each program reports its performance objectives and its actual performance.
This year I facilitated staff retreats for two of the programs. I compared the objectives and actual performance for each indicator. I tagged the indicators where the program fell short of the objective by a wide margin as “red light,” the indicators where the program just missed the objective, or will miss it next year, as “yellow light,” and the ones where the program met the objective and will do so next year as “green light.” During the staff retreat I reviewed the program’s logic model with staff to show how their activities connected with each indicator, and then we went through the red and yellow light indicators and discussed the challenges associated with meeting the objectives. We then celebrated the successes with the green light indicators — which were the most numerous.
For the last two hours of the retreat, I facilitated staff discussions in which staff developed specific strategies, complete with timelines and clear responsibilities, to improve performance on the red and yellow light indicators. Staff at both programs were fully engaged and participated throughout the process, and the end result was a clear, realistic plan that included accountability.
Lessons Learned: Program staff are interested in learning about the bigger picture of their program and how their activities relate to performance reporting. If they are empowered with information and an opportunity to develop strategies, they will engage and build their capacity to make their program successful.
Hot Tip: Review logic models with all program staff to show them how their role fits into the larger picture. It helps them to become invested in maintaining accurate data records and in the evaluation process.
Lessons Learned: Staff may need more information to determine which direction to take to improve their performance. Evaluators need to be ready to facilitate that process by gathering or analyzing more data and allowing staff input into the evaluation process.
Rad Resource: For more information about Community Psychology, its principles and values, and how they guide our work, see the website for the Society for Community Research and Action.
This is a bonus post from the week sponsored by the American Evaluation Association Community Psychology Topical Interest Group. The contributions from December 9-14 all came from CP TIG members — be sure to return to aea365 and take a look! Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org.