Dawn X. Henderson on Modeling evaluation at the undergraduate level

I am Dawn X. Henderson, a past fellow of AEA’s Graduate Education Diversity Initiative (GEDI) and a member of the Annie E. Casey Foundation’s Expanding the Bench Initiative. I recently developed an undergraduate seminar course in Community Psychology at a Minority Serving Institution. Program evaluation is a core competency in Community Psychology, and modeling evaluation was critical to passing my evaluation “wisdom” on to a group of “underrepresented” students through a partnership with a nonprofit. I aim to share some hot tips and lessons learned with those interested in teaching and working in evaluation.

Hot Tips:

  • Practice logic models. In preparation for the evaluation report, the class met with the Executive Director to learn about the nonprofit, focusing on its programming and key activities. Building logic models allowed students to become familiar with the services the nonprofit provides and to draw visual connections between inputs, activities, outputs, and outcomes.
  • Recognize the individual strengths and knowledge of your students/team. Students worked in pairs to conduct the quantitative and qualitative analyses; each pair included one student familiar with the methodology and one who was less experienced with it. The less experienced students gained new data analysis skills, and each pair collaboratively compiled its findings into text and graphs.
  • Divide the report into sections and assign main duties and responsibilities. Each section of the evaluation report had a student leader responsible for collecting information, doing the majority of the writing, and maintaining communication with the other students and with faculty. Each student also reviewed and summarized an article related to the nonprofit’s programs and services; these summaries were integrated into the discussion or recommendation section of the report.

Lessons Learned:

  • Maintain lines of communication on progress with the nonprofit. Staying in contact with the nonprofit about status, challenges, and its needs generates feedback and recommendations that improve the report’s content. It also helps undergraduate students understand why the nonprofit must be integrated throughout the process to ensure the evaluation report accurately represents its program.
  • Develop timelines for important milestones/benchmarks. The majority of the evaluation report came together at the end of the academic semester, which made the process stressful for the students and for me. Building in benchmarks for each section of the report would have created more opportunities for feedback and editing; instead, I had to go through the entire report the night before the draft was due to the nonprofit.

The students approached the preparation of the evaluation report with limited knowledge of evaluation but some familiarity with traditional research in psychology. In the end, they discovered ways to translate research processes into evaluation practice, and the nonprofit received useful information to support its programming and funding efforts.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
