Hi, my name is Rex Green. I am the Director of the Quality Transformation Team. For the past 16 years I have been performing outcome evaluations, mostly of youth after-school programs, in the San Francisco Bay Area. Community Crime Prevention Associates contracts with me to evaluate community-based organizations receiving funding from local governments. Together we have analyzed over 700,000 surveys of youth, parents, and staff members of the agencies. I am a founding member of AEA but only recently rejoined the organization to resuscitate interest in performing outcome evaluations.
I am well aware of the challenges of putting all the right pieces together to obtain relevant and timely information about how well a program of services is performing, having conducted outcome evaluations since 1978. Never mind the cost! In 2003 I published my first article on a different approach to conducting outcome evaluations.
Briefly, this approach requires only one round of data collection, employing specially designed questions to elicit how recipients of services changed as a result of the services. These data should be collected shortly before the program terminates. The cost and effort commitment for an agency is minimal, since the types of questions and the formatting of the questionnaires are predetermined (though not necessarily the content). The reports present results as percentages, both for the frequency of responses to individual questions and for scores that combine answers to multiple questions.
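To make the reporting idea concrete, here is a minimal sketch of how survey answers might be summarized as percentages, both per-question response frequencies and a combined multi-question score. This is an illustrative example only, not the actual EZEVAL program; the question names and the 1-4 answer scale are assumptions.

```python
# Hypothetical sketch of percentage-based survey reporting.
# Assumed: each question is answered on a 1-4 agreement scale.
from collections import Counter

# Example responses (invented for illustration).
responses = {
    "q1_learned_new_skills": [4, 3, 4, 2, 4, 3],
    "q2_feel_more_confident": [3, 3, 4, 4, 2, 4],
}

def frequency_percentages(answers):
    """Percent of respondents choosing each answer option."""
    counts = Counter(answers)
    total = len(answers)
    return {option: 100.0 * n / total for option, n in sorted(counts.items())}

def combined_score_percentage(question_answers, max_value=4):
    """Combine answers across questions into a single 0-100 score."""
    all_answers = [a for answers in question_answers.values() for a in answers]
    return 100.0 * sum(all_answers) / (len(all_answers) * max_value)

for question, answers in responses.items():
    print(question, frequency_percentages(answers))
print("combined score:", round(combined_score_percentage(responses), 1))
```

Reporting everything on a common percentage scale is what lets agency staff read the results at a glance without statistical training.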
- You can obtain a free trial version of my EZEVAL program by emailing me at email@example.com.
- This approach to evaluating human service outcomes is inexpensive, both in terms of purchasing the system and collecting the data.
- If something is very difficult to do, people will avoid doing it, e.g., outcome evaluations.
- Most service agency directors worry that real outcome information will make them look bad, while most government employees are looking only for what is going wrong. That's a toxic brew, and only retraining can simmer it away!
- Real innovation occurs when you think outside your familiar box, e.g., research-designed outcome evaluations.
Green, R. S. (2003). Assessing the productivity of human service programs. Evaluation and Program Planning, 26(1), 21-27.
Green, R. S., Ellis, P. T., & Lee, S. S. (2005). A city initiative to improve the quality of life for urban youth: How evaluation contributed to effective social programming. Evaluation and Program Planning, 28(1), 83-94.
Green, R. S. (2005). Assessment of service productivity in applied settings: Comparisons with pre- and post-status assessments of client outcome. Evaluation and Program Planning, 28(2), 139-150.
Green, R. S. (2005). Closing the gap in evaluation technology for outcomes monitoring. Psychiatric Services, 56(5), 611-612.
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to firstname.lastname@example.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.