I am Liliana Rodríguez-Campos, Co-Chair of the Collaborative, Participatory, and Empowerment Evaluation Topical Interest Group (TIG) of the American Evaluation Association. I am also an evaluation professor and the director of the Center for Research, Evaluation, Assessment and Measurement at the University of South Florida. Among other achievements, I received the American Evaluation Association’s Marcia Guttentag Award. Throughout my evaluation career I have produced publications and presentations emphasizing collaborative evaluation, including my book Collaborative Evaluations: A Step-by-Step Model for the Evaluator (available in English and Spanish). I have offered training in both English and Spanish to a variety of audiences in the US and internationally. I would like to share some tools and tips based on my experience.
Hot Tip: When conducting collaborative evaluations, the use of a variety of elements and strategies may enhance stakeholder involvement. For example, throughout my collaborative evaluations I follow six elements, each with suggested step-by-step guidelines: (1) identify the situation, (2) clarify the expectations, (3) establish a collective commitment, (4) ensure open communication, (5) encourage effective practices, and (6) follow specific guidelines. It is possible to gain new insights by using each element individually; however, to accomplish a comprehensive collaborative evaluation, I recommend the interactive use of these elements (for more information, please visit http://www.collaborative-evaluations.com/mce.html).
Rad Resource – Collaborative, Participatory, and Empowerment Evaluation TIG Webpage: The TIG webpage presents publications on collaborative, participatory, and empowerment evaluation, as well as blogs and links to websites containing useful information.
Rad Resource – Center for Research, Evaluation, Assessment and Measurement (CREAM): CREAM is a non-profit agency that achieves its mission through collaborative work on a variety of evaluation projects.
Rad Resource – Evaluation, Assessment, & Policy Connections (EvAP): EvAP is an evaluation unit that conducts evaluations and provides training and technical assistance in evaluation, assessment, and strategic planning to educational, community and service organizations across the United States.
Rad Resources – Latest Publications:
Rodríguez-Campos, L., & Rincones-Gómez, R. (in press). Collaborative evaluations: A step-by-step model for the evaluator (2nd ed.). Stanford, CA: Stanford University Press.
Rodríguez-Campos, L. (2012). Stakeholder involvement in evaluation: Three decades of the American Journal of Evaluation. Journal of MultiDisciplinary Evaluation, 8(17), 57-79.
Rodríguez-Campos, L., Berson, M., Walker-Egea, C., Bellara, A., & Owens, C. (2012). Evaluation of a civic education project: An application of the model for collaborative evaluations. International Journal of Interdisciplinary Social Sciences, 4, 5-27.
Rodríguez-Campos, L., Bradshaw, W., & Lake, J. (2012). Addressing the services provided to children with disabilities through collaborative evaluation. The Global Studies Journal, 6, 7-23.
The American Evaluation Association is celebrating CPE week with our colleagues in the Collaborative, Participatory, and Empowerment TIG. The contributions to aea365 all this week come from our CPE TIG colleagues. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice.
Hello there, Professor Liliana Rodríguez-Campos,
I am a participant in the Graduate Diploma for Professional Inquiry program at Queen’s University in Kingston, Ontario, Canada, en route to my Professional Master of Education with a Classroom Focus. As part of our Program Evaluation course, we were asked to find an article on AEA that interests us and respond to the author.
Last week in our course we were asked to read and discuss an article by Shulha and Cousins, with which I’m sure you are familiar, given your expertise. This particular article is about evaluation use, and one section of it focuses on collaborative evaluation. I was one of a handful of class participants who argued for collaborative evaluations. I feel they have a much higher likelihood of being used after program completion because the stakeholders feel genuinely connected to the work as well as to the evaluators.
Shulha and Cousins refer to Finne et al. (1995), who “detail and critique an interesting approach they call ‘trailing research.’ This collaborative model espouses (a) maintenance of a continual focus on learning, (b) clarification of roles and expectations, (c) creation of arenas for dialogue, (d) joint focus on all issues being investigated, (e) attention to the validation of findings, and (f) joint interpretation of results. While such models have enormous potential for impact, they pose formidable challenges for evaluators to implement” (p. 200).
After reading your blog entry on AEA, I really love your easy-to-understand Hot Tip and its six elements. They are similar, but I appreciate your points 3 and 4 as welcome additions to this list: the vocabulary of ‘establishing a collective commitment’ is much stronger, and I feel it would lead to better establishment of common ground. Also, the word ‘ensure’ when you talk about open communication is really key: an evaluator needs to check for understanding and be committed to answering questions that arise in order for the collaborative approach to be most effective.
I wonder if you have any insight into the situations that best call for a collaborative evaluation. Which types of evaluation situations would be best served by a collaborative, participatory approach? Since you are so supportive of the approach, I am also interested to know whether there are any circumstances in which you would avoid using it (except, of course, online or distance evaluations where in-person communication is impossible).
Shulha, L., & Cousins, B. (1997). Evaluation use: Theory, research and practice since 1986. Evaluation Practice, 18, 195-208.
Hi Liliana,
I see your note about your new book coming out from Stanford University Press later this year. Could you say a bit more about it? Many thanks.
– David