Greetings! We are Tom McQuiston (USW Tony Mazzocchi Center) and Tobi Mae Lippin and Kristin Bradley-Bull (New Perspectives Consulting Group). We have collaborated for over a decade on participatory evaluation and assessment projects for the United Steelworkers (labor union). And we have grappled mightily with how to complete high-quality data analysis and interpretation in participatory ways.
Hot Tip: Carefully determine up front how fully the evaluation team will participate in data analysis. Some practical considerations include: the amount of team time, energy, interest, and analysis expertise available; the levels of data analysis being completed; the degree of project focus on team capacity-building; and the project budget and timeline. How these and other considerations get weighed is, of course, also a product of the values undergirding your work and the project.
Hot Tip: Consider preparing an intermediate data report (a.k.a. “half-baked” report) that streamlines the analysis process for the full team. Before the full team dives in, we: review the raw quantitative data; run preliminary cross-tabs and statistical tests; refine the data report content to include only the — to us — most noteworthy data; remove extraneous columns spit out of SPSS; and assemble the tables that should be analyzed together — along with relevant qualitative data — into reasonably-sized thematic chunks for the team.
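To make the prep steps above concrete, here is a minimal sketch in Python of running a preliminary cross-tab and statistical test and keeping only the noteworthy pieces for a thematic chunk. The authors work in SPSS; pandas and scipy are stand-ins here, and the survey columns (`site`, `trained`) are entirely hypothetical.

```python
# Sketch of prepping a "half-baked" report chunk.
# NOTE: the original workflow uses SPSS; pandas/scipy and the
# columns below are illustrative assumptions, not the authors' data.
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical raw survey responses
raw = pd.DataFrame({
    "site":    ["A", "A", "A", "B", "B", "B", "A", "B"],
    "trained": ["yes", "yes", "no", "no", "yes", "no", "yes", "no"],
})

# 1. Run a preliminary cross-tab
xtab = pd.crosstab(raw["site"], raw["trained"])

# 2. Run a preliminary statistical test on the cross-tab
chi2, p, dof, expected = chi2_contingency(xtab)

# 3. Keep only the noteworthy pieces (no extraneous columns)
#    for the team's thematic chunk
report_chunk = {"crosstab": xtab, "p_value": round(p, 3)}
print(report_chunk["crosstab"])
print("p =", report_chunk["p_value"])
```

In practice the "refine to only the most noteworthy data" step is a judgment call by the facilitators, as the tip describes; the code only shows the mechanical trimming.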
Hot Tip: Team time is a precious commodity, so well-planned analysis/interpretation meetings are essential. Some keys to success include:
- Invest in building the capacity of all team members. We do this through a reciprocal process of us training other team members in, say, reading a frequency or cross-tab table or coding qualitative data and of them training us in the realities of what we are all studying.
- Determine time- and complexity-equivalent analyses that sub-teams can work on simultaneously. Plan to have the full team thoughtfully review sub-team work.
- Stay open to shifting in response to the team’s expertise and needs. An empowered team will guide the process in ever-evolving ways.
Some examples of tools we have developed — yes, you, too, can use Legos™ in your work — can be found at: http://newperspectivesinc.org/resources.
We never fail to have many moments of “a-ha,” “what now” and “wow” in each participatory process. We wish the same for you.
The American Evaluation Association is celebrating Best of aea365 week. The contributions all this week are reposts of great aea365 blogs from our earlier years. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Salutations!
Thank you for posting on such an interesting topic in expanding the process of program evaluations. Initially, I had a concern regarding increasing the participation of stakeholders in the process. My first concern: can evaluators trust that stakeholders will be objective, given their vested interest? And when it comes to tackling the hard issues that may not be favorable but are much needed, will gaps in their evaluation knowledge base create a roadblock?
Because some programs lay the foundation for certain projects, I'm wondering whether this process would be better suited to short-term programs, as a way to assess how well the idea works.
Thanks again for this post!!! Please keep us posted.
Participatory analysis is key to developing critical skills in new evaluation team members. Gauging the enthusiasm and expertise each team member brings will help the team director assign tasks geared toward skill development. Team capacity is a precious commodity, and investing in it pays off. This process can also be extended to sustain stakeholders' interest throughout the evaluation. Instead of delivering a final report with the analysis and interpretation already complete, the evaluator can present a series of drafts to stakeholders, allowing opportunities for feedback.