McQuiston, Lippin, and Bradley-Bull on Participatory Analysis

Greetings!  We are Tom McQuiston (USW Tony Mazzocchi Center), Tobi Mae Lippin, and Kristin Bradley-Bull (both of New Perspectives Consulting Group).  We have collaborated for over a decade on participatory evaluation and assessment projects for the United Steelworkers (a labor union).  And we have grappled mightily with how to complete high-quality data analysis and interpretation in participatory ways.

Hot Tip: Carefully determine up front the degree to which the full evaluation team will participate in data analysis.  Some practical considerations include:  the amount of team time, energy, interest, and analysis expertise available; the levels of data analysis being completed; the degree of project focus on team capacity-building; and the project budget and timeline.  How these and other considerations get weighed is, of course, also a product of the values undergirding your work and the project.

Hot Tip: Consider preparing an intermediate data report (a.k.a. “half-baked” report) that streamlines the analysis process for the full team.  Before the full team dives in, we:  review the raw quantitative data; run preliminary cross-tabs and statistical tests; refine the data report content to include only the — to us — most noteworthy data; strip the extraneous columns that SPSS spits out; and assemble the tables that should be analyzed together — along with relevant qualitative data — into reasonably sized thematic chunks for the team.
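
For teams that do this preliminary number-crunching in code rather than in SPSS, here is a minimal sketch of the cross-tab-and-test step in Python.  The file name and column names are hypothetical, and pandas/SciPy simply stand in for whatever statistics package your team actually uses:

    import pandas as pd
    from scipy.stats import chi2_contingency

    # Load the raw quantitative data (file and columns here are illustrative).
    responses = pd.read_csv("survey_responses.csv")

    # Run a preliminary cross-tab of two variables of interest.
    table = pd.crosstab(responses["job_category"], responses["training_helpful"])

    # Run a quick chi-square test to help flag noteworthy associations.
    chi2, p_value, dof, _expected = chi2_contingency(table)

    # For the "half-baked" report, keep only the essentials: the table itself
    # and a short significance note, not the full statistical output.
    print(table)
    print(f"chi-square = {chi2:.2f} (df = {dof}), p = {p_value:.3f}")
    if p_value < 0.05:
        print("Flag this table for the full team's review.")

The point is the workflow rather than the tooling: pare the output down to the handful of tables and test results the full team can usefully discuss together.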

Hot Tip: Team time is a precious commodity, so well-planned analysis/interpretation meetings are essential.  Some keys to success include:

  1. Invest in building the capacity of all team members.  We do this through a reciprocal process: we train other team members in, say, reading a frequency or cross-tab table or coding qualitative data, and they train us in the realities of what we are all studying.
  2. Determine time- and complexity-equivalent analyses that sub-teams can work on simultaneously.  Plan to have the full team thoughtfully review sub-team work.
  3. Stay open to shifting in response to the team’s expertise and needs.  An empowered team will guide the process in ever-evolving ways.

Some examples of tools we have developed — yes, you, too, can use Legos™ in your work — can be found at: http://newperspectivesinc.org/resources.

We never fail to have many moments of “a-ha,” “what now” and “wow” in each participatory process.  We wish the same for you.

This week’s posts are sponsored by AEA’s Collaborative, Participatory, and Empowerment Evaluation Topical Interest Group (http://comm.eval.org/EVAL/cpetig/Home/Default.aspx) as part of the CPE TIG Focus Week. Check out AEA’s Headlines and Resources entries (http://eval.org/aeaweb.asp) this week for other highlights from and for those conducting Collaborative, Participatory, and Empowerment Evaluations.

2 thoughts on “McQuiston, Lippin, and Bradley-Bull on Participatory Analysis”

  1. Tom, Tobi, and Kristin,

    Thanks for sharing these ideas with us.

    I really appreciated the “intermediate data report.” Our group produces a number of preliminary reports and memoranda to facilitate discussion, program direction, and informed decision making.

    In empowerment evaluations, we help participants and program staff members conduct much of their own analysis and assessment – but we are not purists. Depending on local evaluation capacity, we do a lot of the work when necessary – but we train folks in the process.

    In any case, we find preliminary analysis useful in all of our evaluation efforts (empowerment-oriented as well as evaluator-controlled and directed).

    I also really appreciated your statements about building internal evaluation capacity – the capacity of your team: “Invest in building the capacity of all team members….An empowered team will guide the process in ever-evolving ways.” I agree 100%. The better and more qualified your staff, the better the thinking and final product – which only enhances local decision-making and program improvement. Of course, we extend that logic to local community participants and program staff members – the more informed they are and the more competent they become in conducting evaluations, the more helpful they will be to their organizations.

    Finally, an often understated part of what evaluation is all about is captured in your posting: “We never fail to have many moments of ‘a-ha,’ ‘what now’ and ‘wow’ in each participatory process. We wish the same for you.”

    My experience is exactly the same in empowerment evaluation. In the middle of our evidence-driven dialogues about where things are, we inevitably hit that “a-ha” moment – identifying a pattern or underlying issue that helps explain a great deal of the observed behavior. Hitting that inevitable point in the dialogue (within the evaluation team and/or within the community and program) makes the process richer and more rewarding, yet it is rarely discussed in professional forums. Thanks for filling that gap.

    -David
