AEA365 | A Tip-a-Day by and for Evaluators

TAG | analysis

Hello! We are Stacy Johnson and Cami Connell from the Improve Group. At Evaluation 2013, we had the opportunity to present on our experiences using a unique mixed methods approach to collecting data.

Your data collection strategy can seriously affect your evaluation. You might ask yourself questions like: How do we make sure we are getting the whole story? What if one method isn’t appropriate for gathering all the information we need from a single source? How do we engage people in data collection in a way that helps them understand and want to use the findings? One way to address these questions is to treat each stage of data collection as a layered process, directly connecting quantitative and qualitative methods so they complement each other and build a more in-depth and accurate story.

How is this different from how we traditionally think about data collection? We still access the same key sources to answer our evaluation questions, but the design includes a feedback loop to allow the evaluator to immediately integrate any initial findings into the data collection process as they emerge. This often means intentionally including additional interviews or focus groups after an initial stage of data collection to present data back to stakeholders and ask for feedback and relevant background about emerging themes.

Lesson Learned: Provide an orientation to data. Not everyone looks at data every day! Walking stakeholders through data increases the chances that they will want to use it to inform decisions.

Hot Tip: Create easy-to-interpret graphics to make data more accessible.

Lesson Learned: Make it a mutually beneficial process. In addition to gathering important information for the evaluation, it is equally important to make sure people feel like they are heard and that sharing their experiences can positively impact their work.

Hot Tip: Facilitate discussion about how data applies in day-to-day work.

Hot Tip: Encourage problem solving and planning for how data can inform changes or improvements.

Lesson Learned: Understand the stakes and relationships. Depending on the nature of relationships and the potential consequences of the evaluation, there is a risk of people painting an overly positive or overly negative picture. In addition, when presenting data from one source to another, careful attention should be paid to masking the identity of the original source, especially when there are easily identifiable groups or existing adversarial relationships.

Hot Tip: Include people with different perspectives and roles in the data collection process to uncover any underlying dynamics.

Hot Tip: Be aware of any adversarial or contentious relationships that may exist; depending on those relationships, this approach may not always be appropriate.

Hot Tip: Mask the original source of data as appropriate.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · ·

Hi, I’m Robert Brunger. I am an evaluator with the Ounce of Prevention Fund of Florida, a Tallahassee-based nonprofit organization that has worked since 1989 to improve the lives of Florida’s children and families.

If you are planning to use focus groups to learn more about what’s on the minds of your stakeholders, here are some suggestions to help you later make sense of what was said during the focus group itself.

Hot Tip #1: Digital recorders really are “the greatest thing since sliced bread!” They are available for less than $40 from electronic retailers. Spend enough to get a model that will allow you to transfer the audio file from the device to your computer. (Get a couple of spare batteries, too!)

Hot Tip #2: Practice with your digital recorder before you use it in a focus group. They are not complicated, but you will want to avoid any undue “fussing” in the focus group setting. Record some practice conversations to get used to the controls and volume levels.

Hot Tip #3: When it gets to “show time,” introduce the digital recorder in a very matter-of-fact fashion, get it started, and then pay no further attention to it until the meeting is over.

Hot Tip #4: Place your recorder in the middle of the table, or on a stool in the middle of a circle of chairs. A recent EVALTALK poster, Daphne LaDue, has made a persuasive case for using two digital recorders, pointed in different directions, as a way to improve your ability to figure out what’s been said later.

Hot Tip #5: Start the digital recorder(s) and a stopwatch at the same time. Your note-taker (and, yes, you do need a note-taker!) can make periodic marginal notes about elapsed time from the stopwatch that can be very helpful later in getting your notes and the recorded audio file(s) to match.

Hot Tip #6: It’s also helpful to create a seating pattern diagram to accompany your notes, and assign everyone an identifier – first names will work well, or numbers, or some uniquely identifying characteristic (e.g., red blouse woman, black man with beard, etc.). You can use this scheme while taking notes to identify individual speakers.

Hot Tip #7: Consider how badly you will need a full transcript prepared, as transcription can be a real “time sink,” taking five to six hours per hour of recorded material. If you are doing multiple groups, or if many people will be involved in interpreting the results, you probably will need transcripts; for smaller projects, your own summary of what was said, based on your notes and selected quotes from the audio file(s), may be entirely sufficient.

This contribution is from the aea365 Daily Tips blog, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org.

· · ·

Greetings!  We are Tom McQuiston (USW Tony Mazzocchi Center) and Tobi Mae Lippin and Kristin Bradley-Bull (New Perspectives Consulting Group).  We have collaborated for over a decade on participatory evaluation and assessment projects for the United Steelworkers (labor union).  And we have grappled mightily with how to complete high-quality data analysis and interpretation in participatory ways.

Hot Tip: Carefully determine up front what degree of full evaluation team participation there will be in data analysis.  Some practical considerations include:  the amount of team time, energy, interest, and analysis expertise that is available; the levels of data analysis being completed; the degree of project focus on team capacity-building; and the project budget and timeline.  How these and other considerations get weighed is, of course, also a product of the values undergirding your work and the project.

Hot Tip: Consider preparing an intermediate data report (a.k.a. “half-baked” report) that streamlines the analysis process for the full team.  Before the full team dives in, we:  review the raw quantitative data; run preliminary cross-tabs and statistical tests; refine the data report content to include only the — to us — most noteworthy data; remove extraneous columns spit out of SPSS; and assemble the tables that should be analyzed together — along with relevant qualitative data — into reasonably-sized thematic chunks for the team.
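The preliminary cross-tabs and tests mentioned above are run in SPSS; purely as an illustration of that step, here is a minimal Python sketch of what a cross-tab plus a Pearson chi-square check on survey responses might look like. The function names and the sample shift/answer data are invented for this example and are not part of the authors’ workflow.

```python
from collections import Counter

def crosstab(rows):
    """Count (group, answer) pairs into a nested dict: {group: {answer: n}}."""
    table = {}
    for group, answer in rows:
        table.setdefault(group, Counter())[answer] += 1
    return table

def chi_square(table):
    """Pearson chi-square statistic for a two-way table built by crosstab()."""
    groups = sorted(table)
    answers = sorted({a for counts in table.values() for a in counts})
    row_totals = {g: sum(table[g].values()) for g in groups}
    col_totals = {a: sum(table[g].get(a, 0) for g in groups) for a in answers}
    grand_total = sum(row_totals.values())
    stat = 0.0
    for g in groups:
        for a in answers:
            expected = row_totals[g] * col_totals[a] / grand_total
            observed = table[g].get(a, 0)
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical survey: did workers on each shift find the training useful?
responses = ([("day", "yes")] * 30 + [("day", "no")] * 10 +
             [("night", "yes")] * 10 + [("night", "no")] * 30)
table = crosstab(responses)
print(table["day"]["yes"])     # → 30
print(chi_square(table))       # → 20.0
```

A statistical package would also report the p-value for the statistic; the point of the sketch is only the shape of the intermediate output one might trim into a “half-baked” report.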

Hot Tip: Team time is a precious commodity, so well-planned analysis/interpretation meetings are essential.  Some keys to success include:

  1. Invest in building the capacity of all team members.  We do this through a reciprocal process: we train other team members in, say, reading a frequency or cross-tab table or coding qualitative data, and they train us in the realities of what we are all studying.
  2. Determine time- and complexity-equivalent analyses that sub-teams can work on simultaneously.  Plan to have the full team thoughtfully review sub-team work.
  3. Stay open to shifting in response to the team’s expertise and needs.  An empowered team will guide the process in ever-evolving ways.

Some examples of tools we have developed — yes, you, too, can use Legos™ in your work — can be found at: http://newperspectivesinc.org/resources.

We never fail to have many moments of “a-ha,” “what now” and “wow” in each participatory process.  We wish the same for you.

This week’s posts are sponsored by AEA’s Collaborative, Participatory, and Empowerment Evaluation Topical Interest Group (http://comm.eval.org/EVAL/cpetig/Home/Default.aspx) as part of the CPE TIG Focus Week. Check out AEA’s Headlines and Resources entries (http://eval.org/aeaweb.asp) this week for other highlights from and for those conducting Collaborative, Participatory, and Empowerment Evaluations.

· · ·

Paul Pope on EZAnalyze

My name is Paul Pope. As an evaluation specialist for Texas AgriLife Extension Service, one of my major responsibilities is to train our “faculty in the field” in the basics of program evaluation. County extension educators (agents) are expected to plan and conduct program evaluations in order to demonstrate the impact of their educational efforts. One part of that process involves tabulation and analysis of evaluation data from surveys. For the educator who is already familiar with Microsoft Excel, I have found EZAnalyze to be an ideal analysis tool.

Rad Resource: EZAnalyze is an Excel-based analysis tool for educators. As an add-in, once installed, it simply shows up as an additional menu option within Excel. Having an easy-to-use tabulation tool allows our training to be much more focused on analysis strategies and steps to uncover program impacts rather than learning the mechanics of the tool itself.

Product Features:

  • Assumes the first row contains variable names; data starts in the second row.
  • All operations are incorporated into pull-down menus.
  • No cell ranges to define.  No formulas to create. No functions to learn.
  • Ability to create variables and run results by groups.
  • Descriptive statistics, correlation, t-tests, chi-square, and ANOVA.
  • Frequency tables include percent and valid percent columns.

For the County Extension Educator:

  • Very easy to learn and use. All point-and-click.
  • Intuitive, even for educators with limited experience working with data.
  • Free of charge for educators.
  • Data entry and analysis can be done within one software package (Excel).
  • Sufficient for basic analysis needs.
  • Can take advantage of Excel’s features and tools to enhance results.
  • Avoids the cost and learning curve associated with powerful statistical packages.

EZAnalyze is available online at http://www.ezanalyze.com

Note: There is a data analysis add-in that comes with Excel; however, it requires the user to define cell ranges and lacks some important features – most notably, frequency tables as a point-and-click option.

