AEA365 | A Tip-a-Day by and for Evaluators

TAG | Relationships

Hello! I’m Martha A. Brown, President of RJAE Consulting. Lately, an endless stream of conference speakers, blog writers, Indigenous evaluators, and authors has confronted and challenged my “programming” as an evaluator. Traditional evaluation methods place tremendous emphasis on research methods and evaluation theory – but not necessarily on the people we work with and for. At the 2017 Canadian Evaluation Society conference, Nora Roberts told me that the very tools of our profession continue to oppress and silence others. Her statement sent me reeling. Gail Barrington spoke about the value of reflecting on our work and our methods so we can improve our craft and learn more about ourselves. Indigenous speakers at multiple conferences reminded me that we are all interconnected and that our relationships with ourselves and each other are the most important things in life. All of this can be summed up in one word: love.

Additionally, I research, practice, and teach restorative justice, which is grounded in Indigenous values such as interconnectedness, openness, honesty, vulnerability, and respect. I bring these values and restorative practices to my work. However, too many times I have felt like I am “breaking all the rules” I learned in graduate school as I infuse love into my work and into my relationships with the people I work with.

When I read the invitation to submit a blog on evaluation and labor, the first thing that came to mind was to write about putting love and relationships at the center of our work. What would our work look like if each of us took time at the outset and throughout every evaluation to build trusting relationships with our “stakeholders” and “participants”? Do those of us who are products of Western culture even know how to do this? In a society that values goals, outcomes, and return-on-investment above all else, how can we return to the teachings and the ways of our ancestors and put our relationships at the center of everything we do? We knew this once, but have forgotten.

In AEA, many evaluators are truly committed to changing the world, to improving people’s lives, and to creating more just and equitable ways of doing what we do. But we don’t always know how to live out our goals. That requires us to critically reflect upon what we were taught, how we do our work, and to ask who is being inadvertently silenced, harmed, or oppressed during an evaluation – or in an evaluation classroom. It requires us to love.

Love requires us to engage our whole selves – mind, body, heart and spirit – in our work. We can learn how to do this by studying Indigenous values, practices, and ways of being. I am so grateful to those who helped me wake up, including our own Nicky Bowman.


The American Evaluation Association is celebrating Labor Day Week in Evaluation: Honoring the WORK of evaluation. The contributions this week are tributes to the behind-the-scenes and often underappreciated work evaluators do. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

·

I’m Kim Kelly, PhD, from the Psychology Department at the University of Southern California, where I teach courses in statistics and research methods. I have been involved in the evaluation of STEM curriculum and professional development programs since 2002, and I have been reflecting on the career path that led me from basic research in psychological science to independent program evaluation of STEM education initiatives. I offer two insights that have been instrumental in my own professional journey from researcher to evaluator.

Rad Resources: Social scientists in particular struggle with the distinction between research and evaluation. To be honest, I still struggle with it myself, and opinions on the matter vary widely. It is worth taking the time to read the published ideas – not to end the debate, but to appreciate the practical and intellectual differences between the pursuit of generalizable knowledge in research and the program-specific feedback needed in most program evaluations. Gene Glass wrote about this back in 1971 in Curriculum Theory Network, and the subject regularly appears in books and journals. See more recent comments in Jane Davidson’s editorial in the 2007 Journal of Multidisciplinary Evaluation and by Miri Levin-Rozalis in the 2005 Canadian Journal of Evaluation. Reflecting on this key distinction has enabled me to refine my knowledge of the goals and methods of psychological science and become a more effective program evaluator.

Cool Trick: It may seem like a no-brainer to suggest establishing a good relationship with those we evaluate or evaluate for. Yet the training of researchers often emphasizes a detached, objective approach to interacting with participants, and research participants are typically cooperative because they have volunteered. When I first began program evaluation, I failed to appreciate the interpersonal dynamics of evaluation—the perceptions of threat often experienced by participants and clients, the reality of unwilling participants and investigators, and the barriers this lack of trust posed to obtaining valid data. In my work with programs, I emphasize rapport building on both social and programmatic levels to build trust. Rapport building at the programmatic level includes looking for ways to make evaluation data more useful, and more used, as part of program development. For example, I shared results of content knowledge assessments with teachers in a metacognitive reflection activity. Being both a familiar and friendly face maximizes the likelihood that you will get the access and cooperation you need to do an effective program evaluation.

Kim Kelly is a leader in the newly formed STEM Education and Training TIG. Check out our TIG Website for more resources and information.

The American Evaluation Association is celebrating STEM Education and Training TIG Week with our colleagues in the STEM Education and Training Topical Interest Group. The contributions all this week to aea365 come from our STEM TIG members.

·

Hello! We are Stacy Johnson and Cami Connell from the Improve Group. At Evaluation 2013, we had the opportunity to present on our experiences using a unique mixed methods approach to collecting data.

Your data collection strategy has the potential to seriously impact your evaluation. You might ask yourself questions like: How do we make sure we are getting the whole story? What if one method isn’t appropriate for gathering all the information we need from a single source? How do we engage people in data collection in a way that helps them understand and want to use the findings? One way to address these questions is to treat each stage of data collection as a layered process, directly connecting quantitative and qualitative methods so they complement each other and build a more in-depth and accurate story.

How is this different from how we traditionally think about data collection? We still access the same key sources to answer our evaluation questions, but the design includes a feedback loop to allow the evaluator to immediately integrate any initial findings into the data collection process as they emerge. This often means intentionally including additional interviews or focus groups after an initial stage of data collection to present data back to stakeholders and ask for feedback and relevant background about emerging themes.

Lesson Learned: Provide an orientation to data. Not everyone looks at data every day! Walking stakeholders through data increases the chances that they will want to use it to inform decisions.

Hot Tip: Create easy-to-interpret graphics to make data more accessible.

Lesson Learned: Make it a mutually beneficial process. In addition to gathering important information for the evaluation, it is equally important to make sure people feel like they are heard and that sharing their experiences can positively impact their work.

Hot Tip: Facilitate discussion about how data applies in day-to-day work.

Hot Tip: Encourage problem solving and planning for how data can inform changes or improvements.

Lesson Learned: Understand the stakes and relationships. Depending on the nature of the relationships and the potential consequences of the evaluation, there is a risk of people painting an overly positive or overly negative picture. In addition, when presenting data from one source to another, pay careful attention to masking the identity of the original source, especially when groups are easily identifiable or relationships are already adversarial.

Hot Tip: Include people with different perspectives and roles in the data collection process to uncover any underlying dynamics.

Hot Tip: Try to be aware of any adversarial or contentious relationships that may exist. This approach is not always appropriate depending on existing relationships.

Hot Tip: Mask the original source of data as appropriate.


· ·

Greetings! My name is Shana Alford and I am an internal program evaluator at ACCESS Community Health Network, a 501(c) federally qualified health center network of more than 40 health centers in underserved communities within the City of Chicago. As an internal evaluator, I objectively analyze how federally funded community health programs are implemented in targeted communities and determine measurable impact and outcomes.

In my experience, I have learned the art of asking the right questions to promote positive dialogue and build relationships. Let’s face it: sometimes you can be viewed as an outsider, even when you are an insider! Asking the right questions doesn’t imply there is a set of wrong questions; rather, questions stimulate dialogue and serve as a powerful medium for conversation, so it is important to set the stage to gather the information you need. Keep in mind that one or more conversations can lead to positive change, or at least increase awareness and learning among staff, management, and you too!

The Art of Asking the Right Evaluation Questions


Hot Tips: Asking questions for evaluation purposes is an art, and here are three qualities I draw on frequently:

1) Insight: Listen to your program team and learn about their unique experiences, responsibilities, successes, and challenges. When you have insight into a program, you are more likely to probe deeper into areas that raise a red flag – or that deserve attention because they are going so well. Staff will appreciate that you know the ins and outs of their program.

2) Relevance: An effective question is right for the moment and relevant to the group of people you are addressing. I have learned the hard way that asking management questions that should be asked of program staff, and vice versa, can cause awkward silence, frustration, or misunderstanding. It is important to know your audience.

3) Patience: Asking questions should not feel like an interrogation to staff or management. They may be unable to answer a question, or may feel uncomfortable, for many reasons, so it is important to practice patience. If the purpose of the question is to learn something new, highlight an existing issue, or clarify, then the evaluator should give the program team time to respond, even if it is at a later date. Hint: If people are unresponsive to a question, taking a step back and asking the same question differently will often yield the results you are looking for. It may sound unlikely, but it is true. Try it!


The American Evaluation Association is celebrating Chicagoland Evaluation Association (CEA) Affiliate Week with our colleagues in the CEA AEA Affiliate. The contributions all this week to aea365 come from our CEA members.


· ·

Hi, I am Leah Goldstein Moses, current President of the Minnesota Evaluation Association (MNEA) and founder of the Improve Group. In both roles, relationships are very important—to both our success and joy. How do you cultivate relationships that energize and support you – and allow you to give in return?

Hot tip 1: Find a “home” in a community (or several). I have been fortunate to be active in Minnesota’s evaluation community through the MNEA. By serving on the Board and attending events, I meet other evaluators, learn about their work, and get ideas. I’ve also found communities among non-evaluators in specific sectors – arts, criminal justice, health, etc. – that keep me up to date and allow me to share information across communities.

Rad resource: I highly recommend being involved in a community focused on evaluation. The Minnesota Evaluation Association serves evaluators across the upper Midwest; you can find other affiliates online. If there isn’t a local affiliate, join a topical interest group, the AEA LinkedIn group, or the evaltalk listserv – each has active members committed to the field.

Hot tip 2: Nurture relationships. You can nurture a relationship by giving something (e.g., an idea or a resource), requesting something (problem-solving, resources, connections), or simply acknowledging the importance of the relationship. I try to balance how often I pose and answer questions in the communities I belong to; I also thank profusely! The MNEA board meets monthly, providing regular opportunities to connect; for the other communities I participate in, I try to attend events semi-regularly and contribute to forums several times per year.

Hot tip 3: Allow technology to help you. I use social media to keep in touch with my colleagues when we can’t meet in person. For example, for MNEA, we are experimenting with webinars as a way to meet. I use LinkedIn extensively and follow people on Twitter. I find it energizing – but also a bit maddening – to dip into the chatter of social media periodically. To keep myself sane, I try to cluster my social media use into a few segments per week, so I can easily work across platforms.

Twin Cities Hot Tip: Build your relationships with a fun visit to Minneapolis for the 2012 Evaluation conference! Be sure to take time to get to know each other and socialize at some of our wonderful local restaurants and attractions. One of my favorites is the Mill City Museum. This quirky little museum packs a lot of history about the development of the upper Midwest into a small space, and has a delicious bakery/coffee shop inside. My children (age 6 and 9) also love it, if you plan to make a family vacation out of the conference.

The American Evaluation Association is celebrating Minnesota Evaluation Association (MNEA) Affiliate Week with our colleagues in the MNEA AEA Affiliate. The contributions all this week to aea365 come from our MNEA members.

· ·

My name is Beverly Triana-Tremain and I am the President & Owner of Public Health Consulting, LLC. I want to share with you one of my favorite quotes, from Mark Twain: “A habit cannot be tossed out the window…it must be coaxed down the stairs one step at a time.” This quote is never more apt than in the organizational setting, where public health advocates are trying to change awareness, attitudes, behaviors, and policy. Currently, I am working with Dr. David Fetterman, using the empowerment evaluation model to help non-profits reduce tobacco exposure in Arkansas.

The situation: Community advocates sometimes meet with resistance when they set ambitious goals for change. One community advocate’s goal was to “eliminate secondhand smoke in apartment complexes by 2011.” To his surprise, he could not even get people to fill out a survey, never mind help eliminate secondhand smoke. Many reasons were given, including: the change in behavior was threatening, people liked to smoke, it was none of his business, and it would reduce business. The problem with this approach was wanting a win too quickly. In addition, the goal was not realistic for the time frame: eliminating secondhand smoke in apartment complexes across several counties is a significant change to attempt in only one year. This true story is the focus of this tip.

Hot Tip: Many grantees are overly ambitious and overpromise because they want to make real changes in the community. However, this approach can backfire if they expect too much change too quickly in their own communities.

Hot Tip: We recommended that they revise their goals and strategies to make them more realistic. Small wins are key. We recommended that they keep a perimeter law as their five-year goal, but devote their first one to two years to relationship building and getting buy-in for the change – for example, meeting with the owner of the building, then the manager, and then seeking a meeting with the tenants.

Hot Tip: We have found that success is more likely when community advocates first build a relationship with the intended audience, show them in a slow and nonthreatening way why the behavior is unhealthy, and then build on that relationship to continue introducing new activities, strategies, and methods for success in that community.

What are your thoughts and experiences in this area of community change and empowerment evaluation?

Rad Resource: Fetterman, D. M., Deitz, J., & Gesundheit, N. (2010). Empowerment evaluation: A collaborative approach to evaluating and transforming a medical school curriculum. Academic Medicine, 85(5), 813.

The American Evaluation Association is celebrating Collaborative, Participatory & Empowerment Evaluation (CPE) Week with our colleagues in the CPE AEA Topical Interest Group. The contributions all this week to aea365 come from our CPE members, and you may wish to consider subscribing to our weekly headlines and resources list, where we’ll be highlighting CPE resources.

·

Hi, my name is Debbie Cohen. I am the Director of Evaluation at Community Mental Health Center, Inc., a community mental health center that serves five rural counties in Southeastern Indiana. One of my roles in the agency is to work with Indiana University to evaluate a Systems of Care collaborative involving children who use the child-serving systems in eight rural counties of Southeastern Indiana. Here are some tips on internal and external evaluators working together.

Hot Tip #1: Advantages of Combining Forces

Internal evaluators have an advantage over external evaluators: they work in the environment in which the program operates and may have been personally involved in some part of the program planning. The external evaluator, however, can offer specialist expertise and greater objectivity.

Hot Tip #2: Regular Communication

Every Monday morning I talk to my primary contact at Indiana University. This call is rarely cancelled and we even have it via cell phones if one or both of us are out of the office that day. This call is very important and keeps both sides in the loop.

Hot Tip #3: Work as a Team

Even though there are limits on what information can be shared between an external and an internal evaluator, it is helpful to approach the evaluation as a team. The internal evaluator may have access to information or community resources that an outsider is unaware of. On the flip side, an external evaluator may have university resources or other assets that could be helpful at the agency level.

Hot Tip #4: The Relationship Will Pay Off

A close working relationship between the internal and external evaluator will reap many benefits for both. The external evaluator who works closely with the internal evaluator will have a much clearer sense of how the program functions and will be in a much better position to provide useful feedback. Additionally, program staff are much more likely to trust the internal evaluator and may be more welcoming of the external evaluation process if an internal staff member is bringing the group together.

This contribution is from the aea365 Daily Tips blog, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org.

· ·
