AEA365 | A Tip-a-Day by and for Evaluators

Hello!  I’m Clara Pelfrey, a native Clevelander, a board member of the Ohio Program Evaluators Group and a member of the Local Arrangements Working Group for Evaluation 2018. I’d like to welcome you to a week of blogs about evaluation in Cleveland as well as tell you about my fair city and the things you can see and do. For discounts, visit the Evaluation 2018 website.

Cleveland skyline at night

Image Credit: Carlos Javier via Flickr

Rad Resources:

Arts & Culture.  The Cleveland Museum of Art is free to visit and has one of the world’s best collections. While you’re there, you can stroll around the Wade Lagoon and admire Rodin’s “The Thinker.” The Cleveland Museum of Natural History has exhibits from dinosaurs to diamonds; don’t miss the Ralph Perkins II Wildlife Center & Woods Garden, a wonderful outdoor exhibit where live native animals climb through catwalks above your head. Cleveland’s Museum of Contemporary Art (MOCA) is free on the first Saturday of the month. The Rock and Roll Hall of Fame (aka the “Rock Hall”) has Elvis Presley’s motorcycle and Michael Jackson’s white glove, and you can be a pinball wizard at the Rock & Pinball interactive exhibit. If you’re bringing your children, you’ll love the Great Lakes Science Center with its myriad interactive exhibits, polymer fun house, and IMAX theater. Theatergoers, be sure to catch one of the many shows at Playhouse Square, discounted for AEA. Classical music lovers, visit Severance Hall to experience the world-renowned Cleveland Orchestra in “Gerstein Plays Rachmaninoff.”

Nature and the Outdoors. Immerse yourself in the Madagascar or Costa Rica habitats at the Cleveland Botanical Garden. Lake View Cemetery is an outdoor museum, where you can visit the James A. Garfield Memorial and see breathtaking Tiffany glass in the Wade Memorial Chapel. For hiking or biking, be sure to visit Cuyahoga Valley National Park. The Cuyahoga Valley Scenic Railroad, with its vintage engine and cars, passes old lumber and grain mills, historic villages, and art exhibits as it chugs through the valley along the Towpath Trail, a former stretch of the Ohio and Erie Canal. Don’t forget the awesome Cleveland Metroparks Zoo.

Totally unique.  The Victorian-era Old Arcade dates from 1890 and was the first indoor shopping center in America. The West Side Market is one of the oldest continuously operating food markets in the country. In Tremont, visit the house where “A Christmas Story” was filmed.

Food, drink, fun. The Jack Casino is within walking distance of the convention center. There are many restaurants on the east bank of the Cuyahoga River (aka “The Flats”), and downtown on pedestrian-friendly East 4th Street or on West 3rd Street. Other areas with lots of character and many restaurants include Coventry Village in Cleveland Heights, as well as Ohio City and Tremont, neighborhoods on the west bank of the Cuyahoga River where Irish immigrants settled after helping build the Ohio & Erie Canal. If you’re looking for microbreweries, try the Great Lakes Brewing Company, the first microbrewery in Ohio.


All this week, with our colleagues in the Local Arrangements Working Group (LAWG), we’re looking forward to the fall and the Evaluation 2018 conference. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.

Hi, I’m Sara Vaca, an independent evaluator working in international development. I help Sheila curate this blog and contribute on Saturdays, and today I want to share a short reflection on something that has been bugging me lately.

Conducting an evaluation is a process: a rich, participatory, multi-partner, multidimensional process that later has to be condensed, reflected, and gathered into a report (a product) that is unidimensional, static, and written by the evaluation leader or team. Some time ago I realized that these two parts of the assignment, though equally important, are soooo different! This presents a challenge in my practice, and I wonder: how can I make the report better represent the process?

Cool Trick: First, of course, try to make the evaluation process as technically sound and as well adapted to the commissioners’ needs as possible. If the process is not good (see this related post), probably nobody is going to care about the report. In general, I enjoy conducting the evaluation process very much, and the feedback about the process is usually good or very good.

Ok, but what about the report?

That is when the second phase starts, which in my experience is less rewarding.

No matter what the findings are, the internal validity of the process has so far earned me acceptance during the debriefing and validation phase. However, the evaluation users are waiting to see it all in a document, clearly stated and articulated, so they can read it and assimilate what you saw during the process. At that point, I notice a slight change in their attitudes, and consequently in mine, as I have to adapt.

And I totally understand: the process is a soft activity whose traces are hard to see, while the report is the hard artifact, and what it says stays there forever… Still, I find it very interesting how two parts of the same assignment differ in so many ways.

Hot Tip: The only thing I’m doing so far is to make the materials in the debriefing presentation very consistent with the evaluation report. How do I do that? I create summaries and visuals for my PowerPoint presentation (which I share with them when I leave the country) that are later reproduced quite similarly in the draft report.

However, I’m looking forward to your ideas and tips: how do you reconcile these two equally important parts of an evaluation? Thanks!

Hello! We’re Anthony Oboh, an I/O Psychology doctoral candidate at Keiser University and consulting research intern, and Sy Islam, principal consultant with Talent Metrics, a data-driven consulting firm. It is often said that “a picture is worth a thousand words,” and visuals can indeed help people understand data more effectively. At Talent Metrics, we help professional organizations evaluate the effectiveness of their meetings, and data visualization is central to that work: it is an easy way for people to understand and interpret information, and the simpler and more dynamic the visualization, the easier it is to interpret. Professional organizations often evaluate their meetings on content, relevance, and participant satisfaction, and such data can be presented in simple yet powerful visualizations that help these organizations evaluate their events more effectively.

Hot Tip: Use data visualization to report multiple pieces of data in a concise and simple manner.

A data visualization like a bar chart or a stacked graph, such as the one below, is an easy way of presenting and communicating information about participants’ experience of a meeting or event. Most meeting feedback is collected after the meeting and can be presented in a simple visual like the stacked bar chart. However, more meaningful findings can be discovered using longitudinal data: professional organizations can evaluate how effective different subjects or meeting topics are over time.

Stacked Bar Indicating Members’ Reactions/Outcome of a Meeting

This survey was conducted at the end of a networking event to understand how well the event went, and particularly to better understand the impact the event had on attendees.
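If you want to build such a chart yourself, here is a minimal sketch in Python with matplotlib; the criteria, response categories, and counts are invented for illustration, not taken from our survey:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical post-meeting feedback counts (invented for illustration)
criteria = ["Content", "Relevance", "Satisfaction"]
agree = np.array([28, 31, 25])
neutral = np.array([8, 5, 10])
disagree = np.array([4, 4, 5])

fig, ax = plt.subplots()
ax.bar(criteria, agree, label="Agree")
ax.bar(criteria, neutral, bottom=agree, label="Neutral")  # stack on top of Agree
ax.bar(criteria, disagree, bottom=agree + neutral, label="Disagree")
ax.set_ylabel("Number of respondents")
ax.set_title("Members' reactions to a meeting")
ax.legend()
plt.show()
```

Stacking keeps all three response categories in one bar per criterion, so readers can compare both the totals and the mix at a glance.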

You can use a simple line graph to track the same results over time. Review the line graph below: from it, we can evaluate the effectiveness of meetings on these criteria over time and curate appropriate content for future professional meetings.

Event evaluation responses
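As a companion sketch for the longitudinal view, the following matplotlib fragment plots hypothetical mean ratings per criterion across four events; the event labels and scores are made up for the example:

```python
import matplotlib.pyplot as plt

# Hypothetical mean ratings (1-5 scale) per criterion across four events
events = ["Jan", "Apr", "Jul", "Oct"]
ratings = {
    "Content": [3.8, 4.0, 4.3, 4.1],
    "Relevance": [3.5, 3.9, 4.0, 4.4],
    "Satisfaction": [4.0, 4.1, 3.9, 4.5],
}

fig, ax = plt.subplots()
for criterion, scores in ratings.items():
    ax.plot(events, scores, marker="o", label=criterion)  # one line per criterion
ax.set_ylabel("Mean rating (1-5)")
ax.set_ylim(1, 5)
ax.set_title("Meeting feedback over time")
ax.legend()
plt.show()
```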

Lessons Learned: Using longitudinal data, as in the line graph above, is extremely beneficial for understanding how the content of professional networking events is perceived by audience members.

Rad Resources: Check out these resources to learn more about data visualization!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.



Hi there! My name is Lindsay Anderson. I am a PhD student at the University of Minnesota studying Organizational Leadership and Policy Development with an emphasis in Evaluation Studies. Having worked in social work before returning to school, I place a high value on relationships and on the notion that working together leads to better results.

Collaborative evaluations actively engage program stakeholders throughout the evaluation process and include approaches such as participatory (shared control), empowerment (stakeholder control) and collaborative (evaluator control).

Involving stakeholders can result in many benefits to an evaluation including increasing quality, effectiveness, ethical alignment, utility and use. Collaboration may help increase stakeholder understanding of the evaluation purpose, improve data collection and reporting quality, increase access to program resources, further the dissemination of evaluation results and facilitate program change.

Hot Tip: Identify WHO potential stakeholders are.

Stakeholders are anyone with a vested interest in the program and who therefore also have a stake in the evaluation.

  • Program participants may provide first-hand experience of the program being evaluated and are the most likely to be impacted by the program and evaluation.
  • Partnering organizations and community agencies can provide insight into the context in which the program is embedded.
  • Program providers represent multiple perspectives within the organization and build understanding of program activities and outcomes.
  • Primary users of the evaluation are instrumental in implementing evaluation findings.

Hot Tip: Decide HOW stakeholders will be involved.

Formal strategies to involve stakeholders in an evaluation include forming an evaluation advisory group or conducting one-on-one interviews and/or focus groups. An evaluation advisory group consists of stakeholders and evaluators who meet regularly throughout an evaluation to discuss evaluation materials and progress. Interviews and focus groups are not held with the same regularity but can be useful for gathering ideas to define and revise evaluation plans.

Hot Tip: Decide WHEN stakeholders will be involved throughout the evaluation.

Stakeholders can be involved throughout the entire evaluation process.

  • Clarifying the evaluation plan: stakeholder perspectives provide information about program activities and expected outcomes to ensure the evaluation purpose and design align with program functions.
  • Data collection: stakeholders can be engaged to refine data collection strategies to maximize participant response. Evaluation instruments may be designed and validated through consultation with program experts and pre-existing program datasets can be utilized for data collection.
  • Data analysis: stakeholders can provide their interpretation of analyses, offering another perspective to triangulate findings and improve the accuracy of results.
  • Reporting findings: stakeholders can improve reporting of findings by: providing feedback on the mode in which results will be shared; ensuring reports are user-friendly; and expanding networks so results reach a larger audience.

Rad Resources:

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi, I’m Barbara Klugman. I offer strategy support and conduct evaluations with social justice funders and NGOs in South Africa and internationally. I practice utilization-focused evaluation, frequently using mixed methods including outcome harvesting and social network analysis (SNA). My own history spans social activism, directing NGOs, and both working for and serving on the boards of foundations.

AEA’s conference theme is Speaking Truth to Power, something that is particularly challenging given the inequitable power relations between nonprofits and their funders, and even between boards and staff. Evaluators can play a useful intermediary role by providing both the evidence and the facilitation needed to open space for honest communication.

Hot Tip: I have found that the following six factors influence the effectiveness of my communication across power divides:

  1. The timing of the evaluation, and a formative or developmental approach, may enhance both grantee and funder interest in the outcomes.
  2. Making learning rather than compliance the evaluation objective creates an environment that welcomes insights to strengthen effectiveness and removes much of the fear and risk from evaluation.
  3. The evaluator needs substantial capacity for evaluation practice that enhances trust-building, undercuts anxiety, and establishes rules of engagement that allow those with the least power to engage, influence, and use findings.
  4. The production of high-quality evidence, while self-evidently necessary, will be more effective in speaking truth to power if all parties have agreed on the questions, mix of methods, and evaluation rubrics.
  5. A commitment to, and comfort with, the role of evaluator as social justice advocate assumes that the evaluator can navigate when it is appropriate for her to speak and when to empower the evaluand to do so.
  6. Terms of reference should give the evaluator the independent right and resources to communicate findings to audiences beyond the intended users or those to whom they disseminate findings. While recognising the concomitant ethical responsibility to do no harm, the right and resources to publish findings are critical to an evaluator’s ability to speak truth to power, and to ensuring that the resources that go into an evaluation contribute to broader learning in the field.

Rad Resources: As an illustrative example, see the public communications from the evaluation team of the Ford Foundation’s $54m Strengthening Human Rights Worldwide global initiative. The ToR included funds for the team to publicize findings in Spanish and English, including the summary report, a series of blogs and videos, an article for the international human rights journal SUR, and a reflection in Alliance magazine.

Blogs:

The Value of Diversity in Creating Systemic Change for Human Rights

Finding Equity – Shifting Power Structures in Human Rights

Addressing Systemic Inequality in Human Rights Funding

Videos:

The Human Rights System is Under Attack – Can it Survive Current Global Challenges?

The Changing Ecology of the Human Rights Movement

Funding an Effective Human Rights Movement

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.



Hello from Katherine Bergmann and Eryn Collins from Choices Coordinated Care Solutions, a national non-profit organization that supports individuals with significant behavioral and emotional challenges in community settings. We accomplish this by using evidence-informed methods that build on the strengths of those individuals and everyone involved.

Data drives our focus on effectiveness and efficiency at Choices, and our Applied Research and Evaluation Team translates that data into consumable information that supports decision-making. We not only create detailed technical reports; we also develop infographics that present our analyses quickly and clearly for lay audiences.

We use several resources to better highlight actionable outcomes when presenting this information.

Rad Resource:  Piktochart, an infographic design program, has been essential to developing our graphic materials. The website offers free and low-cost templates to kickstart your creative efforts with no professional design experience required.

Rad Resource: Additionally, we use The Noun Project for icons and symbols that add visual interest and help lead readers through the layout of our graphic presentations. This resource lets you choose between royalty-free and Creative Commons licensing, so it can fit within any budget.

Rad Resource: Since the goal of these infographics is to present content in plain language, the Hemingway App is a great resource for making our writing clearer and more concise. This free resource highlights problem areas in our content, such as complicated sentence structures, redundant vocabulary, and use of the passive voice.

In a recent example, Choices was tasked with providing a summary of outcomes to an influential stakeholder in one of our service areas. Using the resources listed above, Choices created a visually appealing infographic that outlined the clear improvement youth and families were experiencing in our program (Figure 1). We accompanied statistics with graphic elements to clarify the patterns and trends in our data analyses. This output illustrates how evaluators can use graphics to disseminate results that meet the information needs of multiple audiences. Creating a consumable infographic helped our stakeholder readily understand our program’s success and served as a resource to inform future decision-making.

Figure 1: Infographic created for influential stakeholder.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Are you really coaching? by Betsy Block

Hi! I’m Betsy Block, coach, consultant, and jill-of-all-trades. I added coaching to my capacity-building work as I trained with two thought pioneers: the Coaching Training Institute and CRR Global.

Evaluators have started asking critical, challenging questions: How do we build our clients’ capacity? How do we incorporate equity into our own practices? How do we ensure lasting, meaningful use of evaluation findings? While independent evaluation still has a critical role, evaluators are increasingly seeking alternative ways to integrate findings into practice. In that search, they are adopting non-traditional tools focused on the interpersonal side as innovative capacity-building approaches.

Hot Tip:

Consulting, coaching, and mentoring are all approaches we use with clients to help drive adoption. Try noticing which approach you are leaning into with your client, and consider the implications of that choice.

Consulting, mentoring, coaching

Coaches are awesome.

Coaches encourage you, believe in you, and help you achieve your best! The increase in evaluators offering coaching is a great trend, maybe in part because it feels more approachable than traditional consulting models. Many evaluators have received accredited coach training or hold certifications from the International Coach Federation, the largest accrediting body for coaching. Many instinctively coach because it is simply their way of being. However, some offer coaching but deviate from the evidence-based practices that trained and/or credentialed coaches use; while these consultants can get positive results, they risk missing the mark in terms of equity and more impactful capacity building.

So how do you do more coaching? 

Lessons Learned:

  • Three simple words can make coaching happen: tell me more.
  • Hold clients as creative, resourceful, and whole: prize their expertise above your own. It’s harder than we think. (Let’s get real, we are experts with cool tools!) Last summer, a client began our meeting by asking me to tell him what to do. I shelved my expertise and told him he had all the knowledge needed to lead the project. I simply asked him, “What are you wanting?” and kept asking curious, short questions. He had already conceived a thoughtful, purposeful evaluation framework.
  • Coach the system. You are often coaching more than a person, and more than a group of people. Have you ever walked into your client’s office and felt something in the air, as if it had its own presence? That’s the system; you have to reveal it to your client so they can figure out what they want to do about it. Don’t judge or characterize it, just notice it and give them a safe space to talk about it: “I feel some kind of energy in this room. What about you?”

Rad Resources:

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi, we are Sy Islam, Michael Chetta, and Andrzej Kozikowski from Talent Metrics, a data-driven consulting firm. We’re writing about using text analysis to identify new areas of improvement when conducting an evaluation. Evaluation research often uses surveys or other forms of data collection that limit the potential outcomes of the evaluation: when designing a survey, you may not cover all areas of potential improvement with fixed-response scale items. Text analysis is a phenomenological approach that allows research participants to provide more detailed responses, typically to open-ended questions. Rather than constraining responses to what you think participants will say, text analysis of open-ended questions allows participants to offer their own ideas and insights into the workplace. Qualitative approaches can also be used to gain insight into quantitative results, and text analysis is an efficient way to analyze large amounts of qualitative data compared with traditional methods that require more time and resources.

Hot Tip: Use open-ended questions, social media comments, and text analysis to look for areas of improvement and areas of weakness in an intervention or program.

When evaluating a program or site, include open-ended items in any evaluative survey. Another great source of open-ended comments is social media posts and pages: every major business or service has a Facebook, Yelp, or Google profile. These resources can provide rich and surprising data.

We conducted an evaluation of urgent care centers to identify what language drove positive and negative ratings. Organizing Yelp comments into positive (4 or 5 stars) and negative (3 stars and below), we identified the language used most often in each group. We had hypothesized that different language drove urgent care ratings on social media and that it would fall into three broad categories: 1) facility, 2) care delivery, and 3) staff. We were surprised to find that patient reviews, both positive and negative, were driven by the same types of experiences.

Top words found in positive and negative reviews
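For readers who want to try a rough version of this analysis themselves, here is a minimal Python sketch that buckets reviews by star rating and counts the most frequent words in each bucket. It is not the Tropes workflow described below, and the reviews, ratings, and stopword list are invented for illustration:

```python
import re
from collections import Counter

# Hypothetical reviews as (star_rating, text) pairs -- invented for illustration
reviews = [
    (5, "The staff was friendly and the wait was short"),
    (4, "Clean facility, caring doctor, quick visit"),
    (2, "Long wait and the staff seemed rushed"),
    (1, "Billing problems and a very long wait"),
]

STOPWORDS = {"the", "and", "was", "a", "very"}  # tiny list for the example

def top_words(texts, n=5):
    """Return the n most frequent non-stopword tokens across the texts."""
    words = []
    for text in texts:
        words += [w for w in re.findall(r"[a-z']+", text.lower())
                  if w not in STOPWORDS]
    return Counter(words).most_common(n)

positive = [text for stars, text in reviews if stars >= 4]
negative = [text for stars, text in reviews if stars <= 3]
print("Top positive words:", top_words(positive))
print("Top negative words:", top_words(negative))
```

A dedicated tool adds semantic categories and actor graphs on top of this, but even simple frequency counts can reveal which experiences dominate each rating band.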

Rad Resource: We used Tropes 8.4 to analyze the text of the social media reviews. Tropes is open-source software used in academic and applied research. It can also produce a graph of actors, like the one below, that illustrates the relationships between review quality and the language used.

Graph of Actors: Relationship of "negative reviews" to patient evaluation terms of interest

Lessons Learned: Evaluations of businesses, nonprofits, and their associated programs are often conducted using surveys or other forms of quantitative data collection. These methods often embed the evaluators’ own assumptions about what drives key outcomes. Using qualitative data, such as social media comments, can lead to new and unexpected discoveries. Remember to listen to program participants and to those who use your services: they are the real subject matter experts and can help you as you evaluate.

Rad Resources:

Two text analysis articles we wrote go into further detail about our projects.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello! I’m Sheila B Robinson, aea365’s Lead Curator and sometimes Saturday contributor, and I’m getting excited about my presentations at Evaluation 2018! I’m thinking about what content to share, what I want my participants to know, how I will present, what my slides will look like… so much to consider! That’s why I love p2i!

Rad Resource: AEA’s Potent Presentations Initiative (p2i) exists for the explicit purpose of helping evaluators improve their presentation skills, both at conferences and in individual evaluation practice. Potent Presenters think about three key components of a compelling presentation: Message, Design, and Delivery, and our free resources are largely organized around these areas. The p2i website features a p2i Presentation Tools & Guidelines page with downloadable checklists and worksheets, along with webinars and slides, and did I mention that they are all free?

Lesson Learned: Study successful presenters. p2i got its start back in 2012 with Stephanie Evergreen, data visualization and presentations expert, and has continued to grow over the years, with several people contributing to the effort, including me! The three key components, and the inspiration for much of the original content, came from a study of a dozen well-known AEA presenters (the “Dynamic Dozen”) who had the highest feedback scores from pre-conference professional development, AEA Summer Institute workshops, and Coffee Break Webinars.

Get a pulse from current presenters. As the current p2i coordinator, I’m always on the lookout for ideas for new content. In a recent effort to catalyze ideas, I asked evaluators to participate in a brief, informal survey about presenting to help inform our future work. The survey asked about people’s past and future presentation work and what might help them improve their practice. I shared the link to “You, the Presenter: What Would Help You Up Your Game?” in an aea365 post and on EvalTalk, and 190 people responded! The results are quite informative, and I have shared them over several months in my p2i column in the AEA monthly newsletter.

Rad Resources: 

Are you presenting at #Eval18 or another conference or event? It’s a great time to check out AEA’s Potent Presentations Initiative and download free resources to supercharge your next presentation!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi Everyone! My name is Cherie Avent, and I am a second-year Ph.D. student at the University of North Carolina Greensboro focusing on program evaluation and research methods. I have been fortunate to work on diverse evaluation projects in which faculty allow students to lead and to select the theory that guides our purpose or aim. However, recent discussions in classes and with peers have centered on knowing oneself and the connections to one’s theoretical orientation. I realized I had been working on evaluation projects without fully considering my own beliefs and values or the theoretical orientation from which I want to work. As a result, I was unaware of how my beliefs and values affected the evaluation designs, processes, and interactions with stakeholders.

Many scholars argue the need for critical reflection on these topics, but I wonder how many of us do it. Particularly for novice evaluators: can we articulate who we are, what we believe and value, the role we serve, how knowledge is constructed, and our other worldviews? Are we aware of how these answers shape our theoretical orientation? Are we able to articulate that orientation? Answers to these questions frame our approach and methods. The AEA Guiding Principles for Evaluators emphasize the importance of self-reflection and of being explicit about the role one’s beliefs play in the conduct of evaluation.

Lesson Learned: Begin self-reflecting early
It’s important to spend time reflecting on one’s beliefs and values because they show up in every aspect of our work. The reflection can begin with questions such as: Who am I? What do I believe and value? How do my personal and professional experiences affect me as an evaluator? Then move into more complex questions: Why am I doing this work? What do I believe the role of an evaluator is, and what would I like my role to be? How do I believe knowledge is constructed? I am now starting to explore these questions, and I invite you to do the same.

Hot Tip: Develop a small group or network to share your thoughts, dilemmas, and difficulties as a way to work through these questions. By dialoguing, you can help each other understand, clarify, and expand perspectives. More specifically, it enhances our ability to express our theoretical orientations to others verbally. The interactions might occur in person, over the phone, or online. There’s no limit!

Rad Resources:

The American Evaluation Association is celebrating Theory and Practice week. The aea365 contributions all this week come from Dr. Ayesha Boyce and her University of North Carolina Greensboro graduate students’ reflections on evaluation theory and practice. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

