AEA365 | A Tip-a-Day by and for Evaluators


Hi, my name is Donna Loveridge and I work with the Donor Committee for Enterprise Development (DCED), a forum for learning about the most effective ways to create economic opportunities for the poor, in line with the SDGs – based on practical experience in private sector development.

International development organisations, such as DCED members, have traditionally used grants to create economic opportunities for the poor. Now they are increasingly adding impact investing to their repertoire of strategies. But how do impact investors know social and environmental impacts are occurring? Recently, I compared current practices in social impact measurement (SIM) in the impact investing field and results measurement (or M&E) in international development to find the similarities and differences (the full paper can be found here).

Lessons Learned: Here are some insights from comparing the two fields:

  1. Impact: In international development, the term ‘impact’ has traditionally meant long-term changes (positive, negative, intended, unintended) produced by an intervention directly and indirectly. In impact investing, impact can mean short, medium or long-term changes, including outputs and changes that a business may make to the way it operates, as a means to an end, e.g. more community or employee consultation.
  2. Standardisation: Impact investing is more interested than international development in standardising results measurement approaches and methods, so that investment opportunities and returns can be compared. This can be challenging, since appropriate approaches and methods are those that account for factors like the investment’s goals, implementation model, location, timescales, and stakeholders such as the investor, investee and beneficiaries.
  3. Intermediaries: Currently, impact investing and international development place few expectations on businesses to measure their social and environmental impact – greater onus is placed on intermediaries like fund or programme managers.
  4. Monetary value: Impact investing places greater priority on assessing the monetary value of social impacts than most development programmes do, although in recent years DFID has pushed strongly towards assessing the value for money of development programmes, including through approaches such as social return on investment.

Hot Tips:

  • Communicate – Take the time to understand what people from the other field mean when they use certain terms or words.
  • Look to see what is already out there – exchanging views, sharing resources and cross-fertilisation will contribute to the development of SIM and results measurement.

Rad Resources:

More than 100 private sector development programmes use the DCED Standard for results measurement. Check out practical guidelines for results measurement on challenge funds, a type of development programme with some similar attributes to some impact investments, here.

In 2017, the DCED is looking at what information investees value enough to measure and evaluate themselves; and how results (financial, social and environmental) are attributed to different investors. Subscribe to the DCED newsletter here to keep up with developments.

Interested in learning more? For more information about the SIM TIG, see here. To join the SIM TIG, see here.

The American Evaluation Association is celebrating Social Impact Measurement Week with our colleagues in the Social Impact Measurement Topical Interest Group. The contributions all this week to aea365 come from our SIM TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.



Evaluators and Love… by Sheila B Robinson

I’m Sheila B Robinson, aea365’s Lead Curator and sometimes Saturday contributor. I love evaluation work and evaluators! I wanted to write about evaluation and love, so I decided to revisit what I wrote around this time a couple of years ago. I figured an easy way to find the post would be typing “love” into the search box. Turns out, 224 aea365 posts include the word love (well, 225 now!). For you data nerds (and who among us isn’t?!), that’s nearly 9% of our 2557 articles!

Lesson Learned: Many posts include invitations from authors seeking feedback and wanting to connect with other evaluators (e.g., “We’d love to hear from you…”), but others give us insight into what makes evaluators tick.

Through aea365, we have learned that:

  • Beverly Parsons loves “working with the CLIP process….Communities of Learning, Inquiry, and Practice, informal, dynamic groups of organizational members who learn together about their professional practice.”
  • Laura Peck has “learned to love the counterfactual.”
  • Susan Kistler loves “finding ways to make data understandable and useful.”
  • Susan Eliot claims “Everyone loves a good story.”
  • Carl Brun loves “talking about teaching evaluation.”
  • Matthew von Hendy loves “helping connect people with the information that they need to solve problems or make decisions.”
  • Laura Pryor and Nichole Stewart admit “we both love data.”
  • Bethany Laursen “fell in love with social network analysis (SNA) as a graduate student because SNA gave me words and pictures to describe how I think.”
  • Rita Overton loves “helping programs to improve and having a hand in making the world, or at least my corner of it, just a little bit better.”
  • Nick Fuhrman admits, “Teaching is my passion—I love it!”
  • Corey Newhouse has “loved the ways in which (video) has enriched our process and our findings.”

Of course, data visualization is an object of love among evaluators:

  • Stephanie Evergreen is “in love with data visualization and reporting.”
  • Yuqi Wang loves “figuring out different ways to visualize data.”
  • Sarah von Schrader and Katie Steigerwalt “love data visualization as a powerful way to share information!”
  • Tony Fujs loves “to visualize the data I have in my hands, but I also like to spend time visualizing data that I don’t have: Missing data.”

AEA and the annual conference also receive some evaluator love:

  • Kathleen Tinworth loves “the exposure to and connections across different disciplines.”
  • Don Glass shares “one of the things that I love about attending the AEA annual conference is getting the opportunity to better understand how my work can relate to and be informed by recent debates and developments in the field.”

Hot Tip: Liz Zadnik, aea365 Curator and sometimes Saturday contributor says “don’t be afraid to let your love of spreadsheets, interview protocols, theories of change, or anything else show!”

Finally, Susan Kistler, AEA Executive Director Emeritus, shares perhaps the most important message about love that we’ve had here on the blog: “Success is made manifest in health and happiness, confidence that you are loved and the capacity to love with others.”

Happy Valentine’s Day!



I’m Laura Peck, recovering professor and now full-time evaluator with Abt Associates.  For many years I taught graduate Research Methods and Program Evaluation courses. One part I enjoyed most was introducing students to the concepts of causality, internal validity and the counterfactual – summarized here as hot tips.

Hot Tips:

#1:  What is causality?

Correlation is not causation.  For an intervention to cause a change in outcomes, the two must be associated and the intervention must temporally precede the change in outcomes.  These two criteria are necessary.  The sufficient criterion is that no other plausible, rival explanations can take credit for the change in outcomes.

#2: What is internal validity?  And why is it threatened?

In evaluation parlance, these “plausible rival explanations” are known as “threats to internal validity.”  Internal validity refers to an evaluation design’s ability to establish the causal connection between intervention and impact.  The threats to internal validity, then, are the factors in the world that might explain, independently of your program, a change in outcomes you believe your program achieved.  For example, children mature and learn simply by exposure to the world, so how much of an improvement in their reading is due to your tutoring program as opposed to their other experiences and maturation?  Another example is job training that assists unemployed people:  one cannot be any less employed than unemployed, so “regression to the mean” implies that some people will improve (get jobs) regardless of the training.  These two rival explanations are known as the threats of maturation and regression artifact.  Along with selection bias and historical explanations (a recession, an election, national mood swings), they can claim credit for changes in outcomes observed in the world, regardless of what interventions try to do to improve conditions.

#3: Why I stopped worrying and learned to love the counterfactual.

I want interventions to be able to take credit for improving outcomes, when in fact they do.  That is why I like randomization.  Randomizing individuals or classes or schools or cities to gain access to an intervention—and randomizing some not to gain access—provides a reliable “counterfactual.”  In evaluation parlance, the “counterfactual” is what would have happened in the absence of the intervention.  Having a group that is randomized out (e.g., to experience business as usual) means that it experiences all the historical, selection, regression-to-the-mean, and maturation forces as do those who are randomized in.  As such, the difference between the two groups’ outcomes represents the program’s impact.
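Laura’s “love the counterfactual” point can be sketched in a few lines of code. The simulation below (all numbers hypothetical) randomizes people into or out of a program; because both groups experience the same maturation and noise, a simple difference in mean follow-up outcomes recovers the true effect:

```python
import random
import statistics

random.seed(42)

TRUE_EFFECT = 5.0   # hypothetical program impact on the outcome
N = 10_000          # participants

# Everyone experiences the same background forces (maturation, history):
# outcomes drift upward by 2 points regardless of the program.
baseline = [random.gauss(50, 10) for _ in range(N)]
maturation = 2.0

# Randomize exactly half in, half out.
assignment = [i % 2 == 0 for i in range(N)]
random.shuffle(assignment)

# Observed follow-up outcome: baseline + maturation + noise (+ effect if treated).
followup = [
    b + maturation + (TRUE_EFFECT if treated else 0) + random.gauss(0, 5)
    for b, treated in zip(baseline, assignment)
]

treated_mean = statistics.mean(y for y, t in zip(followup, assignment) if t)
control_mean = statistics.mean(y for y, t in zip(followup, assignment) if not t)

# The randomized-out group absorbs maturation, history, and noise,
# so the difference in means isolates the program's impact.
impact_estimate = treated_mean - control_mean
print(f"Estimated impact: {impact_estimate:.2f} (true effect: {TRUE_EFFECT})")
```

Note that a naive before-and-after comparison of the treated group alone would credit the program with the maturation drift as well; the randomized control group is what strips those rival explanations out.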

Challenge:

As a professor, I would challenge my students to use the word “counterfactual” at social gatherings.  Try it!  You’ll be the life of the party.

Rad Resource:

For additional elaboration on these points, please read my Why Randomize? Primer.

The American Evaluation Association is celebrating the Design & Analysis of Experiments TIG Week. The contributions all week come from Experiments TIG members.


Growing up in the South with my name – Lovely Dhillon – assured me of a few things.  One, people would likely remember me, and two, that I would have to try to live up to an often used Southern expression, “lovely is as lovely does.”  After having moved back to the South last year after over 20 years away, I realize how much I missed the colloquialisms, the regularly warm hellos, the willingness to engage in insightful conversation, and the sense of community.

As my new hometown, Atlanta, welcomes the AEA membership, I know Atlanta will provide that warmth, knowledge, community and, truth be told, calories.  I wonder, however, what we, as a membership organization, will provide Atlanta, other host cities and the larger community around us.

Please join us as we dig into just this question through a think tank, “Designing AEA’s Collective Impact” on Friday, October 28th in Room L505  (8:00 am – 9:30 am).  Drawing from AEA’s mission to “support the contribution of evaluation to the generation of theory and knowledge about effective human action,” Beverly Parsons, Denise Roosendaal, Matt Keene, Susan Wolfe and I will engage with you about ways the AEA membership does, could or should work in collective ways to impact the communities in which we have our annual meetings, and/or work collectively on engaging in broad social issues.  We will investigate what organizations in other sectors do toward collective action and consider what AEA members can design and set in motion in the next few months.

Cool Tricks:

One example of AEA action is the Community Psychology TIG sponsoring “Walk the Talk” sessions at annual conferences.  In Atlanta this year, the TIG will visit the Georgia Justice Project: http://www.gjp.org/. Participants will have a chance to interact with project staff and members and learn about their evaluation questions, challenges, successes, and needs.  Other AEA members are highlighting small, local nonprofits in workshop sessions as case examples so that AEA members can provide insights and suggestions that might otherwise be inaccessible.

Lessons to Be Learned:

We expect there are many other examples of how AEA members are working together to contribute to organizations and issues outside of our clients, colleagues or institutions.  This session will be a great way to hear all about those ideas and come up with others.

Hot Tip:

As you pack your bags for your trip to the city that was key to the civil rights movement, come with ideas of what we, as a membership, can do collectively to advance positive social change.

Oh – and remember to come with a warm smile and an empty stomach!



I am Arnold Love from Toronto, the recent host city of the 2015 Para- and Pan American Games. Toronto also hosted the first Accessibility Innovation Showcase to mark the 25th Anniversary of the Americans With Disabilities Act and the 10th Anniversary of the Ontarians with Disabilities Act.

My evaluation interests include both sports and accessibility, so I want to share with you a powerful and enjoyable way of increasing evaluation use, called Jane’s Walk. It was a pivotal feature of the Para- and Pan Am Games and the Accessibility Showcase.

Jane’s Walk is named after Jane Jacobs, noted researcher and author of The Death and Life of Great American Cities. Jacobs championed the use of direct observation through “eyes on the street” and direct engagement to understand the “messy and complex systems” that comprise the urban landscape and to mobilize findings into action.

Rad Resource: Jane’s Walk is an informal walking tour. Check out the Jane’s Walk website to find out how walks “get people to tell stories about their communities, explore their cities, and connect with neighbors.”

Hot Tip: Several walks take place at the same time, each on a different theme. Local volunteers organize them based on their interests and expertise. For example, one walk during the Accessibility Innovation Showcase explored ideas to make busy intersections and entry to stores more accessible.

Hot Tip: Invite people of different ages and backgrounds to participate. The informal nature of Jane’s Walk encourages each person to voice their perspectives based on unique experience and insights. This energizes the conversations.

Hot Tip: Evaluators need diverse yet balanced views of the discussion topics. Facilitate this by finding two people with different viewpoints to co-lead each walk.

Hot Tip: Taking notes shuts down the trust and free exchange of ideas that are the hallmark of the Jane’s Walk. Instead, tweet your notes to yourself and encourage the other walkers to tweet their comments and ideas or share on social media.

Rad Resource: Adding an incentive can greatly increase use of the findings coming from the Jane’s Walk methodology. Check out how Jane’s Walk partnered with Evergreen CityWorks to offer micro-grants to implement the best ideas (http://janeswalk.org/canada/toronto/grants) with little money, but big results.

Rad Resource: Change Jane’s Walk into a game by geocaching. Hide small items (toys, badges, stories) in locations that fit a specific evaluation theme, such as a coffee shop with an accessible ramp. Then log the coordinates and cache description on http://www.geocaching.com. Use the app to find the cache. It’s fun!

Evaluation 2015 Challenge: Organize a few Jane’s Walks for AEA 2015. It’s a great opportunity to experience the methodology first-hand and to get to know Chicago and other AEA members better.


 

Happy Pi Day! I’m Sheila B Robinson, aea365’s Lead Curator and sometimes Saturday contributor. If you had an internet browser open to this site today, 3.14.15 at precisely 9:26:53 am (EST) looking for today’s post to be about Pi Day, then you probably know that today is a particularly prodigious Pi Day as we have a date and time that represent the first 10 digits of pi, something that will not occur for another 100 years (OK, to be fair, another 12 hours, then another 100 years!).

Lesson Learned: Pi Day is celebrated internationally each year in deference to every mathematician’s favorite constant – the ratio of the circumference of a circle to its diameter – which is approximately 3.141592653… and has now been calculated to over 1 trillion digits! (Back in my school days, we only had to memorize 3.14.) Pi enjoys an illustrious history, and you can read more about it and actually see one million of those daunting digits on www.piday.org.

Apple Pi by Alex Cockroach via Flickr

Now, as we slickly segue from the consistently celebrated constant to the perpetually plagued pie chart, let’s take a moment for a 360-degree look at pie chart perspectives from around the blogosphere:

We can only surmise how Cole Nussbaumer, of storytelling with data, feels about pie charts with her evocatively titled 2011 post, death to pie charts. Perhaps not surprisingly, it opens with, “I hate pie charts. I mean, really hate them.” Nussbaumer explains, “My main beef with pie charts … is this: our eyes aren’t good at attributing quantitative value to two dimensional spaces. In English: pie charts are really hard for people to read!”

Pie charts receive a bare modicum of redemption from Slate in a 2013 article, In Defense of Pie Charts wherein author Matthew Yglesias declares (somewhat not-so-convincingly, as it’s by way of a double negative), “it’s by no means true that pies are never the right way to go.”

Back in our own evaluator neighborhood, Kim Firth Leonard of the actionable data blog stands boldly on the fence, declaring, “I’m not entirely ready to abandon using pie charts in my own practice” in her 2012 post, My love hate relationship with pie charts. And pie.

And evaluation’s dataviz darlings Stephanie Evergreen and Ann K Emery have allowed pie in moderation with only the simplest of ingredients, of course, as elucidated in their recipe card for gorgeous graphs, the Data Visualization Checklist (downloadable from either of their sites).

Coming full circle (seriously, how could we not?) with a return to Pi, we can perhaps see some esoteric connections to evaluation in this fact, courtesy of www.piday.org: “As an irrational and transcendental number, [pi] will continue infinitely without repetition or pattern.”


Dear Evaluators, will you be my valentine? Sheila B Robinson here, aea365’s Lead Curator and Saturday contributor, feeling a bit mushy today.

On holidays, we are reminded of the way we feel about life, ourselves, and each other. We express thanks on Thanksgiving, appreciation for military personnel on Memorial Day, and hope for the future on New Year’s Day. And of course, today is a day we traditionally express love for one another.

Lesson Learned: I love evaluation work and evaluators, but when I hear myself say that, I wonder why. With my first career – teaching – the reason for my passion was clear. I love people. I love teaching and learning. I love working on teams, taking pleasure in our shared passion for the work, and admiring colleagues who share so generously their time, attention, and knowledge. I love rooting for the underdog, helping build up those who are down, and I find tremendous joy in watching them succeed.

But why evaluation? Sure, I love collecting and analyzing data as much as the next evaluator, but is that really it?

I offer two brief anecdotes as illustration:

1. For a graduate school course on qualitative research, I studied a small local music store and its owner. His passion was singular, his work ethic admirable, and his mantra was “it’s ALL about the music.” Despite all, business was not booming. I wondered what made the operation tick, so to speak, as it had been barely surviving (from a fiscal perspective) for decades, yet wildly popular with its cult following. My findings? It was not “all about the music.” It was about the people, their shared passion and connections, and social relationships. Music was just the raw material.

2. I started a conversation about cars with the owner of a very successful automotive business and received a tepid response to my excitement about a particular model. I asked, “So then, what cars do you love?” His response: “I don’t really love cars. I’m a businessman. I love running a business. Cars are just the vehicle – uh, no pun intended.”

So why evaluation?  Same reasons as above. It’s all there in evaluation work as well, as I’m certain it is in law, medicine, or business too.

Rad Resource: YOU. Inspiration for this post came from an interview I granted to grad students from my university. They mentioned they had also interviewed two of the biggest names in evaluation. The students then asked me why, with all the blogs, free resources, accessibility and approachability of the “rock stars,” evaluators seem to be so generous with their time, knowledge, and intellectual property when that doesn’t appear to be the case in other disciplines. I think I have the answer.

LOVE.

Image credit: seaside rose garden via Flickr

 



Hi! We are Kathryn Sielbeck-Mathes and Rebecca Selove, co-authors of Chapter 6 of “Feminist Evaluation and Research: Theory and Practice”. In our chapter, based on three evaluations of substance abuse treatment programs for individuals with co-occurring mental illness and substance abuse issues, we discuss the importance of framing and shared understanding between evaluators and evaluation stakeholders.

Lesson Learned: Although our work linked closely with our own values of fairness, social justice, and gender equity within social programming, we did not spend sufficient time understanding the differing values, language, perspectives, and frames of the program manager and his staff. Instead, we assumed we were all interpreting trauma in the same ways and sharing the same values around addressing trauma during treatment specifically, and around programming for women in general. In hindsight, building this understanding should have held the same importance in the evaluation as monitoring fidelity and measuring outcomes.

Hot Tip:

In order to gain attention and respect for the adoption of feminist frameworks, principles, and values for conducting program evaluation, it is imperative that we frame our conversations to connect rather than compete, align rather than malign and foster acceptance rather than objection from those we need to communicate to and with. This requires an understanding of their position on issues that follow from the language or lens of their value and belief systems.

Lesson Learned: Connecting through words, images, symbols, and stories grounded in values helps make solutions accessible and relevant to program stakeholders, service organizations, and funding agencies. Linking an issue to a widely held cultural value or belief helps start the framing process by appealing to program managers and staff, increasing their interest in learning more.

Hot Tip:

If it seems as if you are not being heard… you probably are not. A feeling of frustration can be a signal that reconstruction of a shared meaning based upon shared values is necessary!

Lesson Learned: Key tasks associated with feminist evaluation include 1) understanding the problem from the perspective of the women the program is designed to serve, 2) studying the interior and external context of the program to understand the realities and lived experiences of women, and 3) identifying the invisible structures that can undermine even the most diverse, gender-responsive, trauma informed program.

Hot Tip:

Feminist evaluators must engage in attentive conversations with those implementing and managing human service/treatment programs, listening closely for congruence and dissonance regarding the feminist frame. From the outset of a program evaluation, the feminist evaluator must be mindful and prepared for changing assumptions and language/communication that perpetuates injustice and the disempowerment of women.

Rad Resource: Combating structural disempowerment in the stride towards gender equality: an argument for redefining the basis of power in gendered relationships.

The American Evaluation Association is celebrating Feminist Issues in Evaluation (FIE) TIG Week with our colleagues in the FIE Topical Interest Group. The contributions all this week to aea365 come from our FIE TIG members.


My name is Naomi Walsh. I am an independent consultant working primarily with nonprofits. Today, I want to share information about a favorite, free, online journal.

Rad Resource: NTEN is the Nonprofit Technology Network. Annual membership is only $85 per year, about the same as AEA’s, and if you work with nonprofits at all – even if you aren’t a tech guru – NTEN is a great resource. They offer lots of training opportunities and a lively community, much of it free for members.

Rad Resource: NTEN’s online journal, NTEN: Change (A Quarterly Journal for Nonprofit Leaders), is completely free! You need only fill out a very short subscription form. Their September issue is topical for evaluators in that the focus is “I Love Data.” It includes an interview with Mayur Patel of the Knight Foundation on tracking and demonstrating nonprofit impact, and a Feature Focus, How Your Organization Can Embrace Data and Use What It Can Teach You, by Katie Delahaye Paine.


Greetings aea365 community! I’m Ann Emery and I’ve been both an external evaluator and an internal evaluator. Today I’d like to share a few of the reasons why I absolutely love internal evaluation.

Lessons Learned: Internal evaluation is a great career option for fans of utilization-focused evaluation. It gives me opportunities to:

  • Meet regularly with Chief Operating Officers and Executive Directors, so evaluation results get put into action after weekly staff meetings instead of after annual reports.
  • Participate on strategic planning committees, where I can make sure that evaluation results get used for long-term planning.

Lessons Learned: Internal evaluators often have an intimate understanding of organizational history, which allows us to:

  • Build an organizational culture of learning where staff is committed to making data-driven decisions.
  • Create a casual, non-threatening atmosphere by simply walking down the hallway to chat face-to-face with our “clients.” I hold my best client meetings in the hallways and in the mailroom.
  • Use our organizational knowledge to plan feasible evaluations that take into account inevitable staff turnover.
  • Tailor dissemination formats to user preferences, like dashboards for one manager and oral presentations for another.
  • Participate in annual retreats and weekly meetings. Data’s always on the agenda.

Lessons Learned: Internal evaluators can build evaluation capacity within their organizations in various ways:

  • I’ve co-taught Excel certification courses to non-evaluators. Spreadsheet skills can help non-evaluators feel more comfortable with evaluation because it takes some of the mystery out of data analysis.
  • I’ve also led brown bags about everything from logic models to research design. As a result, I’ve been more of a data “coach,” guiding staff through evaluation rather than making decisions on their behalf.

Hot Tips: Internal evaluators can use their skills to help their organizations in other ways, including:

  • Volunteering at program events. When I served food to child and teen participants at Thanksgiving, my time spent chatting with them helped me design more responsive data collection instruments.
  • Contributing to organization-wide research projects, such as looking for patterns in data across the participants that programs serve each year.
  • Partnering with graduate interns and external evaluators to conduct more in-depth research on key aspects of the organization.

Cool Trick: Eun Kyeng Baek and SeriaShia Chatters wrote about the Risks in Internal Evaluation. When internal evaluators get wrapped inside internal politics, we can partner with external evaluators like consulting firms, independent consultants, and even graduate interns. Outsider perspectives are valuable and keep things transparent.

Rad Resources:

AEA is celebrating Internal Evaluators TIG Week. The contributions all week come from IE members.

