
ACA TIG Week: Jessica Sperling on Initiating an Organization’s Evaluation Program

My name is Jessica Sperling, and I work in research and evaluation for education, youth development, arts/media/culture, and civic engagement programs. I am currently a researcher with the City University of New York (CUNY), where I consult on evaluation for educational programs and for StoryCorps, a storytelling/narrative-sharing program. Before joining CUNY, I developed StoryCorps’ evaluation program as an internal staff member.

Developing an organization’s research and evaluation program can be challenging for myriad reasons: non-intuitive outcomes and “hard-to-measure” desired impact, the existence of many distinct sub-programs, dynamic organizational priorities, resource limitations, and more. The fact is, however, that many organizations with these characteristics must nonetheless proceed and progress in evaluation. I thus outline select lessons in initiating and implementing an evaluation program at such organizations, drawing from my work with StoryCorps and other early-stage organizational evaluation programs.

Lessons Learned:

Start with the big picture. Begin evaluation planning with a theory of change and a macro-level evaluation framework focused around organizational goals. This should be obvious to evaluators, but you may need to make its value clear to program stakeholders, particularly if they prefer that you dive straight into data collection and results. In addition to permitting focused evaluation, this can also contribute to overall organizational reflection and planning.

Utilize existing research to inform projects and draw connections. Literature review is integral, and definitely a step not to be skipped! Previous research can inform your anticipated outcomes, situate your program within a larger body of work, and support the causal link between measured/observed outcomes and the organization’s broader desired impacts – a link you may not be able to demonstrate empirically through your own work.

Highlight evaluation for organizational learning. Overtly frame evaluation as an opportunity for strategic learning, rather than as a potentially punitive assessment. Highlight that even seemingly negative results have value, because they permit informed programmatic change; most programs naturally change over time, and evaluation results, including formative evaluation, help the program do so in an intentional way. This perspective can promote stakeholder buy-in and develop a culture of evaluation.

An unusual or outside-the-box program doesn’t preclude rigor in research methods. In some cases, having relatively difficult-to-measure or atypical program goals may lead to a presumption (intentional or otherwise) that the methods involved in such evaluation will be less rigorous. This, however, is not a given conclusion. Once short-term outcomes are defined – and they should always be defined, even if doing so takes some creativity or outside-the-box thinking – the approach to measurement should incorporate intentional, informed, and methodologically appropriate evaluation design.

Hot Tip: Spend time and energy building positive relationships with internal programs and staff, and with potential external collaborators. Both, in their own ways, can help foster success in evaluation implementation and use.

The American Evaluation Association is celebrating Arts, Culture, and Audiences (ACA) TIG Week. The contributions all week come from ACA TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
