ACM TIG Week: Evaluating our Impact: Priorities and Instruments by Kari Ross Nelson

My name is Kari Ross Nelson. I am the Research and Evaluation Associate at Thanksgiving Point Institute and an independent consultant to museums and other informal learning organizations in the greater Salt Lake City area. My team at Thanksgiving Point is the research partner on an IMLS-funded project, Measurement of Museum Social Impact (MOMSI). This post shares background on MOMSI and lessons we have learned along the way.

Questions of impact were being raised even before the challenges of the COVID-19 pandemic and the reckoning with racial and social inequities in the US over the past few years. Those questions are even more critical now, not only because of competitive funding opportunities, but because of the need to live true to the core values museums espouse. By developing an open-access toolkit, the MOMSI project aims to help museums of all sizes assess their social impact and evaluate themselves in terms of service to their communities.

In today’s social climate, many in the museum field feel the need to evaluate their past and current practices and strategize about what kind of impact they want to have. In fact, the 2022-2025 strategic plan for the American Alliance of Museums lists “Social and Community Impact” as one of four priorities. When museum professionals talk about making an impact, they are often referring to educational, economic, and social dimensions. The first two are relatively straightforward to document. Social dimensions, however, are more nuanced. Yet measuring social impact is one way museums can move forward and evaluate how they are (or are not) of service to their communities.

At MOMSI, we’re defining social impact as the effect of an activity on the social fabric of a community and the well-being of the individuals and families who live there, and measuring it based on four long-term outcomes: health and wellbeing, valuing diverse communities, continued education and engagement, and strengthened relationships. The project has recruited 38 museums across the United States to pilot an instrument that generates data about the social impact of museums on individuals in their communities, while informing a toolkit (available late 2023) that will increase the capacity of museums to address this need individually in the future.

[Image: logos of the MOMSI partner museums — Arkansas State University Museum; Atlanta History Center; Bellevue Botanical Garden; Carter County Museum; Chazen Art Museum; Children’s Museum of Indianapolis; Conner Prairie; Cradle of Aviation Museum and Education Center; Crystal Bridges Museum of American Art; Desert Botanical Garden; Florence Griswold Museum; Franklin Park Conservatory and Botanical Garden; Fresno Chaffee Zoo; Gallery One; Greensboro History Museum; Jackson Hole Children’s Museum; Jule Collins Smith Museum of Fine Arts at Auburn University; Kemper Museum of Contemporary Art; Los Angeles Zoo and Botanical Gardens; Minneapolis Institute of Art; Minnesota Historical Society, Minnesota History Center; Molly Brown House Museum; Montshire Museum of Science; Museum of Science, Boston; National Aquarium; Oklahoma City Zoo and Botanical Garden; Pérez Art Museum Miami; Plains Art Museum; Queens Botanical Garden; Rochester Museum and Science Center; Rockwell Museum; San Diego Chinese Historical Society and Museum; St. Louis Zoo; The Calaboose African American History Museum; The Glazer Children’s Museum; The Morton Arboretum; University of Michigan Museum of Natural History; Utah Museum of Contemporary Art.]

Lessons Learned

Recognize that measuring impact is tricky! As with any evaluation, start by defining what you mean by “impact” and your organization’s motivations for wanting to know about it.

Developing a vetted and validated instrument is a complex undertaking. If this is not part of your existing skill set, gather advisors who can guide you through the process, and be prepared for it to take time.

Be prepared to defend your methods choices. There are a lot of definitions of impact, approaches to measuring it, and, as we’ve found, a lot of people who will closely examine your work.

Likewise, be open to critique. Sharing your work at conferences is a great way to get helpful feedback, but be prepared to be pushed to new levels. We’ve dreaded some feedback we’ve received, knowing it means extra effort, but the increased validity and earned credibility are worth it (see “be prepared to defend your methods” above!).

Rad Resources

  • Measuring Museum Impact and Performance: Theory and Practice: Museum analyst and planner John W. Jacobsen provides the theoretical underpinnings as well as the practical application of measuring a museum’s impact. He presents 1,025 indicators drawn from 51 expert sources, allowing individual museums to consider their own priorities and purposes for measuring impact.
  • Measurement Theory and Applications for the Social Sciences: Yes, a textbook, but a very useful reference! It begins with a history of measurement, then thoughtfully guides the reader through scale development, item writing and analysis, and reliability and validity, as well as more advanced topics.
  • MOMSI Project: The website of the MOMSI project. You’ll find a description of the project and updates on current progress.
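To make the reliability topics that measurement textbooks cover a little more concrete, here is a minimal sketch of Cronbach’s alpha, a standard internal-consistency statistic for multi-item survey scales. The function and the Likert-scale data below are hypothetical illustrations, not part of the MOMSI instrument:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
# The survey responses below are invented purely for illustration.
from statistics import variance

def cronbach_alpha(items):
    """items: one list of respondent scores per survey item."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return k / (k - 1) * (1 - sum(variance(it) for it in items) / variance(totals))

# Three 5-point Likert items answered by five (hypothetical) respondents:
item_scores = [
    [4, 5, 3, 5, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 3, 4, 2],
]
print(round(cronbach_alpha(item_scores), 2))  # prints 0.94 for these items
```

Values above roughly 0.7–0.8 are conventionally read as acceptable internal consistency, though the thresholds (and the statistic’s limitations) are exactly the kind of nuance the textbook above walks through.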

I have no doubt many reading this will have additional hot tips and rad resources for measuring social impact. Please share them in the comments below!


The American Evaluation Association is hosting Arts, Culture, and Museums (ACM) TIG Week. The contributions all week come from ACM TIG members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.
