We are Tracy McMahon (Education Development Center) and Gary Silverstein (Westat) on behalf of the NSF INCLUDES (Inclusion Across the Nation of Communities of Learners of Underrepresented Discoverers in Engineering and Science) Coordination Hub. The Hub supports NSF INCLUDES and its National Network, which brings together the power of many in a shared vision to shift inequitable systems and broaden participation in STEM education and careers. We’re sharing how we collaborate with Network members to create a shared measurement system that tracks the progress of many different organizations toward this vision for the future of diversity, equity, and inclusion in STEM.
Have you ever tried to complete a puzzle from the inside out without knowing the overall shape?
There’s a challenge in trying to build a shared measurement system without pre-established boundaries. When we first set out, a primary goal was to aggregate outcome data from as many existing sources as possible. We quickly discovered that aggregating outcomes across projects that operate in varied contexts is difficult, especially when they aim to achieve distinct results across diverse systems. As a result, our strategy shifted from aggregating outcomes to devising a framework for categorizing a diversity of outcomes.
Hot Tip: Work with your community to drive the process.
An effective strategy for using shared measures to examine the combined progress of multiple organizations is to collaborate with members of the community who will use the system. We used our Shared Measures Workgroup to vet our original ideas, generate new ideas, create common measurement tools, facilitate data collection, and inform our analyses. Working with Network members allowed us to leverage existing collection efforts and develop new tools that met a shared need for actionable data. This included co-creating a survey to measure projects’ progress in operationalizing the NSF INCLUDES Five Elements of Collaborative Infrastructure.
Lesson Learned #1: You must be nimble.
Bringing in end-user perspectives can result in new priorities, enhanced data collection tools and procedures, and refinements to the analysis and display of results. It can also surface ideas not previously considered or planned for, which is great but can mean you have to pivot quickly. Our team needed to be nimble, adapting quickly whenever members showed us better ways of structuring the survey or framing our analysis of the data.
Lesson Learned #2: Shared measures can help people learn and connect.
In terms of outcome data, we are developing an inventory of cradle-to-career outcomes for projects engaged in broadening participation in STEM. The purpose is to showcase outcomes and associated learning across the Network. While these data cannot be aggregated, we hope the structure of the inventory will facilitate connections around specific discrete outcomes. As with other topics Network members want to connect around, it’s important to provide space for discussions of how common outcomes are measured and achieved. Members have used Network affinity groups to build their own and others’ capacity in this area. The inventory will give Network members another means by which to share tools and knowledge.
Rad Resource
There are many great ways to engage with the NSF INCLUDES National Network. Check out the Shared Measures platform to learn more about the Network’s progress in reaching a diverse population, developing a collaborative infrastructure, and promoting equity and systems change in STEM academic and career pathways. Join the Evaluation and Shared Measures Affinity Groups (login required, but it’s free and easy) to connect with others.
Acknowledgement
None of this would be possible without the insights, knowledge, and dedication of the Shared Measures Workgroup members. We’ve enjoyed the hours working on this together with all of you.
The American Evaluation Association is hosting STEM Education and Training TIG Week with our colleagues in the STEM Education and Training Topical Interest Group. The contributions all this week to AEA365 come from our STEM TIG members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.