AEA365 | A Tip-a-Day by and for Evaluators

TAG | data management

Hello AEA365!  I’m Paul Collier. Over the last two years I worked as the Data and Evaluation Manager at the San Francisco Child Abuse Prevention Center (SFCAPC), a mid-size nonprofit focused on ending child abuse in San Francisco. In my time there and as a freelancer, I can’t count the number of times I’ve fielded questions from staff about data their organization has collected. They often go something like this…


How frustrating! But as someone serving as an internal evaluator or data analyst at an organization, I have to remind myself that questions like these are my friend. When my staff asked me questions about their data, I knew they were engaged and interested in using it. But I often found that the first questions they asked weren’t the ones that would really help them make decisions or improve their programs. This post is about helping your staff think critically and ask smarter questions about their data.

Hot Tip: Focus on highly strategic questions

Questions that can be answered with existing data come in all shapes and sizes. I first consider whether the results might help the organization improve or refine its programs. For example, questions that test the cause-and-effect relationships in our logic model, or the assumptions in our theory of change, can and should inform programming. A second mark of a strategic question is whether our team has expectations for the result. I often realized that our staff didn’t have expectations about average improvement or effect size, so I would find a few studies using comparable assessments and interventions to identify some benchmarks. Perhaps the most useful mark of a strategic question is whether our staff can take action based on the results. I found that if my staff couldn’t envision how the results might actually be used, it was wiser to help them think this through before spending my time (and theirs) analyzing the data.

Cool Trick: Plan for analysis

To be more strategic about the analysis questions I focused on, I built in time between the request for analysis and doing the work. An initial conversation with the program manager or staff to learn more about the context of a question usually helped me refine it to be more specific and actionable. I also found that batching analysis for a certain time of year was a useful planning approach that protected my time. I preferred to have this ‘analysis period’ in the winter, because my organization set its budget in the spring. That way, any changes to programming that resulted from the process could be planned for in the following year’s budget.

Rad Resources:

As you can tell, I think helping staff ask smarter questions is one of the most valuable things I do as an internal evaluator. For more reading on this topic, check out:

  • Michael Hyatt’s blog on asking more powerful questions: Michael Hyatt is a business author who provides clear, easy-to-understand advice to aspiring leaders on asking questions.
  • Peter Block’s book, Flawless Consulting: Block’s Flawless Consulting provides many helpful suggestions for structuring analysis processes so they influence action. There are also several great chapters about overcoming resistance in clients, which I’ve found highly relevant for dealing with the inevitable resistance to results within my team.
  • Roger Peng, Ph.D.’s e-book, The Art of Data Science: Peng illustrates what a practical data analysis approach looks like, framing each step as a process of setting expectations and understanding why results did or did not meet those expectations.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Sam Held and I am the Data Manager for Science Education Programs at ORAU (Oak Ridge Associated Universities). We are involved with science education and STEM workforce development from K-12 through postgraduate fellowships. I work on evaluations conducted internally (for programs we manage) and externally, in addition to all data reporting needs.

A recent trend in the STEM fields is the call to share or provide access to research data, especially data collected with federal funding. The result is a requirement from federal agencies for data management plans in grant proposals, though the requirements differ by agency: NSF requires a plan for every grant, while NIH requires plans only for grants over $500,000.

The common theme in all policies is that “data should be made as widely and freely available as possible while safeguarding the privacy of participants, and protecting confidential and proprietary data” (NIH’s Statement on Sharing Data, 2/26/2003). The call for a data sharing plan forces the PIs, evaluators, and others involved with the proposal to consider what data will be collected, how it will be stored and preserved, and what the procedures will be for sharing or distributing the data within privacy or legal requirements (e.g., HIPAA or IRB requirements). To me, the most important consideration here is data formatting: what format will the data be in now that will still be accessible and usable in the future, or to those who cannot afford expensive software?
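One practical answer to that formatting question is to keep a plain-text copy of each dataset alongside a small data dictionary, since CSV and JSON can be opened without any commercial software. As an illustrative sketch (not from the original post; the function and file names are hypothetical), here is how that might look in Python using only the standard library:

```python
import csv
import json
from pathlib import Path

def export_open_format(records, out_dir, name, codebook):
    """Write a list of dict records to a plain CSV file, plus a JSON
    data dictionary describing each column, so the data stays readable
    far into the future without proprietary software."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    fields = list(records[0].keys())
    # The data itself: one flat, self-describing CSV file.
    with open(out / f"{name}.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        writer.writerows(records)
    # The codebook: column names mapped to human-readable definitions.
    with open(out / f"{name}_codebook.json", "w", encoding="utf-8") as f:
        json.dump(codebook, f, indent=2)
```

The same idea applies whatever tool originally collected the data: however analysis is done, the archived copy of record stays in an open, documented format.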

Rad Resource: DMPTool – a website from the University of California system for developing data management plans. The best component of the site is its collection of funder requirements, including those for NIH, NSF, NEH, and some private foundations. The site also includes templates for the plans.

Rad Resource: Your local university – many universities have Offices of Research which have templates for these plans as well. For example, see:

Sam Held is a leader in the newly formed STEM Education and Training TIG. Check out our TIG Website for more resources and information.

The American Evaluation Association is celebrating STEM Education and Training TIG Week with our colleagues in the STEM Education and Training Topical Interest Group. The contributions all this week to aea365 come from our STEM TIG members.


My name is Dan Jorgensen and I currently serve as the evaluation and research coordinator for the State Personnel Development Grant at the Colorado Department of Education. My primary responsibilities involve the evaluation of six state initiatives: Response to Intervention, Positive Behavioral Interventions and Supports, Autism/Significant Support Needs, Early Childhood, Communities of Practice, and Family-School Partnerships. Needless to say, succeeding at this endeavor requires well-developed logistics for data management. The purpose of this aea365 contribution is to outline a simple process for organizing new or existing data structures (see Figure 1).

Lessons Learned: Appropriately addressing data management issues leads to more refined evaluations and analytics. In effect, time is spent performing evaluation responsibilities rather than constantly organizing, reformatting, and scrubbing the data.

  • Develop appropriate data tracking and monitoring tools. This includes, at a minimum, an event calendar with data collection and reporting deadlines; a task list to monitor day-to-day workflow; and a project notebook that clearly details your evaluation plan and all “processes” in case the proverbial “bus” finally hits you. If you’re managing multiple initiatives and a wide range of data collections, these tools are essential.

  • Extant data collection structures must be accurately located, identified, and understood. Your data may be collected via surveys (online or otherwise), rubrics, state/federal data systems, and other sources. The collection dates, tools, stakeholders, and locations of these data must be reliably determined so management structures can be established.
  • Determine how disparate data sources are maintained. Typically, data is maintained at a technical level matching the expertise of the “collector.” For example, field consultants responsible for data entry may only be comfortable using products such as MS Excel or MS Word. This leads to data structures organized as flat files and/or creates the need for duplicate data entry (e.g., re-entering Word documents into Excel). The problem is that flat files often limit reporting options and, if not organized correctly, block migration to relational database structures.
  • Consolidate data to a single location and format. This allows for the gradual modification of data structures into more advanced formats and makes it easier to build reports. For example, my preference is to convert Excel files to an MS Access database, with forms created for data entry. The reporting capabilities of the MS Access database then provide both immediate and continuous feedback on evaluation objectives. A possible next step is making existing databases web-based (e.g., via SharePoint), depending on the availability of funds and the need for an easily accessed data-entry platform.
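The consolidation step above can be sketched in code. The author used MS Access; as a minimal stand-in (this example is not from the post, and all names in it are hypothetical), the Python sketch below loads a folder of flat CSV files, representing the Excel exports, into one table of a single SQLite database, tagging each row with its source file for provenance. It assumes all files share the same column layout:

```python
import csv
import sqlite3
from pathlib import Path

def consolidate(csv_dir, db_path, table):
    """Load every flat CSV file in csv_dir into one table of a single
    SQLite database (a stand-in here for the MS Access step).
    Assumes all files share the same columns."""
    con = sqlite3.connect(db_path)
    first = True
    for path in sorted(Path(csv_dir).glob("*.csv")):
        with open(path, newline="", encoding="utf-8") as f:
            rows = list(csv.DictReader(f))
        if not rows:
            continue
        cols = list(rows[0].keys())
        if first:
            # Create the consolidated table on first file, with an
            # extra column recording which file each row came from.
            col_defs = ", ".join(f'"{c}" TEXT' for c in cols)
            con.execute(
                f'CREATE TABLE IF NOT EXISTS "{table}" ({col_defs}, source TEXT)'
            )
            first = False
        col_names = ", ".join(f'"{c}"' for c in cols) + ", source"
        placeholders = ", ".join("?" for _ in cols) + ", ?"
        con.executemany(
            f'INSERT INTO "{table}" ({col_names}) VALUES ({placeholders})',
            [tuple(r[c] for c in cols) + (path.name,) for r in rows],
        )
    con.commit()
    return con
```

Once everything lives in one database, reports become simple queries against a single table rather than copy-and-paste across spreadsheets.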


