Approaching Document Review in a Systematic Way by Linda Cabral

Greetings, AEA365 readers. I’m Linda Cabral from the University of Massachusetts Medical School’s Center for Health Policy and Research. Many evaluations that I’ve been a part of in my 15+ year career have required a review of existing program documents. This has involved a range of documents such as program descriptions, meeting minutes, proposals, and grantee reports. A document review can serve many purposes. Often, it provides the background necessary to formulate your primary data collection tools. Other times, document review can be your sole data collection method, when your evaluation only requires descriptive information such as the number and type of sites or a description of participants and program costs. Funders appreciate this data collection method because it does not pose a burden on program staff, as the data already exist. Regardless of the main purpose of your document review, I’ve found it helpful to approach this type of review in a systematic way.

Hot Tips:

  • Catalog the types of documents available to you – Depending on whether your evaluation is of a single site or multiple ones, the amount and types of documents will vary. Ideally, all of the documents will be available electronically to make analysis and file sharing among teammates easier.
  • Develop a data abstraction/collection form – Not everything in the documents to be reviewed will be useful to your evaluation, so it is important to identify upfront the types of data that will best inform your evaluation questions. Once you have done so, creating a data collection table or coding framework of the types of information you seek to abstract from your documents will aid in your overall analysis. I have found Excel spreadsheets useful for this purpose; they make it easy to sort the data when there are multiple ways to review what you’ve collected.
  • Ensure that teammates are approaching data review consistently – If documents are being reviewed by more than one person, it is important to establish some inter-rater reliability checks to make sure that the data are being abstracted/collected/reviewed in the same way.
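The inter-rater reliability check in the last tip can be as simple as comparing how often two reviewers assigned the same code to the same document. Below is a minimal sketch of two common measures, percent agreement and Cohen's kappa (which corrects agreement for chance); the reviewer names and code categories are hypothetical, not from any particular evaluation.

```python
# Sketch of an inter-rater reliability check for two reviewers who coded
# the same set of documents. Categories and data below are illustrative.
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Share of documents the two reviewers coded identically."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Agreement corrected for chance: (po - pe) / (1 - pe)."""
    n = len(rater_a)
    po = percent_agreement(rater_a, rater_b)  # observed agreement
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Expected chance agreement from each reviewer's marginal code frequencies
    pe = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical codes two reviewers assigned to the same six documents
reviewer_1 = ["costs", "sites", "costs", "participants", "sites", "costs"]
reviewer_2 = ["costs", "sites", "costs", "sites", "sites", "costs"]

print(percent_agreement(reviewer_1, reviewer_2))        # 5 of 6 documents match
print(round(cohens_kappa(reviewer_1, reviewer_2), 2))
```

If agreement is low, the usual remedy is to refine the coding framework's definitions and re-code a sample of documents together before dividing up the rest.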

Lessons Learned:

  • With potentially volumes of data at your disposal, it can be easy to get distracted by data that is not relevant to your evaluation questions. Stay focused!
  • Be prepared for inconsistency within and across data sources. Determine how missing data will be treated. Make sure to treat missing data differently from data that is not applicable (n/a). Ascertain before the evaluation begins whether any follow-up with the original sources is within the evaluation scope.  If not, do the best with what you’ve got!
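The missing-versus-n/a distinction above matters as soon as you start summarizing: a not-applicable entry should be excluded from a calculation on purpose, while a missing entry may warrant follow-up. A minimal sketch of keeping the two separate in an abstraction table (the site names and cost field are hypothetical):

```python
# Sketch of treating missing data differently from not-applicable (n/a)
# entries in a document-review abstraction table. Fields are illustrative.
MISSING = None          # the document should report this field but does not
NOT_APPLICABLE = "n/a"  # the field does not apply to this document

records = [
    {"site": "A", "program_cost": 12000},
    {"site": "B", "program_cost": MISSING},         # report lacked a budget
    {"site": "C", "program_cost": NOT_APPLICABLE},  # site runs no paid program
]

# Only real values enter the summary; n/a sites are excluded deliberately,
# while missing values are flagged for possible follow-up.
valid = [r["program_cost"] for r in records
         if r["program_cost"] not in (MISSING, NOT_APPLICABLE)]
needs_followup = [r["site"] for r in records if r["program_cost"] is MISSING]

print(sum(valid) / len(valid))  # average cost across sites with real values
print(needs_followup)           # sites to revisit, if follow-up is in scope
```

Collapsing both cases into one blank cell would silently shrink the denominator and hide which sites still need follow-up.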

Rad Resources: 

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
