AEA365 | A Tip-a-Day by and for Evaluators

TAG | Peer Review

I’m Bronwyn Mauldin, Director of Research and Evaluation at the Los Angeles County Arts Commission. I’m going to share the informal peer review process we use to improve the quality of our work.

Even if you’re not writing for an academic journal, you want to make sure your methods are rigorous, your findings watertight, and your final report lucid. How can you get an objective assessment prior to publication if your report doesn’t go through formal peer review? Ask an external colleague who works in the same field or uses similar methods to read it and give you feedback. In fact, ask two or three of them. Here at the LA County Arts Commission we’ve established a practice of doing this for every research or evaluation report we publish. It’s a simple idea we’ve found to be remarkably beneficial.

This practice is especially useful for those of us who work in the area some call “gray literature,” published by nonprofits, foundations, government agencies, and other non-academic institutions. While we may have the advantage of working closely with practitioners and subject-matter experts, we have less access to the kind of meticulous critique available in the academy.

Rad Resource: Your colleagues. Identify three or four experts outside of your organization, then ask them to review your report and comment on it. Provide guiding questions so they’ll pay attention to your key issues, but be open to whatever else they find. Be sure to credit your reviewers in the final report.

Lesson Learned: People can be remarkably generous with their time and expertise. We’ve sent reviewers reports that run to 70 pages or more, and others loaded with charts and graphs. Most people we’ve asked have delivered thoughtful, thorough feedback.

Lesson Learned: Timing and communication are critical. Reach out to potential reviewers to get their commitment early in the writing phase. Send them the finished report when the text and charts are complete (but before the design phase). Give reviewers enough time for the review, based on the length and complexity of the report, along with a clear deadline. It might take a reminder or two, but most people eventually come through.

Cool Trick: Don’t limit yourself to colleagues you know. Contact the top experts in your field – both academics and others. This is also a great way to raise your profile with experts you’d like to get to know.

If you’re an independent evaluator who wants to use informal peer review, you’ll probably need to let the institution you’re working for know what you’re planning in advance. Invite them to recommend experts to serve as reviewers.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


I’m Leslie Goodyear, and I’m a Program Officer at the National Science Foundation, in the Division of Research on Learning in Formal and Informal Settings (DRL). The programs in DRL include: Informal Science Education (ISE), Discovery Research K-12 (DR K-12), Research and Evaluation on Education in Science and Engineering (REESE), and Innovative Technology Experiences for Students and Teachers (ITEST). I have a hot tip about how to become a proposal reviewer for NSF/DRL.

NSF’s proposal merit review process generally includes review by outside experts. For DRL, experts in Science, Technology, Engineering and Mathematics (STEM) education, research methods, learning sciences, evaluation, and other areas are typically brought together in panels. They discuss the relative merits of the proposals and offer their best thinking to NSF program officers based on two primary review criteria: intellectual merit and broader impacts. Advice from reviewers and panels is critical in informing program officers, who make the recommendations for awards.

As a panel reviewer, you’ll read about 15 to 20 proposals (each is about 15 pages long); write reviews for about six to eight proposals; join about 10 to 12 colleagues in a two-day review panel in Arlington, Va., home to NSF; discuss the proposals and the reviews; and rate the proposals as a priority for funding. Reviewers who travel to NSF are paid a stipend for the days they serve on the panel and their travel is covered by NSF; ad hoc reviewers, who normally review just one or two proposals without serving on a panel, are not paid. In addition to providing a valuable service to the NSF and the field, you’ll learn a lot about what makes a good proposal and how the review process at NSF works. Most people who participate think it’s a great professional development opportunity.

Because DRL programs require project evaluation, the proposals submitted include evaluation plans. Thus, DRL always needs experienced, competent evaluation professionals to gauge the quality of these plans. We primarily look for evaluators who have experience conducting evaluations of STEM education programs. We also look for evaluators with strong methodological training, experience with formal or informal educational settings (in-school or out-of-school), expertise in evaluating research, and practical expertise in evaluating community programs.

Hot tip: If you’d like to be considered for serving as a proposal reviewer, first go to the NSF website (below) and learn about our programs by reading the program solicitations. Then send your CV and a cover letter with a bit about yourself, your expertise and experience, and the program(s) for which you’d like to serve as a reviewer to me, Leslie Goodyear, lgoodyea@nsf.gov. I will then forward them to the appropriate cluster within the division.

NSF DRL Website: http://nsf.gov/div/index.jsp?div=DRL

This contribution is from the aea365 Daily Tips blog, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org.


I’m Sally L. Bond, President and Senior Consultant of The Program Evaluation Group, LLC. In 2004, Marilyn Ray (Finger Lakes Law & Social Policies Center, Inc.) and I began developing a peer review process for the American Evaluation Association’s Independent Consulting TIG (IC TIG). One of the defining objectives of the IC TIG is to provide members, especially sole proprietors and those in very small firms who often “fly without a net,” with support mechanisms to enhance the quality of their evaluation work. We therefore conceptualized the IC TIG’s peer review process as a professional development opportunity for our members, both reviewers and reviewees. The process engages professional evaluation colleagues as critical reviewers of other colleagues’ evaluation reports, with the purpose of providing feedback to inform and improve their practice.

Rad Resources: Although participation in the IC TIG’s double-blind peer review process is currently available only to members of the IC TIG*, the peer review guidelines and framework are accessible to anyone interested in improving the quality of written evaluation reports. You can find them online in the public AEA eLibrary at http://bit.ly/ICReview. The IC TIG’s “Framework for Peer Reviewers, Evaluation Reports” addresses many of the issues and report elements covered in the “Evaluation Report Checklist” by Gary Miron of Western Michigan University (see http://www.wmich.edu/evalctr/checklists/checklistmenu.htm). Miron’s checklist is a useful tool for designing reports and collaborating with colleagues and clients on report preparation. The IC TIG’s peer review framework serves a different purpose: it was specifically designed to elicit detailed written feedback from professional peers who have technical expertise in program evaluation.

Hot Tip: Are you interested in doing some self-guided professional development in report writing? Use the “Framework for Peer Reviewers, Evaluation Reports” as a guide to reflect more deeply on the quality of your written work.

Hot Tip: Would you like to start a professional peer learning group in your evaluation office or company? Use the “Guidelines for Peer Reviewers” to orient colleagues to the purpose and expectations of the peer review process, then use the “Framework for Peer Reviewers, Evaluation Reports” to structure the group learning process.

Hot Tip: Are you part of a large organization that wants to initiate an internal double-blind peer review process? Use the “Instructions for Evaluators Submitting an Evaluation Report for Peer Review” and the “Submitting Evaluator Cover Sheet” to gather contextual information that is relevant to the review of the submitted report.

Hot Tip: Are you providing technical assistance to evaluation consumers who have little knowledge of what constitutes a really good evaluation report? Use the “Framework for Peer Reviewers, Evaluation Reports” to develop appropriate expectations for high quality deliverables.

*Both AEA and the IC TIG welcome new members. All members of the IC TIG are members of AEA. You can join the association – and subsequently the IC TIG – online at http://www.eval.org/membership.asp. Every AEA member may join up to five TIGs at no additional cost.

