AEA365 | A Tip-a-Day by and for Evaluators

TAG | checklists

We are Kelly Robertson and Lori Wingate, and we work at The Evaluation Center at Western Michigan University and EvaluATE, the National Science Foundation-funded evaluation resource center for Advanced Technological Education (ATE).

Rad Resource:

We’re excited to announce our new rad resource, the “Checklist of Program Evaluation Report Content.” We created this checklist to address a need for practical guidance about what should go in a traditional evaluation report—the most common means of communicating evaluation results. The checklist is strictly focused on the content of long-form technical evaluation reports (hence, the name). We see the checklist as complementary to the exciting work being done by others to promote the use of evaluation through alternative ways of organizing, formatting, and presenting data in evaluation reports. If you want guidance on how to make your great content look good, check out the new Evaluation Report Guidance by the Ewing Marion Kauffman Foundation and Evergreen Data.

How is our checklist on reporting different from others you may have come across?

  • It not only lists key elements of evaluation reports, but it also defines these elements and explains why they are relevant to an evaluation report.
  • Its focus is not on judging the quality of a report. Rather, our checklist is intended to support practitioners in making informed decisions about what should be included in an evaluation report.
  • It’s not tailored to a specific type of program or evaluand and is presented as a flexible guide rather than rigid specifications.

We hope multiple audiences find the checklist useful. For example, new evaluators may use it to guide them through the report writing process. More experienced evaluators may reference it to verify they did not overlook important content. Evaluators and their clients could use it to frame conversations about what should be included in a report.

Lesson Learned:

It takes a village to raise a great checklist. We received feedback from five evaluation experts, 13 of our peers at Western Michigan University, and 23 practitioners (all experts in their own right!). Their review and field testing were invaluable, and we are so grateful to everyone who provided input—and they’re all credited in the checklist.

Like checklists? See the WMU Evaluation Center’s Evaluation Checklists Project for more.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Ray Kennard Haynes and I am an Assistant Professor at Indiana University-Bloomington with a keen interest in domestic racial Diversity in Higher Education (HE). Since the 1970s, the United States (U.S.) has attempted to address Diversity by focusing primarily on race and gender through Equal Employment Opportunity (EEO) legislation. This legislation produced some gains; however, those gains have now eroded and are under threat due to legal challenges.

HE institutions in the U.S. have ostensibly embraced Diversity and even claim to manage it. Evidence of this commitment can be seen in the proliferation of Diversity offices and programs at HE institutions and in the advent of the Chief Diversity Officer (CDO) position. The casual observer could reasonably conclude that Diversity has been achieved in HE; surely the CDO position and ubiquitous Diversity commitment statements are evidence of this reality. Note, too, that the term university can be construed as “the many and different in one place.” Given this meaning, and the fact that one in every two U.S. residents will be non-white by the year 2050, Diversity in higher education would seem a fait accompli. But is HE really diverse with respect to domestic racial groups (i.e., African-Americans and Latino-Americans)?

Hot Tips: Research suggests that despite increasing racial diversity, communities and schools are re-segregating to levels representative of the 1960s. In highly selective institutions, diversity has come to mean many things, and underrepresented domestic students and faculty are becoming an increasingly smaller part of the Diversity calculus. The evidence suggests HE is becoming less domestically diverse: as overall racial diversity increases, access to higher education for African-Americans and Latino-Americans is decreasing, especially at highly selective schools.

One way for HE to show its commitment to domestic Diversity is to define and evaluate it within the broader construct of DIVERSITY that includes visible and non-visible differences.

Evaluation checklists can be applied to assess domestic diversity deficits and the thoroughness with which related programs are implemented.

For HE institutions and evaluators who believe that domestic diversity matters, a good place to start is to create Domestic Diversity Evaluation Checklists that assess for both Diversity and Inclusion. These checklists should include dimensions that capture:

  • Diversity investment: the budget (investment) associated with domestic racial diversity
  • Structural diversity: the numbers of underrepresented domestic students and faculty
  • Diversity climate: the level of meaningful cross-race interaction and inclusion in the decision making that shapes the culture and direction of the HE institution (a rough sketch of such a checklist follows this list)
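
Purely as an illustrative aid (not part of the original post), here is a minimal sketch of how those three dimensions might be captured in a simple checklist structure; the indicator wording and evidence sources are hypothetical placeholders, not prescribed items.

    # Hypothetical sketch of a Domestic Diversity Evaluation Checklist;
    # indicator text and evidence sources are illustrative assumptions only.
    checklist = {
        "Diversity investment": {
            "indicator": "budget allocated to domestic racial diversity programs",
            "evidence": None,  # e.g., institutional budget documents
        },
        "Structural diversity": {
            "indicator": "numbers of underrepresented domestic students and faculty",
            "evidence": None,  # e.g., enrollment and faculty records
        },
        "Diversity climate": {
            "indicator": "cross-race interaction and inclusion in institutional decision making",
            "evidence": None,  # e.g., climate surveys, committee rosters
        },
    }

    # Flag dimensions for which no evidence has been gathered yet.
    for dimension, item in checklist.items():
        status = "documented" if item["evidence"] else "evidence still needed"
        print(f"{dimension}: {status}")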

Rad Resources: For practical help with checklists, see Western Michigan University’s page on evaluation checklists and its examples of evaluation checklists.

The American Evaluation Association is celebrating Multiethnic Issues in Evaluation (MIE) Week with our colleagues in the MIE Topical Interest Group. The contributions all this week to aea365 come from MIE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

·

I am Theresa Armstead, a behavioral scientist at the Centers for Disease Control and Prevention in the National Center for Injury Prevention and Control. I am a co-chair for the Community Psychology Topical Interest Group. This week’s theme is Pursuing Meaning, Justice, and Well-Being in 21st Century Evaluation Practice. The theme is a blend of the themes from the recent biennial conference for community psychologists and the upcoming evaluation conference. For me, the values reflected in the theme are participation, inclusion, collaboration, self-determination, and empowerment. These values are shared across my identities of community psychologist, evaluator, and behavioral scientist. In practice, it is sometimes challenging to strike a balance between these values and evaluation expectations in government.

Hot Tip: Whenever possible I use checklists and templates to describe the information and content I need without prescribing how the information should be collected. I did this recently when providing guidance to grant recipients on conducting evaluability assessments. I used a checklist to identify common components of an evaluability assessment and some strategies for gathering information. I provided a template for reporting the findings that focused on the questions to be answered without prescribing how the report should appear. I am hoping all the reports will be brief and use data visualizations.

Hot Tip: Evaluability assessments (EAs) are a great way to meet the need for accountability while remaining flexible. Instead of prescribing the types of evaluation designs, methods, and plans across all grant recipients, EAs help each grant recipient clarify the type of evaluation that is most helpful for the programs and strategies they plan to implement. The resulting evaluation plan is data-informed because of the thoughtful and systematic nature of EAs.

Lesson Learned:

  • There are opportunities to create space for participation, collaboration, and self-determination even when the focus is more on the end results than the process.

Rad Resources:

  • Check out Susan Kistler’s last contribution as a regular Saturday contributor to the AEA365 blog. She wraps up Data Visualization and Reporting week by sharing Sarah Rand’s awesome post on the DataViz Hall of Fame and an interview with Sarah. http://aea365.org/blog/?p=9441

  • Valerie Williams’ post on Evaluating Environmental Education Programs, in which she describes ways EAs are useful beyond the traditional use of determining whether a program is ready for a more rigorous evaluation, and shares Rad Resources for learning about EAs. http://aea365.org/blog/?p=6298

  • Learn more about the Community Psychology Topical Interest Group and visit our TIG home page: http://comm.eval.org/communitypsychology/home

The American Evaluation Association is celebrating CP TIG Week with our colleagues in the Community Psychology Topical Interest Group. The contributions all week come from CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org

· · ·

I’m Pat Campbell, president of Campbell-Kibler Associates, Inc. Under NSF funding, Eric Jolly, president of the Science Museum of Minnesota, and I, with the help of a lot of friends, have been generating research-based tips, such as those below, to improve the accuracy of data collection, the quality of the analysis, and the appropriateness of the data collected across diverse populations.

Hot Tips:

  • Ask for demographic information ONLY at the end of measures. There may be exceptions for people with disabilities who will need accommodations in order to complete the measures.
  • Have participants define their own race/ethnicity and disability status rather than having the identification done by data collectors or project/program staff. If a standard set of categories for race/ethnicity and/or disability is used, also ask participants, in an open-ended question, to indicate their own race/ethnicity and disability status.
  • Have members of the target population review affective and psychosocial measures for clarity. Ask them what concepts they think are being measured. If what is being measured is obvious and there are sex, race, or disability stereotypes associated with the concepts, consider using a less obvious measure if an equally valid measure is available.
  • Be aware that there can be heterogeneity within subgroups. For example, while people who are visually impaired, hearing impaired, and learning disabled are all classified as having disabilities, the differences among them are very large, and it might be appropriate to disaggregate by different categories of disability (a rough sketch follows this list).
  • When race/ethnicity, gender, or disability status is used as an independent variable, specify the reason for its use and include the reason in documentation of the results.
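
To make the disaggregation tip concrete, here is a minimal, hypothetical sketch (not from the original post) of breaking results out by self-reported disability category rather than a single yes/no flag; the column names and values are invented for illustration.

    # Hypothetical sketch: disaggregate an outcome by self-reported disability
    # category instead of a single yes/no flag. Data are invented placeholders.
    import pandas as pd

    responses = pd.DataFrame({
        "disability_category": ["visual", "hearing", "learning", "none", "learning", "visual"],
        "post_score": [72, 80, 65, 78, 70, 75],
    })

    # A single "any disability" flag hides heterogeneity within the subgroup...
    responses["any_disability"] = responses["disability_category"] != "none"
    print(responses.groupby("any_disability")["post_score"].agg(["count", "mean"]))

    # ...whereas disaggregating by category keeps the differences visible.
    print(responses.groupby("disability_category")["post_score"].agg(["count", "mean"]))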

Lessons Learned:

  • All populations are diverse: the diversity may be in terms of race, gender, ethnicity, age, geographic location, education, income, disability status, veteran status…. It may be visible or invisible. Most likely, in every group there is a multiplicity of diversities. High-quality evaluations need to pay attention to the diversity of all populations being served.
  • Each individual is diverse. As individuals, we have many demographic characteristics, including our race, gender, ethnicity, age, geographic location, education, income, disability status, veteran status…. Rather than focusing on only one demographic category, high-quality evaluations need to determine which categories are integral to the evaluation and focus on them.

Rad Resources:

  • Universal Design for Evaluation Checklist, 4th Edition. The title says it all. Jennifer Sullivan-Sulewski and June Gothberg have developed a short planning tool that helps evaluators include people of all ages and all abilities in evaluations.
  • As soon as it goes live, we hope our website, Beyond Rigor, will be another rad resource. Let me know (Campbell@campbell-kibler.com) if you would like to be notified when that happens.

The American Evaluation Association is celebrating the Disabilities and Other Vulnerable Populations TIG (DOVP) Week. The contributions all week come from DOVP members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · ·

I’m Susan Kistler, the American Evaluation Association’s Executive Director. Today, I thought I’d write about a few resources available if you are developing evaluation contracts.

Rad Resource: James Bell offered a session at Evaluation 2010 on Contracting for Evaluation Products and Services. He offered advice in five areas: creating a feasible, agreed-upon concept plan; developing a well-defined request for proposals (RFP); selecting a well-qualified evaluator team that will fulfill the sponsor’s intent; constructively monitoring interim progress; and ensuring the quality and usefulness of major evaluation products. His session slides are available for free download.

Rad Resource: Melanie Hwalek of SPEC Associates has shared, via the AEA public eLibrary, an Evaluation Contract Template that she uses. She also offered great tips for Evaluation Contracts in an aea365 post back in February of 2010.

Rad Resource: Daniel Stufflebeam developed an Evaluation Contracts Checklist “designed to help evaluators and clients to identify key contractual issues and make and record their agreements for conducting an evaluation.” It may be downloaded from the Evaluation Center at Western Michigan University along with a number of other evaluation-focused checklists.

Rad Resource: The American Evaluation Association has a vibrant Independent Consultants Topical Interest Group (TIG) that has the most active discussion list of any of AEA’s 40+ TIGs. If you’re not an AEA member, consider joining today, building your network, and learning from their collective expertise.

If you have ideas or resources to share regarding evaluation contracts, add them to the comments for this post!

Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· ·

Hello! My name is Sudharshan Seshadri and I am currently pursuing my Master’s degree in Professional Studies, specializing in Humanitarian Services Administration.

I have come to realize that data is one of the most promising means of understanding the activity of evaluation. To distill those data needs, I believe that we as evaluators should make a conscious effort to explore the resources available in all forms of ubiquitous information.

I would like to share a few resources that are promising for beginners in the conduct of evaluation. For ease of use, I have classified the resources under three headings:

Rad Resources for Program Planning

1. The Ohio State University Extension evaluation bulletin – a systematic approach to designing and planning program evaluations. (http://ohioline.osu.edu/b868/)

2. Program Planning – Program Development and Evaluation (PD&E), University of Wisconsin-Extension (UWEX). (http://www.uwex.edu/ces/pdande/planning/index.html)

3. Planning a Program Evaluation: Worksheet (Cooperative Extension). (http://learningstore.uwex.edu/assets/pdfs/G3658-1W.PDF)

4. Evaluation Design Checklist, Daniel L. Stufflebeam, The Evaluation Center, Western Michigan University. (http://www.wmich.edu/evalctr/checklists/)

5. Key Evaluation Checklist (KEC), Michael Scriven. (https://communities.usaidallnet.gov/fa/system/files/Key+Evaluation+Checklist.pdf)

Rad Resources for Program Implementation, Monitoring, and Delivery

1. W. K. Kellogg Foundation Evaluation Handbook. (http://www.wkkf.org/knowledge-center/resources/2010/W-K-Kellogg-Foundation-Evaluation-Handbook.aspx)

2. Program Manager’s Planning, Monitoring and Evaluation Toolkit, UNFPA Division for Oversight Services, Tool Number 5. (http://www.unfpa.org/monitoring/toolkit.htm)

3. Evaluation Models: Viewpoints on Educational and Human Services Evaluation, Second Edition, edited by Daniel L. Stufflebeam, George F. Madaus, and Thomas Kellaghan. (http://www.unssc.org/web/programmes/LS/unep-unssc-precourse-material/7_eVALUATIONl%20Models.pdf)

Rad Resources for Program Utilization

1. Utilization-Focused Evaluation, Michael Q. Patton, Fourth Edition, Sage Publications.

2. Independent Evaluation Group (IEG), The World Bank Group – improving development results through excellence in evaluation. (http://www.worldbank.org/oed/)

3. My M&E – a platform for sharing knowledge and practice among M&E practitioners worldwide. (www.mymande.org)

4. EvaluATE – the evaluation resource center operated by Western Michigan University, specializing in National Science Foundation (NSF) evaluations. (www.evalu-ate.org)

5. United Kingdom Evaluation Society (UKES) – Resources / Evaluation Glossary. (http://www.evaluation.org.uk/resources/glossary.aspx)

Lessons Learned: Always take the initiative in searching out the data you need. In the information age, we have a plethora of evaluation services in operation all over the world. Data acts as a gateway to useful and significant research practices carried out in the profession of evaluation. I regard benchmarking as an outcome of consistent resource searching and use.

Hot Tip #1: How long can you stare at the Google search screen hoping it will meet your data needs? Expand your search across a multitude of web resources.

Hot Tip #2: Use networking to get instant responses to your queries. It adds a new dimension to your learning and practice. For example, I created a separate Facebook page named “The Evaluation Library” for the books, references, and tools I use frequently in the evaluation context.

Hot Tip #3: Easy access to data feeds your interest in digging deeper. Stack or list all your resources on a platform that you visit frequently.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · ·
