AEA365 | A Tip-a-Day by and for Evaluators

Disabilities and Other Vulnerable Populations

Hi. We’re Sherry Campanelli, Program Compliance Manager, and Laura Newhall, Clinical Training Coordinator, from the University of Massachusetts Medical School’s Disability Evaluation Services (DES). Although DES conducts evaluations of whether applicants for public benefits can be found disabled, evaluation as a research endeavor is not our primary focus. Nevertheless, as an organization, we are committed to ongoing quality improvement efforts to enhance our services for people with disabilities. We use a team-based, iterative approach to define and address problem functions and processes.

For example, we used the process described here to develop quality assurance systems for our clinical, clerical, and technical support processes. We have also used this method to tackle caseload backlogs and to process incomplete applications more effectively.

We’ve discovered over time, regardless of the issue or problem involved, that there are common techniques that help a quality improvement (QI) team be successful. We would like to share some of these lessons learned with you.

Lessons Learned:

  • Determine and clearly state the issues to be solved and team goals.
  • Involve key staff (line staff doing the work and managers supervising the work) in the development of any QI initiative. They are in “the know” about areas that may be problematic.
  • Incorporate non-judgmental facilitation to keep up the momentum. Key components include:

o   Involving all participants in decision making/discussion;

o   Keeping meeting minutes and agendas;

o   Keeping track and sharing “to do” lists, “next steps” and progress towards goals;

o   Meeting on a regular and ongoing basis (don’t cancel meetings unless absolutely necessary);

o   Seeking management decisions and input as needed; and

o   Making sure you hear from the quiet folks in the room – they may need a little encouragement to speak up, but often offer great insights.

  • Utilize team members/subcommittees to perform specific tasks between meetings.
  • Utilize available qualitative and quantitative data.
  • Collect specific data, as necessary, to help define the problem and suggest solutions.
  • Do fact finding to support decision-making.
  • Maintain “living” working documents that capture decisions as they are made, for incorporation into a final product.

  • Utilize pilot testing to determine feasibility and make changes (i.e., “fix bugs”) prior to full implementation.

  • Provide periodic communication to the rest of the department or organization during the project and at its conclusion.
  • Train all impacted staff on process improvements.
  • Conduct periodic assessments after implementation to assess success of the project.
  • Refine processes as new issues and changes occur.

Hot Tips:

  • Sometimes QI processes take longer than expected. “Keep going even when the going is slow and uncertain.”  G.G. Renee Hill
  • “To discover new ways of doing something – look at a process as though you were seeing it either for the first or last time.” Mitchel Martin

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

Hi, we are Monika Mitra and Lauren Smith from the Disability, Health, and Employment Policy unit in the Center for Health Policy and Research at the University of Massachusetts Medical School.  Our research is focused on health disparities between people with and without disabilities.

Evaluating a Population of People with Disabilities

In collaboration with the Health and Disability Program (HDP) at the Massachusetts Department of Public Health (MDPH), we conducted a health needs assessment of people with disabilities in Massachusetts.  The needs assessment helped us better understand the unmet public health needs and priorities of people with disabilities living in MA.  We learned a tremendous amount in doing this assessment and wanted to share our many lessons learned with the AEA365 readership!

Lessons Learned:

  • A 3-Pronged Approach

Think about your population and how you can reach people who might be missed by more traditional methodologies: To reach people with disabilities who may not be included in existing health surveys, we used two approaches to complement data from the MA Behavioral Risk Factor Surveillance System (BRFSS): an anonymous online survey on the health needs of MA residents with disabilities, and interviews with selected members of the MA disability community.

  • Leveraging Partnerships

Think about alternative ways to reach your intended population: For the online survey, we decided on a snowball sampling method, in which identified potential respondents in turn identify other respondents. It is particularly useful for populations that are difficult to reach and may generally be excluded from traditional surveys, though it can affect the generalizability of findings. HDP’s Health and Disability Partnership provided a network to spread the survey to people with disabilities, caregivers, advocates, service providers, and friends/family of people with disabilities. (A short code sketch of tracking snowball referrals appears after this list.)

  • Accessibility is Key

Focus on accessibility:  In an effort to increase the accessibility of the survey, Jill Hatcher from DEAF, Inc. developed a captioned vlog (a type of video blog) to inform the Deaf, DeafBlind, Hard of Hearing, and Late-Deafened community about the survey.  In the vlog, she mentioned that anyone could call DEAF, Inc. through videophone if they wanted an English-to-ASL translation of the survey.  Individuals could also respond to the survey via telephone.
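Following up on the snowball-sampling point above: because the sample grows through referral chains, it can help to record who referred whom and to summarize how many respondents arrived at each referral depth. Here is a minimal Python sketch of that bookkeeping; the respondent IDs, the “SEED” marker, and the field layout are hypothetical illustrations, not details of the MA survey.

```python
from collections import defaultdict

# referred_by[respondent] = who passed the survey along ("SEED" means the
# person was recruited directly through a partner network, not referred)
referred_by = {
    "R001": "SEED",
    "R002": "SEED",
    "R003": "R001",  # referred by an earlier respondent
    "R004": "R003",
}

def wave(respondent: str) -> int:
    """Count the referral links separating a respondent from a seed."""
    depth = 0
    while referred_by[respondent] != "SEED":
        respondent = referred_by[respondent]
        depth += 1
    return depth

# Tally respondents at each referral depth.
waves = defaultdict(int)
for r in referred_by:
    waves[wave(r)] += 1

# Respondents per wave is one rough check on how far the sample has
# drifted from the original seeds, which bears on generalizability.
for depth, count in sorted(waves.items()):
    print(f"wave {depth}: {count} respondent(s)")
```

Counting respondents per wave is deliberately simple; formal respondent-driven sampling estimators exist, but even this level of record-keeping makes it easier to describe the sample honestly.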

Rad Resources:

  • Disability and Health Data System (DHDS)

DHDS is an online tool developed by the CDC providing access to state-level health data about people with disabilities.

  • Health Needs Assessment of People with Disabilities Living in MA, 2013

To access the results of the above-mentioned needs assessment, please contact the Health and Disability Program at MDPH.

  • A Profile of Health Among Massachusetts Residents, 2011

This report published by the MDPH contains information on the health of people with disabilities in Massachusetts.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Hello from Mary Crave and Kerry Zaleski, of the University of Wisconsin – Extension and Tererai Trent of Tinogona Foundation and Drexel University.  For the past few years we’ve teamed up to teach hands-on professional development workshops at AEA conferences on participatory methods for engaging vulnerable and historically under-represented persons in monitoring and evaluation. Our workshops are based on:

  • More than 65 years of collective community-based experience in the US and in more than 55 countries
  • Our philosophy that special efforts should be made to engage people who have often been left out of the community decision-making process (including program assessment and evaluation)
  • The thoughtful work of such theorists and practitioners as Robert Chambers, a pioneer in Participatory Rural Appraisal.

Lessons Learned: While many evaluators espouse the benefits of participatory methods, engaging under-represented persons often calls for particular tools, methods, and approaches. Here is what sets this work apart:

  1. Vulnerability: Poverty, cultural traditions, natural disasters, illness and disease, disabilities, human rights abuses, a lack of access to resources or services, and other factors can make people vulnerable in some contexts. This can lead to marginalization or oppression by those with power, leaving critical voices out of the evaluation process.
  2. Methods and tools have many benefits: They can be used throughout the program cycle; are adaptable to fit any context; promote inclusion, diversity and equality; spark collective action; and, support community ownership of results – among others.
  3. Evaluators are really facilitators, and participants become the evaluators of their own realities.

Hot Tip:  Join us to learn more about the foundations of and some specific “how-to” methods on this topic at an upcoming AEA eStudy, February 5 and February 12, 1-2:30 PM EST. Click here to register.

We’ll talk about the foundations of participatory methods and walk through several tools such as community mapping, daily calendars, pair-wise ranking, and pocket-chart voting.

Rad Resources: Robert Chambers’ 2002 book, Participatory Workshops: A Sourcebook of 21 Sets of Ideas and Activities.

Food and Agriculture Organization (FAO) of the UN: http://www.fao.org/docrep/006/ad424e/ad424e03.htm (click on Publications and type “PLA” in the search menu)

AEA Coffee Break Webinar 166: Pocket-Chart Voting: Engaging Vulnerable Voices in Program Evaluation, with Kerry Zaleski, December 12, 2013 (recording available free to AEA members).

June Gothberg on Involving Vulnerable Populations in Evaluation and Research, August 23, 2013

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi, I’m Kenneth Kelty (with assistance from Seb Prohn*), a senior in Western Carolina University’s University Participant Program. I have intellectual disability (ID) and autism. I was a co-evaluator on a project this year where we looked at improving social inclusion for students with ID on my campus. I earned an AEA student travel award to help make evaluation more inclusive and facilitated the session on Reflection and Discussion on Cultivating Cultural Competence in the Field of Evaluation.

I felt very welcomed at AEA. But, when meeting with the student travel award winners, I learned about a time at this year’s conference when an African American woman speaking on an AEA panel was ignored or talked over by other panelists. Even at AEA, everyone should work on making the conference feel more inclusive. Everyone has a voice that matters.

Lessons Learned:

  • I can help others walk in my shoes and understand the importance of inclusion if I have a voice in evaluation. Who better than me to explain my perspective and evaluate my experience?
  • Use PhotoVoice. This tool makes evaluation more accessible. When evaluators explain their photographs, they connect them with meaning in their lives. Taking pictures as data for evaluation has helped me see what I have done in an evaluation, and it helps others realize I’m doing everything my typically-developing peers are doing. It has helped me feel more included and speak up about changes.
  • Don’t skip training participant evaluators in PhotoVoice! Training may take weeks, but it will be worth it. During training, I learned about ethics and the importance of consent forms. I also became better at taking pictures and telling my story.

Hot Tips:

  • When you evaluate with people with ID, look at possibilities and their strengths. Sometimes they can recognize or remember something you don’t.
  • Photo consent forms can be hard to remember when you go out to collect data. It helps to put them in an electronic format, preferably one that can be shown to subjects and signed using an iPad or smartphone.
  • For people who don’t want to talk or cannot talk about their PhotoVoice pictures, encourage them to blog, type, or email (with assistive technology, if needed). You can also encourage them to take more pictures – they are “worth a thousand words.”

Rad Resources: There are several apps that can be used to make photo release forms easier to sign:

http://getsigneasy.com/

https://signnow.com/

http://www.docusign.com/

Believing in participants’ self-determination is necessary for inclusive evaluation. This series helps readers better understand how to honor and promote self-determination in evaluation and elsewhere:

This brief by Maria Paiewonsky will help evaluators implement multiple modes and methods for researchers with ID.

*Seb Prohn is the UP Program faculty liaison and outreach coordinator. Beyond performing internal evaluations for the UP Program, he evaluates other NC postsecondary education programs for individuals with intellectual disability.

The self-determination series is available at http://www.ngsd.org/everyone/research-practice-self-determination-issues.

This week, we’re diving into issues of Cultural Competence in Evaluation with AEA’s Statement on Cultural Competence in Evaluation Dissemination Working Group. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello! I am Jeff Williams, PhD Candidate in Public Policy and Public Administration at the George Washington University, member of the evaluation team at CRDF Global, and member of the Washington Evaluators. Today I’ll be sharing tips and tricks about writing surveys for communities that have English as a second language, or which don’t use English at all.

Lesson Learned—Evaluators whose mother tongue is English often engage populations that have English as a second, third, or fourth tongue, or that have no exposure to the language at all (see the chart below from the 2007 American Community Survey Report for the U.S. language breakdown). Translating questions and answers can be tricky in a face-to-face interview setting, but that setting affords much more room for nuance and clarification than administering a survey with little or no interaction between the respondent and the survey designer. In these situations, all of the clarification needs to happen before the survey leaves the shop, and that calls for more up-front investment in getting it right.

[Chart: languages spoken in the United States, from the 2007 American Community Survey Report]

Hot Tip—Pilot the language first and then the content. Before testing the survey for program-specific content, first make sure the translation of what you wrote says what you want to say. If the testers are reading a translation that does not reflect the intended questions, then any pilot feedback they provide is potentially compromised.

Cool Tricks—So where do you get the language testers?

  • Use in-house expertise. If, for example, your survey is going to a Russian-speaking audience and you have 15 people in your organization who speak Russian at a variety of levels, run it by them. Do “very satisfied” and “extremely satisfied” really translate into a meaningful difference?
  • Use a local university. Are there faculty who specialize in that language and have conversational as well as technical fluency? Is there a student group composed of native speakers of your survey’s target language?
  • Use a local ex-pat community. Keep potential IRB issues in mind and be creative – contact local community organizing committees, restaurant owners, or others involved in the target language community to help set up an informal focus group. Bring some snacks; make some friends.

On a related note, be sure you are familiar with AEA’s position on cultural competence in evaluation.

Hot Tip—Insider’s advice for Evaluation 2013 in DC: Skip the car rental if you can, as parking in DC is costly and a bit of a pain. This is a walking city, and October is usually a good-weather month for us. Metro is clean and relatively inexpensive for longer commutes, and taxis are always available if needed in a pinch.

This is the last of three weeks this year sponsored by our Local Arrangements Working Group (LAWG) for Evaluation 2013, the American Evaluation Association Annual Conference coming up next month in Washington, DC. They’re sharing not only evaluation expertise from in and around our nation’s capital, but also tips for enjoying your time in DC. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.


Greetings, I am June Gothberg, incoming Director of the Michigan Transition Outcomes Project and past co-chair of the Disabilities and Other Vulnerable Populations topical interest group (TIG) at AEA. I hope you’ve enjoyed a great week of information specific to projects involving these populations. As a wrap-up, I thought I’d end with broad information on involving vulnerable populations in your evaluation and research projects.

Lessons Learned: Definition of “vulnerable population”

  • The TIG’s big ah-ha. When I came in as TIG co-chair, I conducted a content analysis of our TIG’s presentations over the past 25 years. We had a big ah-ha when we realized who and what have been identified as “vulnerable populations”. The list included:
    • Abused
    • Abusers
    • Chronically ill
    • Culturally different
    • Economically disadvantaged
    • Educationally disadvantaged
    • Elderly
    • Foster care
    • Homeless
    • Illiterate
    • Indigenous
    • Mentally ill
    • Migrants
    • Minorities
    • People with disabilities
    • Prisoners
    • Second language
    • Veterans – “wounded warriors”
  • Determining vulnerability. The University of South Florida provides the following guidance for determining vulnerability in research:
    • Any individual whose ability to make fully informed decisions for him/herself is diminished, due to either acute or chronic conditions, can be considered vulnerable.
    • Any population that, due to circumstances, may be vulnerable to coercion or undue influence to participate in research projects.


Hot Tips:  Considerations for including vulnerable populations.

  • Procedures.  Use procedures to protect and honor participant rights.
  • Protection.  Use procedures to minimize the possibility of participant coercion or undue influence.
  • Accommodation. Prior to the start, determine and disseminate how participants will be accommodated with regard to recruitment, informed consent, protocols and questions asked, retention, and research procedures, including participants with literacy, communication, and second-language needs.
  • Risk. Minimize any unnecessary risk to participants.

Hot Tips:  When your study is targeted at vulnerable populations.

  • Use members of targeted group to recruit and retain subjects.
  • Collaborate with community programs and gatekeepers to share resources and information.
  • Know the formal and informal community.
  • Examine cultural beliefs, norms, and values.
  • Disseminate materials and results in an appropriate manner for the participant population.

Rad Resources:

The American Evaluation Association is celebrating the Disabilities and Other Vulnerable Populations TIG (DOVP) Week. The contributions all week come from DOVP members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I am Anne Chamberlain, a Senior Research Associate at IMPAQ International, a rapidly growing social-science research and evaluation firm serving clients in the areas of health; labor and human services; education; survey research; and international development. Recently we have been broadening our experience in researching barriers faced by people with disabilities (PWD) and the strategies being employed to minimize or eliminate those barriers. In this post, I will share a lesson learned from the development of a survey for a U.S. Department of Labor-funded project, Evaluating the Accessibility of American Job Centers for People with Disabilities.

How many times have you said to yourself that “next time” you will faithfully keep a log of decisions made and actions taken during the life of your evaluation project? Fortunately, our evaluation team members kept meticulous records of the many decision points that coalesced into a survey on a very difficult topic: accessibility. The topic is relevant here because it is so nebulous and so politically charged that the survey development process was particularly labor-intensive. When it was time to show our development process to numerous stakeholders, each with their own agenda, we could illustrate it only because we had records of the thirty-plus draft iterations, the decisions made in response to multiple Offices and Administrations, the advice taken up from two Technical Working Groups, and so on. An illustration of our development process is below:

[Figure: illustration of the survey development process]

Lesson Learned: Keep a Log of Your Evaluation Moves

  • I encourage evaluators to maintain a single document that acts as a diary, keeping track of decisions and actions: who made them, when, and why. It’s helpful if this log is searchable by who, when, and why, and you can add a keyword scheme so that each entry also has keywords listed, to further support organization and search. The log may be used by you alone or by your whole team; if it’s the latter, make sure everyone respects the formatting, and add a ‘field’ to record who made each entry. Finally, put this resource somewhere visible. For me, this has been the key to regular use. A simple Word document on your desktop can encourage you to record anything important before closing out your computer at the end of the day. (A minimal code sketch of such a log follows.)
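As a concrete illustration, here is a minimal Python sketch of such a log kept as a plain CSV file, so a whole team can append to it and search it. The file name, field names, and example entry are assumptions for illustration, not a prescribed format.

```python
import csv
import os
from datetime import date

LOG_PATH = "evaluation_decision_log.csv"
FIELDS = ["date", "who", "decision", "why", "keywords"]

def log_decision(who, decision, why, keywords):
    """Append one entry, writing the header row if the file is new."""
    new_file = not os.path.exists(LOG_PATH) or os.path.getsize(LOG_PATH) == 0
    with open(LOG_PATH, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({"date": date.today().isoformat(), "who": who,
                         "decision": decision, "why": why,
                         "keywords": ";".join(keywords)})

def search(term):
    """Return every entry whose who/why/keyword fields mention the term."""
    with open(LOG_PATH, newline="") as f:
        return [row for row in csv.DictReader(f)
                if term.lower() in (row["who"] + row["why"] + row["keywords"]).lower()]

# Hypothetical entry and lookup:
log_decision("A.C.", "Dropped item 12 from draft 14",
             "Working group advised the wording was double-barreled",
             ["survey", "wording"])
print(search("wording"))
```

A shared spreadsheet or a Word table works just as well; the point is that every entry carries the who, when, why, and keywords, so the development story can be reconstructed later.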

The American Evaluation Association is celebrating the Disabilities and Other Vulnerable Populations TIG (DOVP) Week. The contributions all week come from DOVP members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi! This is Nivedita Ranade and Tom McKlin with The Findings Group. We are evaluating two federal grants that provide mentoring to students with disabilities. These mentoring relationships are critical to program success because the students see the mentor as their primary source of support. The program administrators describe these relationships as “nurturing” because the mentor is invested in the students’ maturation and psychosocial development.

Lessons Learned:

  • The quality of nurturing makes the role of the mentor unique and different from that of other advisors. An academic advisor is concerned with a student’s course load and grades. An Office of Disability Services advisor is concerned with providing the right services and accommodations. A mentor, however, is concerned with the overall success of a student. Rather than targeting just one aspect of the student’s life (e.g., academics), the mentor takes a holistic approach to the student’s needs.
  • The mentor is a nurturer because he/she responds to students’ needs and trajectories. A freshman who is struggling to adjust to campus life may need help with stress and time management rather than with finding an internship. On the other hand, a senior who hopes to move into a job after graduation may benefit from an internship to build his/her résumé and professional skills.
  • Nurturing corresponds to scaffolding. This idea is based on Vygotsky’s Social Development Theory, which asserts that learning typically depends on interactions with a more knowledgeable/competent other.  Knowledge is acquired as novices interact with experts and peers and engage in “legitimate peripheral participation” wherein the novices observe the practice of experts and become experts over time.  Learning in these contexts is generally scaffolded to provide supports until novices move from peripheral to full participation.
  • Nurturing leads to self-advocacy.  Self-advocacy leads to self-determination and personal responsibility.   By helping the students to self-advocate, the mentor helps the students re-work their conceptions of what it means to have a disability.  Students learn to define and come to terms with their disability, which in turn enables them to disclose their disability in situations where it is necessary to do so.  The confidence that comes with self-advocacy leads to increased personal responsibility and self-determination because students feel that they can achieve what they want despite their disability.
  • Thus, nurturing is an essential quality in a good mentor, and we surmise that the field might benefit from a survey construct that focuses on nurturing in mentor/mentee relationships.

 Rad Resource:

  • How to measure mentoring: We have compiled a mentor-mentee survey based on these lessons learned.  Please contact us at nivedita@thefindingsgroup.com.

The American Evaluation Association is celebrating the Disabilities and Other Vulnerable Populations TIG (DOVP) Week. The contributions all week come from DOVP members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Jennifer Sullivan Sulewski and I am a Research Associate at the Institute for Community Inclusion (ICI), University of Massachusetts Boston and past co-chair of AEA’s Disabilities and Other Vulnerable Populations (DOVP) TIG.

At ICI, we do a lot of work involving analysis of publicly available datasets to determine outcomes and trends for people with disabilities at the state and national levels. These data can be used to assess areas of need, establish baselines, and track progress over time for outcomes such as employment rates, economic status, and educational attainment. My post today highlights a couple of particularly useful data sources.

Hot Tip:

  • Pay attention to how disability is defined in your data source. Each system or survey is likely to have a different set of disability categories and definitions. For example, the Census Bureau determines whether respondents to the American Community Survey have a disability by asking if they have any of six specific conditions or functional impairments (http://www.census.gov/people/disability/methodology/acs.html). The Social Security Administration defines disability as a long-term impairment affecting the ability to work. Other systems (such as Vocational Rehabilitation and developmental disabilities services) have their own ways of assessing disability status and eligibility for services. (A small code sketch of applying one such definition follows this tip.)
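To make this concrete, here is a minimal pandas sketch of encoding one source’s definition explicitly, using an ACS-style rule under which a person counts as having a disability if they report any of six difficulties. The column names and toy data are illustrative assumptions, not the official ACS variable names.

```python
import pandas as pd

# Toy person records; 1 = yes, 2 = no (a common survey coding convention)
df = pd.DataFrame({
    "hearing":            [1, 2, 2],
    "vision":             [2, 2, 2],
    "cognitive":          [2, 1, 2],
    "ambulatory":         [2, 2, 2],
    "self_care":          [2, 2, 2],
    "independent_living": [2, 2, 1],
})

ITEMS = ["hearing", "vision", "cognitive",
         "ambulatory", "self_care", "independent_living"]

# ACS-style rule: difficulty on ANY of the six items counts as a disability.
# Keeping the rule in one named column documents which definition you used,
# which matters when comparing against SSA or service-system definitions.
df["any_disability"] = (df[ITEMS] == 1).any(axis=1)
print(df["any_disability"].mean())  # prevalence under this definition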

Rad Resource:

  • U.S. Census Bureau. The Census Bureau website includes a wealth of searchable data and customizable reports on both the decennial census (last conducted in 2010) and more frequent data collection efforts such as the American Community Survey.
  • Statedata.info. This website compiles data on employment outcomes and other population statistics for people with disabilities nationally and state by state, using data from state intellectual/developmental disabilities agencies, the Rehabilitation Services Administration, the U.S. Department of Labor, the Social Security Administration, and the Census Bureau. In the interest of full disclosure: statedata.info is developed and maintained by my team at the ICI.

 

The American Evaluation Association is celebrating the Disabilities and Other Vulnerable Populations TIG (DOVP) Week. The contributions all week come from DOVP members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

I’m Pat Campbell, president of Campbell-Kibler Associates, Inc. Under NSF funding, Eric Jolly, president of the Science Museum of Minnesota, and I, with the help of a lot of friends, have been generating research-based tips, such as those below, to improve the accuracy of data collection, the quality of analysis, and the appropriateness of the data collected across diverse populations.

Hot Tips:

  • Ask for demographic information ONLY at the end of measures. There may be exceptions for people with disabilities who need accommodations in order to complete the measures.
  • Have participants define their own race/ethnicity and disability status rather than having the identification done by data collectors or project/program staff. If a standard set of categories for race/ethnicity and/or disability is used, also ask participants, in an open-ended question, to indicate their own race/ethnicity and disability status.
  • Have members of the target population review affective and psychosocial measures for clarity. Ask them what concepts they think are being measured. If what is being measured is obvious and there are sex, race, or disability stereotypes associated with the concepts, consider using a less obvious measure if an equally valid measure is available.
  • Be aware that there can be heterogeneity within subgroups. For example, while people who are visually impaired, hearing impaired, and learning disabled are all classified as having disabilities, the differences among them are very large, and it might be appropriate to disaggregate by different categories of disability (see the sketch after this list).
  • When race/ethnicity, gender, or disability status is used as an independent variable, specify the reason for its use and include the reason in documentation of the results.
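For the heterogeneity point above, a minimal pandas sketch of disaggregating an outcome by disability category, rather than by a single yes/no flag, might look like the following; the data frame and column names are hypothetical.

```python
import pandas as pd

df = pd.DataFrame({
    "disability_category": ["visual", "hearing", "learning",
                            "visual", "learning", "hearing"],
    "completed_program":   [1, 1, 0, 0, 1, 1],
})

# Aggregated view: one overall rate that can mask subgroup differences.
print(df["completed_program"].mean())

# Disaggregated view: a rate per category, with counts kept visible so
# that small cells (common in disability data) are easy to spot.
by_category = (df.groupby("disability_category")["completed_program"]
                 .agg(rate="mean", n="count"))
print(by_category)
```

Reporting the cell sizes alongside the rates also flags categories too small to disaggregate responsibly, in which case collapsing or suppressing cells may be the better choice.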

Lessons Learned:

  • All populations are diverse. The diversity may be in terms of race, gender, ethnicity, age, geographic location, education, income, disability status, veteran status…. It may be visible or invisible. Most likely, every group contains a multiplicity of diversities. High-quality evaluations need to pay attention to the diversity of all populations being served.
  • Each individual is diverse. As individuals, we have many demographic characteristics, including race, gender, ethnicity, age, geographic location, education, income, disability status, veteran status…. Rather than focusing on only one demographic category, high-quality evaluations need to determine which categories are integral to the evaluation and focus on them.

Rad Resources:

  • Universal Design for Evaluation Checklist, 4th Edition. The title says it all. Jennifer Sullivan-Sulewski and June Gothberg have developed a short planning tool that helps evaluators include people of all ages and abilities in evaluations.
  • We hope our website, Beyond Rigor, will be another rad resource as soon as it goes live. Let me know (Campbell@campbell-kibler.com) if you would like to be notified when that happens.

The American Evaluation Association is celebrating the Disabilities and Other Vulnerable Populations TIG (DOVP) Week. The contributions all week come from DOVP members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

