AEA365 | A Tip-a-Day by and for Evaluators


We are Debi Lang and Kathy Muhr, members of the Research and Evaluation Unit at the University of Massachusetts Medical School Center for Health Policy and Research.

Populations considered hidden or hard to reach for participation in qualitative evaluation studies may be small, difficult to locate, or hard to distinguish from the general population. In their article, Salganik and Heckathorn note that such groups historically include subjects in HIV/AIDS research but can also include undocumented immigrants or the homeless.

Evaluations that rely on data from hidden or hard-to-reach populations present challenges when names and contact information do not exist, are not accessible, or are generated in a way that may bias the results. In two recent projects, we developed approaches to identify 1) family members of Hospice patients who had died; and 2) adults with mental health conditions who are deaf/hard of hearing (D/HH) or Latino.

Hot Tip: Avoid Bias

  • For the Hospice project, we used claims and enrollment data to identify family members of Hospice decedents, rather than requesting the information from Hospice providers. This approach avoided the risk of a sample skewed toward family members who were predominantly satisfied with their services.

Hot Tip: Hire Cultural Brokers

  • To identify D/HH or Latino adults with a mental health condition, we hired cultural brokers who shared the experience and language of the groups we wished to contact. As peers and integral members of our evaluation team, the cultural brokers helped identify group members and create a viable sample of potential participants.
  • To recruit cultural brokers, we made announcements at various stakeholder and committee meetings, brought copies of the job description, and brainstormed with attendees to identify likely candidates.

Hot Tip: Maintain Confidentiality

  • Whether gathering names and contact information of potential study participants from a database or by word of mouth, use compliance procedures to maintain the confidentiality of personal information and to protect participants' rights. Both projects required approvals from either an Institutional Review Board (IRB) or a Compliance Unit to identify and recruit participants.

Lessons Learned: Budget Wisely

  • To budget a project that identifies hidden populations, consider the time needed to generate the study sample, including time for IRB and data-access approvals.
  • Consider costs for hiring cultural brokers and/or translators, as well as for participant incentives, travel, and costs associated with rescheduling meetings. These expenses support successful recruitment and data collection activities.

Rad Resources: The article linked above and the following resources discuss sampling designs used to identify hidden or hard-to-reach populations.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · · · · · · ·

My name is Bonnie Richards, and I am a Research Associate with Vital Research, an evaluation and consulting company located in Los Angeles, California. The Social Services sector is among the industries we serve when designing and implementing evaluation projects.

Lessons Learned: So, what frames a Social Services evaluation project? Consider how values and valuing impact the following.

Perspective

  • Many different stakeholders may be involved at some point, either as sources of data or as informants for conducting the project itself.
    • The agency or service provider itself may be interested in an evaluation.
    • Funders and foundations are increasingly interested in the impact of their funding.
    • Program staff, service recipients and their families, community members, policy makers, and others may be involved.
  • The purpose and value of conducting an evaluation may not be clear. Be prepared to explain your work to different audiences.

Budget

  • It may not come as a surprise that budget is hugely influential.
    • Internally, budget conservatively and realistically (or you may find yourself providing some services pro bono). What are the critical components that must be included in the project?
    • External budgets (e.g., state or federal) and their fluctuations (e.g., cuts or increases) can have a huge influence on your work.
  • Social Service organizations may be under severe stress during a funding crisis. This can strain communication, involvement, and cooperation.

Data

  • Do your funders value and understand data? Why are they collecting it? Are they a learning organization that is self-motivated to understand impact, or are they simply meeting a requirement from higher up?
  • Are agencies tracking data? They might need help developing a system for tracking information of interest. Consider what kinds of data you may be able to obtain:
    • Indicators? (e.g., attendance or other numbers)
    • Outcomes? (i.e., an observable difference in attitude or behavior)
  • Consider how findings will be used. What are the potential implications of making a judgment about the overall merit or worth of a program?

Lesson Learned: Particularly in the context of a Social Services evaluation project, the evaluator should endeavor to be conscientious, intentional, open-minded, and flexible.

Hot Tip: If you work with stakeholders who are unfamiliar with research and might have difficulty interpreting data, think about how you can relay findings in meaningful ways. Consider attending a workshop or session on new and creative visual displays that can be used in presentations and reports for clients.

 

Rad Resource: Orient yourself to any major evaluation theorists before attending a session with them. Evaluation Roots: Tracing Theorists’ Views and Influences, edited by Marv Alkin, is a great resource for understanding different theoretical approaches, including Valuing, this year’s conference theme.

 

The American Evaluation Association is celebrating this week with our colleagues at the Southern California Evaluation Association (SCEA), an AEA affiliate. The contributions all this week to aea365 come from SCEA members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · · · · · · ·

My name is Susan Kistler and I am the Executive Director for the American Evaluation Association. I contribute each Saturday’s post to the aea365 blog. This week I am writing from Atlanta at the Nonprofit Technology Conference.

Do you work for or with nonprofit organizations? Have you experienced challenges because financial constraints put technology purchases for evaluation beyond the budget?

Hot Tip: Take a look at TechSoup, the “technology place for nonprofits.” TechSoup has resources, training, a peer-to-peer community, and a donated technology program, TechSoup Stock. Their donated tech program gives nonprofits access to products from a range of big-name (and not-so-big-name) companies. Examples include the full Microsoft Office Suite, including Access and Excel; ArcGIS from ESRI for spatial analysis; and Crystal Reports from SAP for data visualization and reporting. And the cost? Each product has an administrative fee, but most are well below even discounted retail prices. As an example, the full Microsoft Office 2007 suite is $20. Organizations do need to go through a relatively painless qualification process, and the eligibility criteria vary from product to product, but the resource is definitely worth checking out.

The opinions expressed above are my own and do not necessarily represent those of my employer, the American Evaluation Association.

This contribution is from the aea365 Daily Tips blog, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org.

· · · · · · ·
