AEA365 | A Tip-a-Day by and for Evaluators

TAG | community-based

Hi, I am Lisa Hinz, Leadership and Civic Engagement educator with University of Minnesota Extension. In my position, I am challenged to both foster and document action resulting from our community leadership education programs.

My colleague Jody Horntvedt and I developed a system to encourage program graduates to identify and follow through on their own action items. Evaluation and engagement are of equal importance for us – like two sides of a coin. I am going to highlight the approach and results from our “Action Items” evaluation. Later, Jody will focus on the engagement side.

Our focus here is the Red River Valley Emerging Leadership Program, a five-month cohort program to develop leadership potential and networks among younger leaders in a large region of northwest Minnesota.

At the final session, participants complete an action items worksheet sharing how they intend to practice their leadership in the year ahead.

Cool Trick: For evaluation purposes, we collect the worksheets, scan them, and mail the originals back to participants within one month. This way, participants can keep track of their action items, and we have the data for later use. We enter the action-items data into Excel, with each action item linked to its participant.

Cool Trick: We designed an online survey using Qualtrics to find out the extent to which participants followed through on their intended actions. Qualtrics allows personalized surveys in which participants rate their follow-through on each of their own action steps by means of embedded data. Here’s how this looks in Qualtrics:

[Screenshot: a personalized Qualtrics survey item displaying one participant’s own action steps via embedded data]
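For readers curious about the mechanics, here is a minimal sketch of how the Excel data might feed those personalized surveys. This is an illustration, not our actual workflow, and every file name and column name in it is hypothetical; it relies on the fact that extra columns in a Qualtrics contact-list import are stored as embedded data the survey can pipe in.

```python
# Hypothetical sketch: reshape the Excel action-items sheet into a
# Qualtrics contact-list CSV. Columns beyond the standard contact
# fields are imported as embedded data, one field per action item.
import pandas as pd

# One row per participant; the action-item columns were transcribed
# from the scanned worksheets. All names here are illustrative.
df = pd.read_excel("action_items_2012.xlsx")

contacts = pd.DataFrame({
    "FirstName": df["first_name"],
    "LastName": df["last_name"],
    "Email": df["email"],
    # These extra columns become embedded data in Qualtrics, so each
    # survey can show participants their own action items to rate.
    "ActionItem1": df["action_item_1"],
    "ActionItem2": df["action_item_2"],
    "ActionItem3": df["action_item_3"],
})

contacts.to_csv("qualtrics_contacts.csv", index=False)
```

Inside the survey, piped text such as ${e://Field/ActionItem1} then displays each participant’s own wording next to its follow-through rating scale.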

Lessons Learned: These graduates take action – 84% of the 2012 cohort followed through on at least one action step – and they told us how. As seen below, we learned about the degree of follow-through and got more detail on actions taken (or not taken). We also learned that 73% of these graduates applied things learned from the program in ways other than their identified action steps. Here are some examples:

[Image: examples of graduates’ reported follow-through and other applications of program learning]

A repeat of the study with 2013 graduates produced even stronger results: 84% took action, and 94% applied their learning in ways beyond the action steps they had identified.

This evaluation process gives us an easy, inexpensive, and systematic way to gather data on graduates’ post-program actions – and to stay in touch. It yields outcome data specific to each participant and suggests public value – the indirect impact on the communities where these emerging leaders are exercising their leadership. It also opens opportunities to add further value through interviews and Ripple Effect Mapping.

Rad Resource: To learn more about evaluation of community leadership education, see the recent monograph Community Effects of Leadership Development Education: Citizen Empowerment for Civic Engagement.

We’re celebrating 2-for-1 Week here at aea365. With tremendous interest in the blog lately, we’ve had many authors eager to share their evaluation wisdom, so for one special week, readers will be treated to two blog posts per day! Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello, I’m Vincent Francisco, Associate Professor in the Department of Public Health Education at the University of North Carolina at Greensboro. In working with initiatives that range from very small, targeted program development to state- and national-level initiatives for broader systems improvement, I encounter several challenges and opportunities.

Challenges

  • The ever-changing nature of community context, including social and physical contexts
  • Community data sets too vague to inform problem definition and potential solutions
  • Large forces influencing the emergence and persistence of problems
  • The comparatively small variables we can build into solutions, set against the large forces that cause the problems
  • Many competing theories and approaches that explain pieces of the overall picture, but few that explain enough of it to guide population-level solutions
  • Very little funding for solution development or for evaluation and applied research

Opportunities

  • Same list as above

Lesson Learned: Potential solutions and evaluation activities must draw heavily on an open-systems framework, given the open context of communities. Related evaluation activities should capture process variables, intermediate systems changes (e.g., new and/or modified programs, policies, and practices), and the population-level outcomes of those systems changes.

Lessons Learned: A variety of strategies for behavior change and broader community systems improvement are needed, in varying amounts at varying times; some artistry is required. The outcome has to matter. Many people need to be involved in implementation, and many solutions are needed, which requires significant planning and capacity building. A few people need to be tasked with coordination and follow-through, and this takes real vision and significant leadership. Selecting the right people is important, but so is building their capacity – and that of others – to make a difference. Only then are changes large enough, and long-lived enough, to affect the outcomes we seek at the community level. The same is true for targeted programs as well as for broad community coalitions and partnerships.

The result is a focus on approaches that include building the capacity of others to do this work.

Rad Resource: This capacity-building focus was part of the spirit behind the development of the Community Tool Box, a web-based resource for building the capacity of people to develop community-level solutions. Alongside the training materials, we added several online resources that help people organize their data for ongoing feedback and improvement.

Rad Resource: We developed the CTB Work Stations to allow programs and community initiatives to track implementation of the solutions they develop and how that implementation relates to changes in selected population-level outcomes. These outcomes might be community health and development issues, behaviors, or changes in risk and protective factors.
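As a rough illustration of this kind of tracking (a sketch only, not CTB code; every name and number below is made up), one can log each systems change with a date and type, then read the cumulative count alongside a population-level indicator:

```python
# Illustrative sketch only – not CTB code. Log community systems
# changes (new/modified programs, policies, practices) and relate
# their cumulative count to a population-level outcome over time.
from dataclasses import dataclass
from datetime import date

@dataclass
class SystemsChange:
    when: date
    kind: str          # "program", "policy", or "practice"
    description: str

changes = [
    SystemsChange(date(2013, 2, 1), "program", "New after-school tutoring site"),
    SystemsChange(date(2013, 5, 15), "policy", "Smoke-free parks ordinance"),
    SystemsChange(date(2013, 9, 3), "practice", "Clinics adopt a screening protocol"),
]

# Hypothetical quarterly population-level indicator for the same period
# (e.g., percent of youth reporting a risk behavior).
outcome = {"2013Q1": 18.0, "2013Q2": 17.5, "2013Q3": 17.1, "2013Q4": 16.8}

# Print cumulative systems changes next to the indicator for each
# quarter so the two trends can be eyeballed together.
for quarter_label, value in outcome.items():
    year, q = int(quarter_label[:4]), int(quarter_label[5])
    cutoff = date(year, q * 3, 1)  # first day of the quarter's last month
    n = sum(1 for c in changes if c.when < cutoff)
    print(f"{quarter_label}: {n} cumulative changes, outcome = {value}")
```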


· ·

Hi everyone, I am Jyoti Venketraman, Director of Special Initiatives at the New Jersey Coalition Against Sexual Assault. I was initially hesitant to blog, as I don’t consider myself an expert. But my awesome colleague Liz Zadnik, aea365 Outreach Coordinator, and a recent blog post by Sheila B Robinson, aea365’s Lead Curator, made me realize I don’t have to be an expert to contribute and can share my individual lessons learnt! In that spirit: a project I am currently undertaking involves collaborating with diverse communities. Evaluation plays a major role, as it helps us answer two important questions: Are we making a difference? Are we good stewards of the resources we are using? Below are a few crumbs of knowledge I have learnt in my evaluation journey so far.

Lesson Learned: Communities and individuals value different things from a project or intervention. I learnt this early in my career as an evaluation newbie. I find that when evaluation tools factor in differing stakeholder perceptions of what constitutes “success,” you get a more holistic picture of a specific project’s actual impact within that community. These perceptions may run counter to stated project objectives, but with well-planned and thoughtful stakeholder involvement you can ensure you capture them.

Lesson Learned: History matters. Historical context, historical trauma, and the trajectory of development a community has taken can all be critical variables. Some community members may be more aware of this history than others. I have learnt that, as evaluators, we have to be open and intentional in affirming and acknowledging it in our practice.

Lesson Learned: Be open to a variety of data collection methods. One of the reasons I like storytelling is that it accommodates diverse views, provides richer context, and gives a window into how communities view the “success or impact” of a specific project.

Rad Resource: Many of my resources come from this blog or from what I have collected in my journey so far. On cultural humility, I like Cultural Humility: People, Principles & Practices by Vivian Chavez (2012).

Rad Resource: On context in evaluation, I like Participatory research for sustainable livelihoods from the International Institute for Sustainable Development.

Rad Resource: On storytelling, I like the CDC’s resource on Telling Your Program’s Story, The California Endowment’s resource on Storytelling Approaches to Program Evaluation, and the Center for Digital Storytelling.


· ·

I’m Catherine (Brehm) Rain of Rain and Brehm Consulting Group, Inc., an independent research and evaluation firm in Rockledge, Florida. I specialize in Process Evaluation, which answers the questions Who, What, When, Where, and How in support of the Outcome Evaluation. Field evaluations occur in chaotic environments where change is a constant. Documenting and managing change using process methods helps inform and explain outcomes.

Lesson Learned: If you don’t know which events influenced a program, or how, chances are you won’t be able to explain the reasons for its success or failure.

Lesson Learned: I’m a technology fan, but I’m also pretty old-school. Like Caine in the legendary TV show Kung Fu, I frequently conjure up the process evaluation ‘masters’ of the 1980s and ‘90s to strengthen the foundation of my practice and to regenerate those early ‘Grasshopper’ moments of my career.

Old-school? Or enticingly relevant? You decide, Grasshopper! I’ll share a few with you.

Hot Tip: Process evaluation ensures you answer questions of fidelity (to the grant, program, and evaluation plan): did you do what you set out to do with respect to needs, population, setting, intervention, and delivery? When these questions are answered, a feedback loop is established so that necessary modifications to the program or the evaluation can be made along the way.
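To make that feedback loop concrete, here is a minimal fidelity-log sketch (my own, with entirely hypothetical entries, not a published tool) that compares planned against delivered on each process dimension and flags deviations for mid-course correction:

```python
# Minimal fidelity-log sketch (hypothetical entries): compare planned
# vs. delivered on each process dimension and flag deviations so the
# program or evaluation plan can be adjusted along the way.
planned = {
    "population": "9th-grade students at two pilot schools",
    "setting": "after-school sessions, school library",
    "intervention": "8-session curriculum, full sequence",
    "delivery": "weekly, facilitator-led",
}

delivered = {
    "population": "9th- and 10th-grade students at two pilot schools",
    "setting": "after-school sessions, school library",
    "intervention": "6 of 8 sessions (two cut for snow days)",
    "delivery": "weekly, facilitator-led",
}

for dimension in planned:
    match = planned[dimension] == delivered[dimension]
    flag = "OK" if match else "DEVIATION -> document reason and adapt"
    print(f"{dimension:12s} {flag}")
    if not match:
        print(f"  planned:   {planned[dimension]}")
        print(f"  delivered: {delivered[dimension]}")
```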

Rad Resource: Workbook for Designing a Process Evaluation, produced by the State of Georgia, contains hands-on tools and walk-through mechanics for creating a process evaluation. The strategies incorporate the research of several early masters, including three I routinely follow:  Freeman, Hawkins and Lipsey.

Hot Tip: Life is a journey – and so is a long-term evaluation. Stuff happens. However, it is often in the chaos that we find the nugget of truth, the unknown need, or a new direction to better serve constituents. A well-documented process evaluation helps programs ‘turn on a dime’, adapt to changing environments and issues, and maximize outcome potential.

Rad Resource: Principles and Tools for Evaluating Community-Based Prevention and Health Promotion Programs by Robert Goodman includes content on the FORECAST Model, designed by two of my favorites (Goodman & Wandersman), which enables users to plot anticipated activities against resultant deviations or modifications in the program and evaluation.
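In the spirit of that plotting idea (a simplified sketch of my own, not Goodman & Wandersman’s actual tool; the milestones are invented), one can chart anticipated against actual activity timing so that deviations show up as horizontal gaps:

```python
# Illustrative sketch: plot anticipated vs. actual activity timing so
# deviations appear as horizontal gaps. Milestones are hypothetical.
import matplotlib.pyplot as plt

planned = {"Hire staff": 1, "Recruit sites": 3, "Begin sessions": 5,
           "Mid-course review": 8, "Final data collection": 12}
actual = {"Hire staff": 1, "Recruit sites": 4, "Begin sessions": 7,
          "Mid-course review": 8, "Final data collection": 12}

names = list(planned)
y = range(len(names))
plt.scatter([planned[n] for n in names], y, label="anticipated", marker="o")
plt.scatter([actual[n] for n in names], y, label="actual", marker="x")
plt.yticks(y, names)
plt.xlabel("Project month")
plt.legend()
plt.title("Anticipated vs. actual activities")
plt.tight_layout()
plt.show()
```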

Hot Tip: If you give process evaluation short shrift, you may end up with a Type III error – primarily because the program you evaluated is not the program you thought you evaluated!

Rad Resource: Process Evaluation for Public Health Research and Evaluations: An Overview by Linnan and Steckler discusses Type III error avoidance as a function of process evaluation. The authors also discuss the historical evolution of process evaluation through several masters, including but not limited to Cook, Glanz, and Pirie.


