AEA365 | A Tip-a-Day by and for Evaluators


Hello! I'm Carrie Petrucci, MSW, Ph.D. I came to evaluation by doing it before I knew I was doing it. Truth be told, my mixed-methods dissertation was indeed an "evaluation." However, I discovered early on that I didn't dare call it that, because evaluation research was "pooh-poohed" at most research universities (at that time, anyway). As an example, in my ethnography class, I'll never understand why another student's ethnographic observation in a "laundree-mat," as he called it, was somehow more valued by the instructor than my court observations, but there it was.

I had earned my master's in Social Welfare before going back to school for my Ph.D., and had worked as a child protective services worker and as a program director in community corrections, so there was no escaping the practical leanings of my work. My Ph.D. was in Social Welfare with a self-imposed minor in criminal justice. For both my MSW (1991-1993) and my Ph.D. (1998-2002), very few people understood why I was combining these two disciplines, but to me the answer was simple: that's where our clients were (in jail or prison). Sadly, the statistics bear this out, then and now. Early on, I found common ground for combining social welfare and criminal justice in scholars such as Michael Tonry, Norval Morris, David Wexler, Bruce Winick, Richard Enos, Joan Petersilia, and Al Roberts. Later there would be many more. So what's the point of this story?

Hot Tips:

  • First, be passionate about your work, and don't be dissuaded by others who may not share your point of view. My interdisciplinary approach was not "in vogue" when I initially pursued it, but it has since become highly valued.
  • Second, find mentors who share your passion, or at least parts of it. I was incredibly fortunate to have an MSW research advisor and a dissertation committee that stood by my interdisciplinary approach. And it remains very much a part of my work almost 20 years later.
  • Third, trust your instincts, but also come to understand why you do what you do and the evidence that supports it, and explore the reasons against it as well. What other scholars and experts in the field share your view? What evidence do they provide? What about the "naysayers" on how you do what you do? Learn from all of them.
  • Finally, as a contracted evaluator, it may take a few years, but work to get to a place in which you’re only taking on projects that matter to you. The level of detail in this work is overwhelming, and in my opinion, the best way to maintain high standards is to care about what we do.

One last point: caring about what we do doesn't mean we lack objectivity, but that's another blog.

The American Evaluation Association is celebrating SW TIG Week with our colleagues in the Social Work Topical Interest Group. The contributions all this week to aea365 come from our SW TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Greetings! My name is Jennifer Lyons, MSW, of Lyons Visualization, LLC. I am a social worker, data designer, and speaker. In my independent consulting business, I bring creative energy to making data intriguing and impactful while helping clients transform the way they communicate their story. Today I want to talk about a method I like to use that engages clients in the interpretation of data while also setting the stage for an impactful visual summary of findings.

In this post, I am going to focus on a process to use after data is collected and analyzed. After analysis, it is time to dive in and highlight the story within the data. Part of storytelling with data is making meaning of the information in context. Our clients are the experts on the delivery of their programs, the people they work with, and the reporting context, so it is important to include them in thoughtful interpretation of their data. Specifically, I will focus on using a worksheet to guide a data interpretation meeting and transform findings into a visual summary.

Hot Tip: Start by designing the data interpretation worksheet. This worksheet is the backbone of a visual executive summary of your findings. Below is an example of a simple data interpretation worksheet made for an evaluation of an after-school reading program. Included are graphic displays of the data with blank boxes that give space for clients to add their interpretation. During the data interpretation meeting, you can use this worksheet to partner with clients to highlight and frame central findings in the data.

Hot Tip: Paste each graph from the worksheet on an empty slide and ask your clients to examine each data point. Prompt them with questions about what they see as positive, negative, and surprising in the findings. It is also important to ask your clients to think of relevant context. As a group, process everyone's recommendations and thoughts. A single graph often shows many important things, but together you can decide what matters most. Then, write the most important takeaway(s) in the graph title. Repeat this process for each graph, and by the end, every graph on the worksheet will carry its takeaway as its title.
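To make the takeaway-as-title idea concrete, here is a minimal Python/matplotlib sketch; the reading-program numbers, labels, and colors are invented for illustration and are not from the original worksheet.

```python
import matplotlib.pyplot as plt

# Hypothetical pre/post reading results -- invented for illustration only.
groups = ["Pre-program", "Post-program"]
pct_at_grade_level = [42, 68]

fig, ax = plt.subplots(figsize=(6, 4))
ax.bar(groups, pct_at_grade_level, color=["#bdbdbd", "#2b8cbe"])

# The takeaway agreed on with the client becomes the chart title,
# instead of a generic label like "Reading Scores".
ax.set_title("Students reading at grade level rose from 42% to 68%",
             loc="left", fontsize=12, fontweight="bold")
ax.set_ylabel("Percent of students at grade level")
ax.set_ylim(0, 100)

# Label the bars directly so the numbers in the takeaway appear on the chart.
for i, pct in enumerate(pct_at_grade_level):
    ax.text(i, pct + 2, f"{pct}%", ha="center")

plt.tight_layout()
plt.show()
```

The same pattern applies in any charting tool: the title slot holds the agreed-upon finding rather than a variable name.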

Hot Tip: The completed worksheet can easily be transformed into a visual summary of your findings. A few elements are still missing for it to work as a visual executive summary: carry over the effective titles from the worksheet, use color to showcase your story, and add an engaging visual.

Ta-da!  You have a nice visual report based on thoughtful data interpretation using your client’s feedback and expertise.  My hope is that by reading this post, you are more inspired to think of new ways to engage your client in the data and visually display findings.

The American Evaluation Association is celebrating SW TIG Week with our colleagues in the Social Work Topical Interest Group. The contributions all this week to aea365 come from our SW TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi all! My name is Rachel Vinciguerra and I am a master's student in international development and social work at the University of Pittsburgh. This summer I worked on two program evaluations in Haiti: one a mid-point evaluation for a girls' empowerment program, the other a preliminary design of M&E protocols for an emerging foster care agency. Coming from a social work background, and as an American evaluator working in Haiti, it was especially important to me that the studies were culturally responsive and treated marginalized groups as major stakeholders.

Ultimately, it came down to sharing power with these groups throughout the evaluation process. I found that, when we put them at the center of design, implementation, and presentation, results were richer.

Hot Tip #1: Identify marginalized groups.

  • There are two pieces to this. First, you have to begin with considerable knowledge of the culture and community in which you are working in order to understand its specific, and often complex, hierarchies of power. Second, you have to allow that structural knowledge to contextualize your early conversations with stakeholders in order to identify those groups in the program whose voices are not often heard.

Hot Tip #2: Engage marginalized groups on the same level as your organizational client.

  • Consider how you engage your organizational client as you plan for evaluation. Are they telling you what questions they want answered? Are you working with them to develop a theory of change model? Are you collaborating on the timeline of the evaluation? Now consider the marginalized groups in your evaluation and share power in the same way with them. They may be beneficiaries of the program, but they may also be groups within the organization that hired you.

Hot Tip #3: Ensure evaluation results can be understood by all involved.

  • It's Research 101: human subjects deserve access to the knowledge and research they help generate, and you can make sure they get it. In the evaluations I worked on, this meant translating all reporting into Haitian Creole and communicating the results in the same diverse modalities I had used for my client.

Lessons Learned:

  • Be patient. Be flexible. Be humble. Make and maintain space in your design to be responsive to marginalized groups and be ready to adapt quickly and with humility as needed.


The American Evaluation Association is celebrating SW TIG Week with our colleagues in the Social Work Topical Interest Group. The contributions all this week to aea365 come from our SW TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello! My name is Jennifer Obinna and I am Research and Evaluation Director at The Improve Group, a consulting firm in Minnesota. Our evaluation practice strongly supports and encourages evaluation capacity development with our clients. One way we do so is by facilitating logic model clinics that help clients and their stakeholders articulate the desired state of their program implementation. As I often serve as an Empowerment Evaluator for sexual and domestic violence prevention efforts, I will highlight resources from that work that can be applied to evaluation in any sector.

Social work's person-in-environment perspective considers an individual, and that individual's behavior, within the environmental context in which the person lives and acts. In our logic model clinics, we borrow the social ecological model from public health to think through the person-in-environment as a program planning tool.

[Image: the social ecological model. Source: Centers for Disease Control and Prevention. Sexual violence prevention: beginning the dialogue. Atlanta, GA: Centers for Disease Control and Prevention; 2004.]

We ask stakeholders a series of questions that inform the logical framework for their program in its fully implemented, desired state.

  • What assumptions—beliefs about the program, the people involved, the context, and the way the program works—do we hold?
  • What resources do we have to support this?
  • What activity components do we do that lead to outcomes?
  • Who do we reach/influence?
  • What do we produce/deliver?

For individual and relationship-level outcomes, we ask:

  • What changes in learning, knowledge, attitude, skills, or understanding do we see?
  • What changes in behavior, practices, or decisions do we see?

For community- and societal-level outcomes, we ask:

  • What changes to institutions do we see?
  • What changes in condition in the community do we see?
  • What changes in social norms do we see?

To dive deeper into outcomes, we use the ABCDE method of writing outcomes, which spells out the Audience, Behavior, Condition, Degree, and Evidence. Example: "By the end of the program, 80 percent of program participants will be able to list two or more positive ways to communicate with peers, as evidenced by the results of a pre-test/post-test survey activity."
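As a small illustration of checking such an outcome against data, here is a hedged Python sketch; the column names and scores are hypothetical, not from an actual clinic.

```python
import pandas as pd

# Hypothetical post-test results: how many positive communication
# strategies each participant could list. All values are invented.
post = pd.DataFrame({
    "participant": range(1, 11),
    "strategies_listed": [3, 2, 1, 4, 2, 0, 2, 3, 2, 5],
})

# "D" (degree) from the outcome statement: 80 percent of participants.
target_pct = 80
# "B" (behavior): list two or more positive ways to communicate.
met_pct = (post["strategies_listed"] >= 2).mean() * 100

print(f"{met_pct:.0f}% listed two or more strategies "
      f"(target {target_pct}%): {'met' if met_pct >= target_pct else 'not met'}")
```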

Rad Resource: The Ohio Domestic Violence Network Primary Prevention of Sexual and Intimate Partner Violence Empowerment Evaluation Toolkit.

Hot Tip: Stakeholders are not always enthusiastic about using paper or electronic pre-/post-tests.  Therefore, we encourage “activity-based assessments,” a method that integrates evaluation into the program experience or educational curricula.

The person-in-environment perspective calls on us to make sure our logic models include "external influences": factors outside the implementer's control (positive or negative) that may affect the outcomes and impact of the program or project.

Cool Trick: The clinic needs to be 2 to 3 hours long per program. We convene two or three program designers/implementers in a room with a 12-by-6-foot portable sticky wall (nylon sprayed with spray mount) to lay out the logic model together in a linear way. Once it is documented, we encourage program staff to put the logic model elements into a diagram, shape, or symbol that resonates with them and their stakeholders. See page 15 of the Discovery Dating DELTA Evaluation Report for an example.

The American Evaluation Association is celebrating SW TIG Week with our colleagues in the Social Work Topical Interest Group. The contributions all this week to aea365 come from our SW TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Some thoughts on teaching evaluation to social work students… I'm Brandon W. Youker, social worker, evaluator, and professor at Grand Valley State University in Grand Rapids, Michigan. After a dozen-plus years in higher ed, I've developed a commitment to community-based learning (CBL) as my primary pedagogy for teaching future social workers about program evaluation. Across numerous program evaluation courses, I have divided students into small evaluation teams and asked them to design, conduct, and report on evaluations for local non-profit organizations and programs.

The benefits to students include learning by doing (experiential learning) and honing their tools for thought, since evaluation is one of the highest-order thinking skills in Bloom's Taxonomy. Students also report enjoying the realism of the course and its projects, as they work in real environments with real programs that have a real impact on real people. Lastly, students not only learn about evaluation but also learn through serving some of the community's most vulnerable and disenfranchised populations.

The organizations and programs benefit by receiving high-quality, independent, pro bono evaluation and evaluation consulting. The projects have enhanced organizations' evaluation capacity by prompting deeper, more intentional thinking about evaluation and about program and consumer outcomes, and the organizations keep the student-created data collection instruments to use or adapt.

It's important to collaborate with the organizations to develop multi-semester, multi-course evaluation strategies, as well as to create relevant lectures and meaningful assignments. In terms of scholarship, these partnerships have led to presentations at academic conferences and to journal publications. The evaluation projects allow me to serve my community, which in turn serves the university and the social work profession while building relationships with the local community.

Yes, there are obstacles to overcome. Nevertheless, the potential benefits clearly outweigh the effort for the students, community partners, and instructors. Besides, there are numerous CBL resources for course instructors.

I believe that evaluation is a social work tool for social justice. Thus, it is incumbent upon educators to encourage and support realistic and practical CBL experiences, which will ultimately lead to competent social workers who support sound evaluation and evidence-based practices and programs.

Hot Tips:

Most colleges and universities have CBL resources, guidelines, and policies to assist instructors (see the Association of American Colleges & Universities, which lists CBL as one of ten high-impact educational practices [https://www.aacu.org/leap/hips]).

Rad Resources:

There is a robust literature on CBL and service learning (its benefits and obstacles, as well as suggestions for implementation), and a few articles discuss CBL specifically in program evaluation courses. Newcomer (1985) provides a call to action for CBL pedagogy in program evaluation courses, while Oliver, Casiraghi, Henderson, Brooks, and Mulsow (2008) describe various evaluation pedagogies. Shannon, Kim, and Robinson (2012) discuss CBL for teaching evaluation and offer practical suggestions for doing so; and Campbell (2012) provides a guide for implementing CBL in social work courses.

Thanks for your interest and please contact me to discuss CBL further.

The American Evaluation Association is celebrating SW TIG Week with our colleagues in the Social Work Topical Interest Group. The contributions all this week to aea365 come from our SW TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m Nicole Clark, licensed social worker and owner of Nicole Clark Consulting, where I partner with community-based groups, local/national organizations, schools and more to design, implement, and evaluate programs and services geared toward women and girls of color.

In 2016, I shared on AEA365 why it matters that evaluators know the difference between the types of social workers we engage with. Today, I’m discussing ethics, its role in social work, and how it all aligns with AEA’s Guiding Principles for Evaluators.

Rad Resource: The first edition of the National Association of Social Workers' Code of Ethics was approved on October 13, 1960. Since its last revision in 2008, the Code of Ethics has become the standard for social workers throughout the field, in many organizations, and in state social work licensing laws.

Hot Tip: Section 5 of The NASW Code of Ethics focuses on social workers’ ethical responsibilities to the profession. In section 5.02 of the Code of Ethics (titled “Evaluation and Research”), social workers engaged in evaluation should:

  • Monitor and evaluate policies, implementation of programs, and practice interventions
  • Promote and facilitate evaluation and research to contribute to the development of knowledge
  • Critically examine and keep current with emerging knowledge relevant to social work and fully use evaluation and research evidence in their professional practice
  • Follow guidelines developed for the protection of evaluation and research participants
  • Obtain written informed consent or assent from participants/guardians, disclosing the nature, extent, and possible benefits and risks associated with evaluation and research (and inform participants of their right to withdraw from an evaluation at any time without penalty)
  • Take appropriate steps to ensure that participants have access to appropriate supportive services, and take appropriate measures to protect participants from mental distress or unwarranted physical harm during an evaluation or study
  • Discuss information related to an evaluation or study only with individuals professionally related to the evaluation
  • Accurately report findings, take steps to correct errors found in published data using standard procedures, and ensure the confidentiality of program and study participants when reporting findings and research results
  • Educate themselves, students, and colleagues about responsible evaluation and research practices.

Lesson Learned: The NASW Code of Ethics aligns with AEA's Guiding Principles for Evaluators: both serve as cornerstones of sound, responsible, ethical behavior for social work evaluators. Both focus heavily on the client-professional relationship by highlighting the dignity of our clients and the overall societal contribution of evaluation. AEA's Guiding Principles, however, take up where the Code of Ethics leaves off by placing greater emphasis on stakeholder engagement in the promotion of collaborative inquiry, equity, and cultural responsiveness related to race, gender, religion, age, and more.

What better profession for social workers to be aligned with than evaluation?

The American Evaluation Association is celebrating SW TIG Week with our colleagues in the Social Work Topical Interest Group. The contributions all this week to aea365 come from our SW TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

We are Mansoor A.F. Kazi and Yeongbin Kim (University at Albany), realist evaluators for the Substance Abuse and Mental Health Services Administration (SAMHSA) System of Care Expansion grant at Chautauqua County, New York.

We use research methods drawn from both epidemiology and effectiveness research traditions, in partnership with human service agencies, to investigate which programs of intervention work and for whom.

Lesson Learned: The emphasis is on data naturally drawn from practice, and therefore quasi-experimental designs can be used with demographic variables to match intervention and non-intervention groups. Binary logistic regression can be used as part of epidemiologic evidence based on association, environmental equivalence, and population equivalence.  In this way, evaluators and agencies can make the best use of the available data to inform practice.

The realist evaluation paradigm focuses on investigating how interventions may work and in what circumstances. This approach essentially involves the systematic analysis of data on:

  • Service users’ circumstances (e.g., demographic characteristics);
  • Dosage, duration, and frequency of each intervention for each service user;
  • Repeated use of reliable outcome measures with each service user.

Hot Tip: Realist evaluators work in partnership with human service agencies to clean data and undertake data analysis with them at regular intervals, not just at the end of the year. This way, evaluators and agencies can work together in real time to evaluate the impact of interventions on desired outcomes, using innovative methods and addressing issues relevant to practice, including diversity and where and with whom the interventions are more or less effective.

As the data mining includes all service users (e.g., all students within a school district), it is possible to investigate differences in outcomes between intervention and non-intervention groups, and these groups can be matched using the demographic and contextual data. Binary logistic regression can then be used to investigate which interventions work and in what circumstances. Variables that may be influencing the outcome are identified through bivariate analysis and entered into a forward-conditional model; variables that actually influence the outcome are retained in the equation, and those that are significant provide an exponentiated beta (odds ratio) indicating the odds of the intervention achieving the outcome where the significant factor(s) are present. This approach is used extensively in Chautauqua County, New York, including school districts and human service agencies, and the county (Chautauqua Tapestry) received the Gold Award for Outstanding Local Evaluation in 2010 from the federal agency SAMHSA.
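As a hedged sketch of what this kind of analysis can look like, the following Python example simulates service-user data, runs a simple forward selection over candidate predictors, and reports exponentiated betas (odds ratios). The variable names and data are invented, and the selection loop is a rough stand-in for the forward-conditional stepwise option found in statistical packages, not the authors' actual workflow.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical service-user data -- every name and value here is invented.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "intervention": rng.integers(0, 2, n),   # 1 = received the program
    "age": rng.integers(12, 18, n),
    "female": rng.integers(0, 2, n),
    "prior_referrals": rng.poisson(1.5, n),
})
# Simulated binary outcome that the intervention genuinely improves.
true_logit = -1.0 + 0.9 * df["intervention"] - 0.3 * df["prior_referrals"]
df["improved"] = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(int)

candidates = ["intervention", "age", "female", "prior_referrals"]
selected = []

# Crude forward selection: repeatedly add the candidate with the smallest
# significant p-value (p < .05), in the spirit of a forward-conditional run.
while True:
    best_p, best_var = 0.05, None
    for var in (v for v in candidates if v not in selected):
        X = sm.add_constant(df[selected + [var]])
        fit = sm.Logit(df["improved"], X).fit(disp=0)
        if fit.pvalues[var] < best_p:
            best_p, best_var = fit.pvalues[var], var
    if best_var is None:
        break
    selected.append(best_var)

final = sm.Logit(df["improved"], sm.add_constant(df[selected])).fit(disp=0)
# exp(beta) is the odds ratio: e.g., exp(b) for "intervention" is the odds of
# improvement for the intervention group relative to the comparison group.
print(np.exp(final.params))
```

In a real evaluation, the candidate list would come from the bivariate screening described above, and the exponentiated beta on the intervention term would be read as the change in the odds of achieving the outcome associated with receiving the intervention.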

The American Evaluation Association is celebrating SW TIG Week with our colleagues in the Social Work Topical Interest Group. The contributions all this week to aea365 come from our SW TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m Alice Walters, a member of AEA’s Social Work TIG.  I am a doctoral student in human services and work as a non-profit consultant in fund development, marketing, and evaluation.  In my experience, reluctant stakeholders can be a common evaluation challenge.  I present the value of social work insight and appreciative inquiry methods for successfully engaging stakeholders.

Lesson Learned: I learned stakeholders may not share your evaluation exuberance. This may come as a shock, but someone has to bear the bad news: not everyone embraces evaluation! There are many barriers to evaluation, and stakeholder reluctance is a common one. Below are a few tips and lessons for meeting this challenge, including a focus on Appreciative Inquiry as an evaluation strategy.

Hot Tip: Begin an evaluation with honest conversations with stakeholders about their concerns. Stakeholders may fear scrutiny, poor evaluation outcomes, time commitments, additional responsibilities, or "needless" intrusions. You won't know the barriers until you ask. Negative stakeholder attitudes can sabotage the best evaluation plans unless addressed.

Lesson Learned: Understanding stakeholder reluctance and openly addressing concerns is the first step to a positive evaluation experience for all involved. Social workers are trained to recognize and work with defensive people. How do we do it? We listen, a lot. We watch for clues like body language or a failure to speak up. Then we empathize and focus on helping find positive motivations for participation. Positive approaches to evaluation exist, and Appreciative Inquiry is one example.

Hot Tip: Appreciative Inquiry (AI) is a particularly appropriate strategy for engaging reluctant stakeholders in a strengths-based process. AI focuses evaluation on what is going well, which may reassure reluctant stakeholders who fear negative outcomes. AI is also highly participatory, engaging multiple stakeholders. Its steps follow a strengths-based "4D" method: Discovery, Dream, Design, and Destiny. Questions to stakeholders are framed positively: "Remember a time when something went particularly well. What was that like?"

Rad Resource:  More information on AI is available at Case Western Reserve University.

Rad Resource:  A great resource is the AEA Public eLibrary.  This is a searchable database for discussion posts, shared files, and blogs.  It is a great resource for finding and connecting to current developments in your evaluation niche.  Many AEA conference presenters upload their contributions in this space.

Rad Resource: Gail Barrington's "Using Appreciative Inquiry to Evaluate Learning Circles: Some Early Lessons," culled from the AEA eLibrary, shows appreciative inquiry applied to an evaluation.

Help your reluctant stakeholders appreciate inquiry using these tips on people skills and creative evaluation strategies.

The American Evaluation Association is celebrating SW TIG Week with our colleagues in the Social Work Topical Interest Group. The contributions all this week to aea365 come from our SW TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Courtney Barnard and I am a social worker, coalition coordinator, and program evaluator for a children’s health care system in Fort Worth, Texas.

A key component of social work practice is the assessment process, whether at the individual, family, community, or systems level. At the heart of assessment is a respectful curiosity: always asking questions and challenging assumptions. Assessment blends the client's experience with the evaluator's professional knowledge, all to inform a well-thought-out plan of intervention.

Hot Tip: Assessment is a continuous process. You can modify and apply the assessment steps outlined below at any point in an evaluation (planning, addressing challenges, writing recommendations; anytime you need more information to determine the next steps).

  1. Exploration: Listen to your client’s unique story to gain and organize information. Listen for contradictions, expectations, and things left unsaid. Ask open-ended questions to get more information and to identify areas of further exploration.
  2. Inferential thinking: Apply your knowledge of evaluation to the information gathered during exploration to guide the development of your intervention. The conclusions you make in this phase may be inaccurate or incomplete; these can be tested and corrected in later phases.
  3. Evaluation: Look at the strengths and needs of the client and the environment in the given situation. It is important to understand the client's motivation for change, the resources available, and, realistically, the ability of the client or environment to adapt to change.
  4. Problem definition: You and the client mutually agree on how to define the problem, how to frame it within its surrounding context, and what is achievable in the given timeframe. At this point, evaluators must strive to understand the whole situation while acting on one part of it (think global, act local).

Lesson Learned:

“A disciplined professional determines intervention through a carefully constructed assessment framework. This is the science of the process. The art is the [evaluator’s] own professionally developed style, area of specialization, […] and personality.” – Sonia G. Austrian

The American Evaluation Association is celebrating SW TIG Week with our colleagues in the Social Work Topical Interest Group. The contributions all this week to aea365 come from our SW TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello, I’m Jade Jackson.  I am an Evaluation Specialist at Lutheran Immigration and Refugee Service (LIRS) in Baltimore, MD.  My organization is a national refugee resettlement agency that provides social services to migrants and refugees.

I want to share with you some lessons learned from my personal transition to evaluator, as well as tips for agencies new to evaluation and evaluative concepts. I began my career in the programs department of my agency, so I came in with experience working alongside program managers. I moved into evaluation because of my desire to improve programming for the clients my agency serves. I currently serve on a two-person Monitoring and Evaluation team at LIRS.

Lessons Learned:

  • Start slow and small – Introducing evaluation into your agency culture can be quite the rollercoaster ride. It's easy to agree with evaluative concepts like continuous improvement and learning; however, buy-in is less likely and resistance greater when evaluation delves into data collection, definitions of goals and outcomes, and the setting of indicators. Be sure to allow for the learning curve and consensus-building that will be necessary in promoting evaluation in your agency.
  • Make it relevant – Evaluation provides opportunities to reflect on program design and on how our interventions affected results. Use examples directly related to your agency's scope of work to engage staff. Staff will value evaluation if they see a direct connection to their program's overall improvement. For example, after providing a general all-staff training on monitoring and evaluation, my team partnered with a staff member from our Visitation program to deliver a tailored webinar on monitoring and evaluation as it applies to that program.
  • Talk about the beginning, not just the end – Educate staff on the importance of engaging in evaluative thinking from the onset of program design. By bringing up evaluation at the beginning of a project, you can foster a culture that incorporates evaluation at all levels of program implementation, which yields tremendous insight when evaluating the project's impact.

Hot Tip: Engage the champions of evaluation.  These are the people in your agency who are excited about evaluation and come to you with questions.  These people do exist.  Encourage these colleagues to discuss their interest in evaluation with other agency leaders, who can use evaluation findings to make key management decisions.

Rad Resource: I have found incredible resources and solutions to problems by accessing my network at AEA. When you work on a small team, it's important to look to others for insights and promising practices that can improve your work. Additionally, reaching out to others provides valuable feedback. Post your questions to the EvalTalk listserv!

The American Evaluation Association is celebrating SW TIG Week with our colleagues in the Social Work Topical Interest Group. The contributions all this week to aea365 come from our SW TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

