AEA365 | A Tip-a-Day by and for Evaluators

I am Elizabeth Tully, the Online Toolkit Manager at the Johns Hopkins Bloomberg School of Public Health’s Center for Communication Programs (JHU·CCP). One of the toolkits that I work on regularly is the Measurement, Learning & Evaluation (MLE) Project’s Measuring Success Toolkit. This toolkit provides guidance on how to use data to plan a health program and to measure its success through monitoring and evaluation (M&E). Using data to design and implement health programs leads to more successful and impactful programs. Data can be used to solve health programming problems, inform new program design, assess program effectiveness and efficiency, and suggest evidence-based adaptations and improvements.

But this post is especially about the importance of designing useful resources for M&E practitioners – and the Measuring Success Toolkit is rad!

Hot Tip #1: Using the Toolkit. The Toolkit is meant to be used! It offers full text documents and usable tools that are in the form of an uploaded file that a user can download or a hyperlink to another website’s rad resources. It is organized both by steps in an M&E plan and by health topic. A handy Toolkit tutorial video is also available to assist new users in navigating the Measuring Success Toolkit.

Hot Tip #2: Curated Content Focuses on Use. The Measuring Success Toolkit team updates the toolkit with useful resources every quarter! This means that the content is curated by M&E experts and includes guides, checklists, protocols, indicators, and other tools that can be used by M&E practitioners in the field. While you won’t find peer-reviewed journal articles or lengthy end-of-project reports in this toolkit, you will find the tools and resources to help you plan for, monitor, and evaluate a program that would be worthy of an esteemed journal. You’ll certainly be prepared to document your project’s success!

Rad Resources: Within the toolkit you’ll find plenty of great tools – 160 and counting! Here’s a quick list of our most popular (and most downloaded) resources on the site:

Sample Outline of an M&E Plan

Qualitative Research Methods: A Data Collector’s Field Guide

A Guide for Developing a Logical Framework

Please send suggestions for resources to include to contactus@urbanreproductivehealth.org. Our next update will be in mid-October!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello! My name is Angela Fitzgerald and I am a Senior Researcher with the National Council on Crime and Delinquency (NCCD: www.nccdglobal.org) – a nonprofit that works to promote just and humane social systems. I have been involved in and have witnessed evaluation work from a number of vantage points, and a challenge that seems to consistently plague organizations is engaging community members (one subset of stakeholders) in the evaluation process. Engaging community members in the evaluation process helps to ensure that evaluation content is understood and of relevance to that audience, identifies advocates who can champion the projects being evaluated, and lends credibility to evaluation findings. I have compiled and listed a few ‘Lessons Learned’ to help organizations overcome the challenge of engaging community members.

Lessons Learned: Develop relationships with community organizations and groups. People are more likely to invest time in something with which they are already familiar or to which they belong. For an organization undertaking an evaluation project, this may require connecting to other organizations or groups to which your desired audience belongs. Developing relationships with other community-based organizations or groups creates ambassadors who are willing to help recruit individuals on your behalf to participate in the evaluation.

Make the process accessible. Engaging community members may require a different process than engaging other types of stakeholders. For example, community members who want to be involved may not be available during the work day. Providing opportunities for engagement through different mediums (web-based, telephone, in-person) and during non-traditional work hours will help to maximize opportunities for individuals to become involved in the evaluation process. Also taking into consideration potential barriers to engagement for community members and working to overcome them (e.g., scheduling meetings in locations that are accessible via public transportation) signifies that you care about their involvement.

Follow-up with your audience after the evaluation is complete. Completing a proper evaluation takes much time and effort, and once completed it’s easy to move on to the next project before sharing findings with your community member stakeholders. Those who have been engaged will be interested in knowing the outcome of the evaluation, and possibly contributing to future work. Make sure to share evaluation findings with community members, and allow them to provide their input in the interpretation of findings. This exchange may be supremely important in defining next steps for the project, as well as ensuring that these community members will want to invest time on your future projects.

Rad Resources: Check out the Centers for Disease Control and Prevention’s (CDC’s) website for more helpful information on engaging stakeholders in your evaluation.


My name is Lisa R. Holliday, and I am an Evaluation Associate with The Evaluation Group in Columbia, SC. I was recently cleaning survey data and preparing it to be uploaded into a database. The survey had been offered multiple times, and I wanted to track the responses of participants who had completed the survey each time it was offered. To start with, I needed to create a list of non-duplicated names. I would then be able to use this list to determine which participants had taken the survey each time it was offered.

Hot Tip: Power Query can remove duplicates from a list. Power Query is a handy add-in for Microsoft Excel that allowed me to generate my list quickly and easily. Power Query is a business intelligence tool that works with Excel 2010 and 2013. It is available for free download at the Microsoft website: http://www.microsoft.com/en-us/download/details.aspx?id=39379

Cool Tricks:

Step 1: I created a unique field in my survey data that would allow me to identify each person.  Just in case I had two people from the same site with the same name, I concatenated their name with their location and job title. The final cell looked like this:

[Screenshot: the concatenated unique-identifier cell]

I then created a master list of names (with duplicates) using cut and paste. Once this was done, I was ready to load my data into Power Query.

Step 2: I loaded my data using “From Table” under the Power Query tab.

[Screenshot: the “From Table” option on the Power Query tab]

Step 3: Within the Power Query window, I selected the concatenated column, and then “Remove Duplicates.”

[Screenshot: the “Remove Duplicates” option in the Power Query window]

After my query ran, I selected “Close and Load.” Excel created a new table containing only unique values.

[Screenshot: the resulting table of unique values]

Hot Tip: Why not use “Remove Duplicates” from the Data Tab?

Any data you load into Power Query can be refreshed, and the query will automatically be re-executed. This feature is valuable if you plan to add more data to your original data set. In contrast, “Remove Duplicates” under the Data Tab does not have this option.

Hot Tip: Other Functionality. Power Query has a lot of other useful functionality and is worth exploring. It can easily import data from a variety of sources (including websites), un-pivot data, and split columns (such as First and Last name).
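For readers who work outside Excel, the same deduplication workflow can be sketched in Python with pandas. This is a hypothetical example, not the author’s actual data: the column names and values are invented, and the `"|"` separator is an arbitrary choice for the concatenated key.

```python
import pandas as pd

# Invented survey responses; the first two rows are the same person.
responses = pd.DataFrame({
    "name": ["Ana Diaz", "Ana Diaz", "Lee Park"],
    "site": ["Columbia", "Columbia", "Greenville"],
    "job_title": ["Teacher", "Teacher", "Counselor"],
})

# Step 1: concatenate name, site, and job title into a unique
# identifier, mirroring the CONCATENATE step in Excel.
responses["unique_id"] = (
    responses["name"] + "|" + responses["site"] + "|" + responses["job_title"]
)

# Steps 2-3: drop duplicate identifiers to get the master list
# of unique participants (like "Remove Duplicates" in Power Query).
unique_people = responses.drop_duplicates(subset="unique_id")
print(unique_people["unique_id"].tolist())
# → ['Ana Diaz|Columbia|Teacher', 'Lee Park|Greenville|Counselor']
```

As with the Power Query approach, rerunning the script after new rows are appended refreshes the unique list automatically.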


Greetings, fellow evaluation enthusiasts! My name is John Murphy and I am an Evaluation Associate on the Education Research and Measurement team at Cincinnati Children’s Hospital Medical Center. We provide evaluative support to various forms of learning and career development, ranging from clinical orientation to leadership and management training. One of our clients does extensive work surrounding quality improvement education. While we have provided them with evaluations of their courses, we have been fortunate enough to glean important tips and advice from their wealth of knowledge in measurement theory. One of the most basic tools used in improvement science is the annotated run chart, a simple form of the line chart. What makes the annotated run chart different from the typical line chart are annotations, small text snippets that show when an intervention or event has taken place.

[Figure: an example annotated run chart]

More and more, I have embraced annotations as being crucial to providing context and a story to a data representation. As an aficionado of data visualization, I have begun the quest for the perfect annotation. Here is what I have found so far:

Rad Resource: Stephanie Evergreen and Ann Emery provided an amazing resource within the virtual pages of this very blog! Their data visualization checklist not only reaffirmed my enthusiasm for the annotation, it also gave concrete guides for font size and text direction.

Rad Resource: What discussion about data visualization would be complete without mentioning Edward Tufte? The first chapter of his 2006 book Beautiful Evidence, entitled “Mapped Pictures”, discusses, in rich detail, the benefits of various techniques of providing context to images. Placing content in its proper space, scale, and time is crucial for making all genres of data representation tell a compelling story.

Hot Tip: If you are creating many annotated run charts that are updated frequently, consider investing in BI software such as Tableau. While Excel data labels are functional for one-shot data representations, more dynamic software can save time and provide more flexibility so annotations fit the story instead of being limited by the medium.
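As a minimal sketch of the idea (not the author’s tooling), an annotated run chart can be drawn in Python with matplotlib, where `annotate` places a text snippet at the point where an intervention occurred. The data, the intervention label, and the output filename here are all invented for illustration:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Invented weekly measure; a hypothetical intervention happens at week 6.
weeks = list(range(1, 13))
values = [42, 45, 44, 46, 43, 45, 38, 36, 35, 33, 34, 32]

fig, ax = plt.subplots()
ax.plot(weeks, values, marker="o")

# The annotation is what turns a plain line chart into a run chart
# that tells a story: label the intervention where it occurred.
ax.annotate(
    "New intake checklist introduced",
    xy=(6, 45),                     # the data point being annotated
    xytext=(7, 47),                 # where the text snippet sits
    arrowprops={"arrowstyle": "->"},
)
# Center line (approximate median), a run chart convention.
ax.axhline(sorted(values)[len(values) // 2], linestyle="--", linewidth=1)
ax.set_xlabel("Week")
ax.set_ylabel("Measure")
fig.savefig("run_chart.png")
```

Because the chart is scripted, refreshing it with new data or moving an annotation is a one-line change, which is the same flexibility argument made above for BI software.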

Good luck in telling your data stories!


Hi. We’re Sherry Campanelli, Program Compliance Manager and Laura Newhall, Clinical Training Coordinator, from the University of Massachusetts Medical School’s Disability Evaluation Services. As an organization, we are committed to ongoing quality improvement efforts to enhance our services for people with disabilities. Earlier, we wrote about common techniques that help a quality improvement (QI) team to be successful. Today we share some potholes and pitfalls we’ve encountered in group facilitation and our tips for negotiating them successfully:

Lessons Learned:

  • New problems or issues frequently arise in the middle of a QI project. Team members, management, or external events (such as changes in the industry) can generate issues unrelated to the original charge. This can be discouraging for the team members and leader and can delay completion of the project. The following may be helpful.
    • Reaffirm the team’s goals and mission, and review data as a group to ascertain whether the new issue should be addressed in this venue or in another way.
    • Allow team members to opt out of participating in the new task. Seek new members for the team as needed to address the new issue(s).
    • Keep a “hot” list of issues that arise to be addressed by future QI teams.
  • Recommendations from the team are not fully accepted. A less than enthusiastic response from decision-makers to a team’s recommendations is a challenge for any team.
    • Set expectations with the group up front that recommendations might be accepted, rejected or amended.
    • Sustain the group’s enthusiasm during the revision process by reminding them of the importance of their work and input regardless of the outcome.
    • Emphasize the positive feedback before sharing constructive feedback. Thank team members for their efforts.
    • Ensure that relevant decision-makers are regularly briefed so the team can make “mid-course corrections” toward options likely to be approved.
  • Difficulty achieving full team consensus. This can be due to dominating or defensive team member(s), incomplete information or team members needing more time for analysis.
    • Encourage subgroup and individual work on the issue between meetings.
    • Allow the team to live with ambiguity for a while to enable consensus to develop.
    • Document what’s already been decided and refer team members back to prior discussions.

Thoughts to Ponder:

“The best-laid plans of mice and men / Often go awry” – from a poem by Robert Burns. The QI team process does not always go smoothly; however, these unexpected challenges present opportunities for better overall outcomes.

As a British government motivational poster from 1939 advised, the facilitator must “keep calm and carry on” through the potholes and pitfalls of the QI team process.


Hi there! I’m Karen Anderson and I’m the programming Co-chair for the Atlanta-area Evaluation Association. After wrapping up my experience as the AEA Diversity Coordinator Intern last year I wanted to stay active and connected to the AEA community. During my short time working on the programming team I’ve learned a few tricks of the trade for successful events.

Hot Tip: Have an awesome communications team! Our communications team does so much, including managing our AaEA website, AaEA Facebook, AaEA LinkedIn and creating a newsletter, which really helps to get the word out.

Hot Tip: Just like blog post and conference titles tend to pull you in, the titles of your events are just as important. Last year we had an event that included a “speed dating” format to help members and visitors get to know each other better. We could have easily named it “Atlanta-area Evaluation Association Social,” but we used “Speed Networking for Evaluators” instead. Make the event overviews short and interesting as well to capture people’s attention.

Hot Tip: Build in time for socializing and networking. We set aside 30 minutes before events for mixing and mingling. This is a great way to set the tone for events.

Hot Tip: Provide refreshments! We all know that if you provide food, they will come. A fruit, cheese, and meat tray, hummus, crackers, juice, beer, and wine make up a nice spread that will keep the crowd happy.

Hot Tip: Encourage council members to get to know visitors and members. By engaging visitors and members you can encourage visitors to become members, and inform members on volunteer opportunities.

Meeting new people and learning about their interests and areas of expertise helps with planning events in more ways than one. This is a great way to learn the pulse of the group, and if an individual is bold enough, you can ask them to facilitate or co-facilitate an upcoming event.

Hot Tip: Keep up with current trends in evaluation locally, nationally, and internationally. This may seem a bit daunting, but subscribing to AEA’s EVALTALK, AEA’s LinkedIn group, and the aea365 blog are three ways to get regular updates on what’s going on in the world of evaluation. This will help when planning events and keeping topics fresh!

Have any tips on making affiliate events successful?  Please share in the comments.

The American Evaluation Association is celebrating Atlanta-area Evaluation Association (AaEA) Affiliate Week with our colleagues in the AaEA Affiliate. The contributions all this week to aea365 come from AaEA Affiliate members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi, my name is Ayana Perkins, the programming Co-Chair of the Atlanta-area Evaluation Association as well as Senior Research Analyst and Evaluator at Infinite Services and Solutions.

Lesson Learned: When evaluating organizations for empowerment, one of the characteristics to explore is whether the agency is set up as an opportunity role structure. Organizations structured in this manner are more likely to retain their members due to the high level of engagement that is inherently required. Opportunity role structures, a term popular in community psychology and conceptualized by Kenneth Maton and Deborah Salem, are egalitarian systems that encourage members to shape the direction of the organization, offer purposeful activities, create easy access to intellectual and material resources, and deliver guidance and information that allow members to serve competently as leaders in the organization. In developing our local affiliate of AEA, we shifted to become more systems oriented, which resulted in the development of an opportunity role structure. These system changes included branching off and routine training.

Our first act was to more actively encourage branching off. Although this might seem self-sabotaging, since branching off could naturally result in splintering, the opposite occurred: members now had a new way to pursue topics of interest, and the affiliate gained access to members who were more personally fulfilled, a predictor of future participation. We gambled and won on the logic that people appreciate most that which directly reflects them. The first group was an evaluation consultant group, later followed by a pro bono group, an economic evaluation group, and finally a facilitation group. Each group retains a commitment to the AEA affiliate but also uniquely provides a space where members can problem solve within specific professional topics.

Our second act was routine training. Beyond the standard professional development and social networking activities that we offer monthly, we also wanted to make sure that members felt more comfortable about advancing their work in evaluation. Almost 8 years ago, we created a new evaluator’s series to help guide our less experienced evaluators in securing employment. Earlier this year, we trained graduate students on conference style presentations using Ignite format in preparation for their delivery in a showcase at Morehouse School of Medicine, even scheduling additional dry run sessions to reduce speaker anxiety. More recently, we have developed a voluntary 30 minute orientation for new members to directly match them with their interests and invite them into leadership. Each of these training activities is ongoing.

Rad Resource: For more information on opportunity role structure, see Kenneth Maton and Deborah Salem’s 1995 article in the American Journal of Community Psychology on empowering organizations.


Hi, my name is Ayana Perkins, the programming Co-Chair of the Atlanta-area Evaluation Association as well as Senior Research Analyst and Evaluator at Infinite Services and Solutions. I am a qualitative enthusiast and often train other evaluators and researchers in these methods. What I have noticed is that participants are more likely to feel valued and engaged when sharing data through qualitative methods. In fact, interviews and focus groups are wonderful ways to encourage satisfaction, and when done right, the evaluator can walk away with credible findings while the participant leaves renewed and excited about having taken part in the data collection event.

Hot Tips:

  • Practice, practice, practice. With each new project, every member of the evaluation team should have a firm understanding of what to expect in the data collection event. What contingency plans exist when the recorder doesn’t work or too many people show up? Investing time in learning how to respond to these worst case scenarios produces an investigator or an evaluation team that is well poised to resolve all unexpected issues.
  • Plan for informal conversation. Before any interview or focus group, allow 5 to 15 minutes for informal conversation. This time should be built into the length of the focus group or interview. During it, the evaluator is able to shed the role of expert and imply that no greater effort than conversation is required. This strategy also increases an individual’s willingness to participate more fully in any icebreaker activity.
  • Create opportunities for success. Previous experience and personality differences can partially influence how a person will likely respond to the open-ended format. Even so, there are strategies to help participants feel that their contribution was a successful effort:
    • Emphasize that there is no right answer, which helps to reduce social desirability in responses
    • Acknowledge that no response is a response, and ask whether the question was meaningful to them or whether more time is needed before responding
    • Connect similarities in responses to enhance group dynamics

Focus groups and interviews do require a lot of preparation, but this effort pays off with rich findings and satisfied participants.

Rad Resource: Want to learn more about qualitative methods? Visit this website to identify ways to strengthen your project: http://www.qualres.org/. Although the site is not specific to evaluation, most of the recommendations would also apply for qualitative evaluation projects.

Have any of these strategies worked for you? Please share your experiences in the comments.


Hello, my name is Lindsey Stillman and I work at Cloudburst Consulting Group, a small business that provides technical assistance and support for a number of different Federal Agencies. My background is in Clinical-Community Psychology and so providing technical assistance around evaluation and planning is my ideal job! Currently I am working with several communities across the country on planning and implementing comprehensive homeless service systems. Much of our work with communities focuses on system change by helping various service providers come together to create a coordinated and effective system of care, rather than each individual provider working alone.

Lesson Learned:

  • The new HEARTH legislation includes a focus on system level performance versus program level performance. This has required communities to visualize how each program performance feeds into the overall performance of the system in order to identify how to “move the needle” at a system level. Helping communities navigate between the system level goals and the program specific goals – and the connections between them – is critical.
  • Integrating performance measurement into planning can help communities see the value of measuring their progress. All too often grantees or communities are given performance measures that they need to report on without understanding the links between their goals and activities and the performance measures. Presenting performance measurement as more of a feedback loop can help remove the negative stigma around the use of evaluation results and focus stakeholders on continuous quality improvement.
  • Working with agencies or communities to create a visual representation of the links between processes, program performance, and system performance can really help to pull all of the pieces together – and also shine a light on serious gaps. Unfortunately, many federal grantees have had negative experiences with logic models, so finding creative ways to visually represent all of the key processes, outcomes, and outputs can help to break the negative stereotypes. In several communities we have developed visual system maps that assist the various stakeholders in coming together to focus on the bigger picture and see how all of the pieces fit together. Oftentimes we have them “walk” through the system as if they were a homeless individual or family to test out the model and to identify any potential barriers or challenges. This “map” not only helps the community plan system change but also identifies places within the system and its processes where measuring performance can help them stay “on track” toward their ultimate goals.

Rad Resources:


Hi, I’m Maureen Wilce, a founding member of the Atlanta-area Evaluation Association, and I’m Sarah Gill, president-elect of AaEA. We’re both “true believers” in the power of evaluation to guide organizational learning. We’ve seen how good evaluation questions can help uncover important information to improve programs. We’ve also seen the opposite: how bad evaluation questions can waste time and resources – and increase distrust of evaluation in general.

Asking the right evaluation questions is critical to promoting organizational learning. Answers to good evaluation questions direct meaningful growth and build evaluation capacity. But what makes an evaluation question “good”? To get our answer, we reviewed the literature and then collected the practice wisdom of AaEA members and members of AEA’s Organizational Learning & Evaluation Capacity Building TIG. As we organized our thoughts, a checklist began to form. After more great discussions with our colleagues in AaEA and the TIG, we decided to structure the checklist around the standards. A few more refinements came as we used the resource in our work in CDC’s National Asthma Control Program, and finally, Good Evaluation Questions: A Checklist to Help Focus Your Evaluation was born!

Rad Resource: The Good Evaluation Questions Checklist, at http://www.cdc.gov/asthma/program_eval/AssessingEvaluationQuestionChecklist.pdf, is a tool to help ensure that the evaluation questions we create will be useful, relevant, and feasible. In keeping with the new accountability standard, it also provides a format for documenting our decisions when selecting evaluation questions.

Lesson Learned: Articulating what makes an evaluation question “good” requires thinking through several dimensions and assessing it against multiple criteria. A checklist can help us review evaluation questions to anticipate potential weaknesses and can also support communication with stakeholders during the question development process.

Rad Resource: While at the National Asthma Control Program website, check out our other evaluation resources, including our guides and webinars.

Get Involved: We received some great feedback from folks who attended our demonstration at AEA; thanks to all who joined us! If you have additional suggestions about how to improve the checklist, please leave them in the comments below.

