AEA365 | A Tip-a-Day by and for Evaluators

My name is Tricia Wind and I work in the evaluation section of the International Development Research Centre in Canada – an organization that funds research in Africa, Asia and Latin America.  My section regularly does quality assessments of the evaluations that are commissioned by our program staff colleagues as well as by the organizations we fund.

Lessons Learned:

We have seen how quality depends not just on the consultants who undertake evaluations, but also on the program managers who commission them. Commissioners decide the scope of an evaluation and its timing. They define questions and facilitate use. They approve evaluation budgets. Commissioning decisions are key to evaluation quality.

Seeing that most evaluation resources are targeted to evaluators, IDRC teamed up with BetterEvaluation to produce a new interactive, online guide to support program managers.  It guides program managers in their roles and decision-making before, during and after an evaluation to ensure the evaluation is well designed, use-oriented and appropriately positioned within an organization.

Rad Resource:

The Program Manager’s Guide walks program managers through nine typical steps of commissioning and managing an evaluation. It provides high-level overviews of the steps, more detailed sub-steps, and links to further resources on the rest of the rich BetterEvaluation website. It is available in English and French.

The GeneraToR: The guide is accompanied by a tool, called the GeneraToR, which prompts users to document the decisions they are making about an evaluation (its scope, uses, questions, timing, budget, evaluator qualifications, deliverables, governance, etc.) in an online form. The form becomes a customized terms of reference that can be downloaded to share with stakeholders. The terms of reference are foundational for other evaluation documents, such as requests for proposals (RFPs), consulting contracts, workplans and stakeholder engagement plans.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Good day, I’m Bernadette Wright, program evaluator with Meaningful Evidence, LLC. Conducting interviews as part of a program evaluation is a great way to better understand the specific situation from stakeholders’ perspectives. Online, interactive maps are a useful technique for presenting findings from that qualitative data to inform action for organization leaders who are working to improve and sustain their programs.

Rad Resource: KUMU is free to use to create public maps. A paid plan is required to create private projects (visible only to you and your team).

Here are the basic steps for using KUMU to integrate and visualize findings from stakeholder conversations.

1) Identify concepts and causal relationships from interviews.

Using the transcripts, you focus on the causal relationships. In the example below, we see “housing services helps people to move from homelessness to housing” (underlined).

2) Diagram concepts and causal relationships, to form a map.

Next, diagram the causal relationships you identified in step one. Each specific thing that is important becomes a “bubble” on the map. We might also call them “concepts,” “elements,” “nodes,” or “variables.”

Lessons Learned:

  • Make each concept (bubble) a noun.
  • Keep names of bubbles short.

 

3) Add details in the descriptions for each bubble and arrow.

When you open your map in KUMU, you can click any bubble or arrow to see the item’s “description” on the left (see picture below). Edit the description to add details such as example quotes.
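
Hedged sketch: if you have many concepts to enter, KUMU also supports bulk import of elements and connections (its docs describe spreadsheet and JSON formats). The short Python sketch below assembles coded interview findings into a JSON file of that general shape. The field names follow my reading of KUMU’s import format, and the labels, counts, and quote are hypothetical stand-ins drawn from the homelessness example above, so check the current KUMU docs before relying on it.

    import json

    # Concepts (bubbles) coded from the interview transcripts. "mentions" is a
    # custom field counting how many interviews raised each concept, which is
    # useful for the size decorations described in step 4 below.
    elements = [
        {"label": "Housing services", "mentions": 8,
         "description": 'Example quote: "housing services helps people '
                        'to move from homelessness to housing"'},
        {"label": "People transitioned out of homelessness", "type": "goal",
         "mentions": 10, "description": "Project goal named by stakeholders."},
    ]

    # Causal relationships (arrows) between concepts, referenced by label.
    connections = [
        {"from": "Housing services",
         "to": "People transitioned out of homelessness"},
    ]

    # Write a JSON file that can be imported into a KUMU project.
    with open("stakeholder_map.json", "w") as f:
        json.dump({"elements": elements, "connections": connections}, f, indent=2)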

4) Apply “Decorations” to highlight key information.

You can add “decorations” to bubbles (elements) and arrows (connections) using the editor to the right of your map. For the example map below, bigger bubbles show concepts that people mentioned in more interviews.

Also, green bubbles show project goals, such as the goal “People transitioned out of homelessness.”

Cool Tricks:

  • Create “Views” to focus on what’s most relevant to each stakeholder group. To make a large map manageable, create and save different “views” to focus on sections of the map, such as views by population served, views by organization, or views by sub-topic.
  • Create “Presentations” to walk users through your map. Use KUMU’s presentation feature to create a presentation to share key insights from your map with broad audiences.

Rad Resources:

  • KUMU docs. While KUMU takes time and practice to master, KUMU’s online doc pages contain a wealth of information to get you started.
  • Example maps. Scroll down the KUMU Community Page for links to the 20 most visited projects to get inspiration for formatting your map.
  • KUMU videos. Gene Bellinger has created a series of videos about using KUMU, available on YouTube here.

Organizations we work with have found these map presentations helpful for understanding the situation and planning collaborative action. We hope they are useful for your evaluation projects!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Engaging with EvalYouth by Khalil Bitar

I am Khalil Bitar (EvalYouth Vice-Chair). Along with Bianca Montrosse-Moorhead and Marie Gervais (EvalYouth Co-Chairs), I am very glad to have connected with all of you throughout our recent sponsored week. During the week, we presented EvalYouth, its achievements so far, and our plans for the near future. EvalYouth has accomplished prominent work thus far, but there is still more to do. We hope to build on our successes to achieve a lot more in 2017 and 2018. Today, I’d like to tell you more about how to engage with EvalYouth.

Hot Tips:

During our sponsored week, you learned about the work of Task Force 1, Task Force 2, and Task Force 3.  We plan to start a fourth task force in 2017 focusing on youth inclusion in evaluation.  To do all this, we need the engagement of more members who are passionate about the future of Evaluation.

There are multiple ways to engage with EvalYouth:

Rad Resources:

Take a moment to read EvalYouth’s Concept Note, which details the network’s goals and objectives, governance structures, and a lot more.

Bianca, Marie, and I hope that we were successful in shedding light on EvalYouth and its work during EvalYouth week on aea365.  We very much look forward to hearing from you!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello AEA365!  I’m Paul Collier. Over the last two years I worked as the Data and Evaluation Manager at the San Francisco Child Abuse Prevention Center (SFCAPC), a mid-size nonprofit focused on ending child abuse in San Francisco. In my time there and as a freelancer, I can’t count the number of times I’ve fielded questions from staff about data their organization has collected. They often go something like this…

[Image: examples of the kinds of questions staff ask about their data]

How frustrating! But as someone serving as an internal evaluator or data analyst at an organization, I have to remind myself that questions like these are my friends. When my staff asked me questions about their data, I knew they were engaged and interested in using it. But I often found the first questions they asked weren’t the questions that would really help them make decisions or improve their programs. This post is about helping your staff think critically and ask smarter questions about their data.

Hot Tip: Focus on highly strategic questions

Questions that can be answered with existing data come in all shapes and sizes. I like to consider first whether the results may help the organization improve or refine our programs. For example, questions testing the cause-and-effect relationships in our logic model or assumptions in our theory of change can and should inform programming. A second aspect of a strategic question is whether our team has expectations for the result. I often realized that our staff didn’t have expectations around average improvement or effect size, so I would find a few studies using comparable assessments and interventions to identify some benchmarks. Perhaps the most useful aspect of a strategic question is whether our staff can take action based on the results. I found that if my staff couldn’t envision how the results might actually be used, it was wiser to help them think through this before spending my time (and theirs) analyzing the data.
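
To make that benchmarking step concrete, here is a minimal Python sketch, with entirely hypothetical scores and a made-up benchmark value, of the kind of quick calculation I mean: average improvement plus a standardized effect size (a common paired-samples variant of Cohen’s d, computed from the gain scores) that can be compared against results from comparable studies.

    import statistics

    # Hypothetical pre/post assessment scores for one program cohort.
    pre = [12, 15, 11, 14, 13, 16, 10, 15]
    post = [15, 18, 13, 17, 15, 19, 14, 16]

    # Average improvement: the raw pre-to-post gain per participant.
    gains = [after - before for before, after in zip(pre, post)]
    avg_improvement = statistics.mean(gains)

    # One common paired-samples effect size: mean gain / SD of the gains.
    effect_size = avg_improvement / statistics.stdev(gains)

    # Benchmark pulled from comparable studies (hypothetical value).
    BENCHMARK_D = 0.5
    print(f"Average improvement: {avg_improvement:.2f} points")
    print(f"Effect size: {effect_size:.2f} (benchmark: {BENCHMARK_D})")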

Cool Trick: Plan for Analysis.

To be more strategic about the analysis questions I focused on, I built in time between the request for analysis and doing the work. An initial conversation with the program manager or staff to learn more about the context of a question usually helped me refine it to be more specific and actionable. I found that batching analysis for a certain time of year was also a useful planning approach that protected my time. I preferred to have this ‘analysis period’ in the winter, because my organization set its budget in the spring. This way, any changes to programming that resulted from the process could be planned for in the following year’s budget.

Rad Resources:

As you can tell, I think helping staff ask smarter questions is one of the most valuable things I do as an internal evaluator. For more reading on this topic, check out:

  • Michael Hyatt’s Blog on Asking More Powerful Questions: Michael Hyatt is a business author who provides clear, easy-to-understand advice to aspiring leaders on asking questions.
  • Peter Block’s book, Flawless Consulting: Block’s Flawless Consulting provides many helpful suggestions for structuring analysis processes so they influence action. There are also several great chapters about overcoming resistance in clients, which I’ve found highly relevant for dealing with the inevitable resistance to results within my team.
  • Roger Peng, Ph.D.’s e-book, The Art of Data Science: Peng illustrates what a practical data analysis approach looks like, framing each step as a process of setting expectations and understanding why results did or did not meet those expectations.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi! We are Morgan J Curtis (independent consultant) and Strong Oak Lefebvre (executive director of the Visioning BEAR Circle Intertribal Coalition). Along with Patrick Lemmon (independent consultant), we have the good fortune of serving as the evaluation team for the Walking in Balance (WIB) initiative.

WIB is an innovative approach to violence prevention that focuses on 12 important indigenous values that encourage better harmony with other people and the land. The primary component of WIB is a 13-session curriculum that is built on a Circle Process and that, with some adaptations, can be focused on different populations. The Circle Process involves storytelling and sharing by all participants, including the Circle Keeper who serves to move the conversation forward. A teaching team of four, seated in the four directions, diminishes the role of a single expert and promotes Circle members talking with each other rather than to the Circle Keeper.

Lessons Learned: This program presents many exciting evaluation opportunities and challenges. One of the challenges is ensuring that the evaluation is both culturally responsive and methodologically sound. Adding to this challenge, the members of the evaluation team are all located in different cities, and the evaluation consultants have all been white folks. This process has included much trial and error, both in our collaborative process and in the evaluation methodologies themselves. The team wanted to design an evaluation that aligned with the program’s principles and integrated into the Circle Process as seamlessly as possible. We currently have a pre- and post-question for each session; participants write their answers on notecards and share aloud with the Circle, which flows well with the storytelling focus of the Circles. Additional questions at the beginning and end of the Circle invite participants to share aloud how each session transformed them and ways continued engagement in the Circle impacts their lives. We capture responses from all parties to track how the Circle Process transforms both the teaching team and participants. The VBCIC teaching team loves the seamless nature of the evaluation process and finds that checking in about what happens between sessions captures changes in behavior based on learning directly linked to Circle teachings.

Hot Tip: Listening plays a key role in both the Circle Process itself and in developing the evaluation. We have established a process of following the lead of the Visioning BEAR team both by listening intently to their struggles and hopes and also by offering options for how to tweak the evaluation. They move forward with what feels right to them and report back to us. Then, we keep tweaking. We are working to make the data analysis and interpretation processes more collaborative as we move forward, too.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi! I’m Sarah Dunifon, Research and Evaluation Associate at the Wildlife Conservation Society (WCS). In my role, I create many reports and I’m always looking for efficient tools for data visualization. I’ve found a few different programs to display location data, which I’d like to share with the wider AEA community.

Rad Resources:

Google Fusion Tables

Google Fusion Tables is an experimental app add-on that you can link to Google Drive. It allows users to create online, interactive heat maps and feature maps. Privacy settings are managed the same way as in other Google products: users can choose anywhere from fully public to visible only to themselves. Viewers can manipulate the maps in various ways, such as filtering results, scrolling around the map view, or switching between map types.

Feature map of countries with WCS offices – Fusion Tables

Infogr.am

Infogr.am offers quick and easy interactive heat maps which can be shared via weblink. The free version includes a United States map and a world map, and your data will be public, whereas the paid versions have data privacy, more map choices, and the option to download the maps as images.

Heat map of countries with WCS offices – Infogr.am

Excel Apps – Geographic Heat Map

With the “Geographic Heat Map” app on Microsoft Excel, you can create either a world or United States heat map. The data is private and you can save your final map as a picture, making it a good option for inserting into a report. This app doesn’t have much customizability in color and style, but I’ve been able to paste the image into another program (say Microsoft Word or PowerPoint) and edit the image there.

Heat map of countries with WCS offices – Geographic Heat Map with Excel Apps

Tableau

Tableau offers feature maps and heat maps for free, though the data will be public. This program is highly customizable and makes some beautiful visualizations. However, you might find there is a bit of a learning curve to using this software. The visualizations can be saved as an interactive display in “presentation mode” or uploaded to the Tableau Public gallery where they can be shared digitally.

Heat map of countries with WCS offices – Tableau

Excel Power Map

Power Map in Microsoft Excel lets you create private feature maps in a variety of themes on a 3D globe. The map can be an online interactive display or an image produced by taking a screen grab through the program. The screen grab puts a picture of the map onto the clipboard, which you can then paste into another program.

Feature map of countries with WCS offices – Excel Power Map

Hot Tip: Consider how you intend to use the map before you start building it. If it needs to be interactive, choose an online format. If it needs to be put into a report, pick a program with capabilities to export a high-resolution image, rather than just a screenshot.
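
And if you are comfortable with a little scripting, code libraries offer one more route to the same kind of map. As a rough sketch only (plotly is my own addition here, not one of the tools above, and the office counts are invented), a world heat map can be built in a few lines of Python and exported both ways:

    import plotly.express as px

    # Hypothetical counts of offices per country.
    data = {
        "country": ["United States", "Kenya", "Indonesia", "Brazil", "India"],
        "offices": [5, 3, 2, 4, 1],
    }

    # Build an interactive world heat map (choropleth) shaded by count.
    fig = px.choropleth(
        data,
        locations="country",
        locationmode="country names",  # match rows by full country name
        color="offices",
        title="Offices by country (hypothetical data)",
    )

    fig.write_html("office_map.html")   # interactive version for sharing online
    fig.write_image("office_map.png")   # static image for reports (needs kaleido)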

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Say cheese! It’s me, Sheila B Robinson, AEA365’s Lead Curator and sometimes Saturday contributor. We have featured a number of blog posts over the years on finding and using images to complement your evaluation work, whether you are in the business of blogging, presenting, teaching, creating reports, or other areas. It’s more important than ever not only to become familiar with where to find images, but also with how you can and cannot use them legally.

Lessons Learned:

  1. Creative Commons is not the same as “copyright free.” According to Creativecommons.org, “The Creative Commons copyright licenses and tools forge a balance inside the traditional ‘all rights reserved’ setting that copyright law creates. Our tools give everyone from individual creators to large companies and institutions a simple, standardized way to grant copyright permissions to their creative work. The combination of our tools and our users is a vast and growing digital commons, a pool of content that can be copied, distributed, edited, remixed, and built upon, all within the boundaries of copyright law.”
  2. There are many Creative Commons licenses and it’s important to understand their differences. There are 6 main types ranging from more restrictive to less restrictive. Each license comes with language that helps the user understand whether attribution is required, and whether the product can be changed in any way or used for commercial purposes. Read this page to learn about each license.
  3. There is one type of license with NO restrictions! CC0 1.0 means that the designer has dedicated the work to the public domain by waiving all of his or her rights to the work worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law. You can copy, modify, distribute and perform the work, even for commercial purposes, all without asking permission.

Hot Tips: Where do you find good images? There are countless websites that offer free and paid images, icons, and other graphics. Here are two favorites:

  1. 28 Places to Download Free Images for Websites and Blogs includes an updated list of photo sites, with links, along with definitions of public domain and CC0.
  2. Nolan Haims Creative offers blog subscribers access to a bunch of great free resources, including a wonderful “taxonomic” reference list of photo sites.

Cool Trick: Once you have a collection of images, what do you do with them? Check out Echo Rivera’s email course on creating your own visual database. Echo found that searching for images while she was creating presentations wasn’t good for her workflow, so she advises creating a visual database that organizes visuals in ways that make them easily accessible when you need them (minding the different types of licenses). Read this blog post for her explanation and rationale for this technique.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi, I am Paula Egelson and I am the director of research at the Southern Regional Education Board in Atlanta and a CREATE board member. Much of my current research and evaluation work centers on secondary career technical education (CTE) program effectiveness for teachers and students. The fidelity of implementation, or the degree to which an intervention is delivered as intended, for these programs is always a big issue.

Hot Tip: Pay Attention to Fidelity of Implementation as Programs Roll Out

What we have discovered over time is that factors that support fidelity of implementation crop up late in the program development process more often than we ever expected. For example, CTE programs are usually very equipment-heavy. During the field-testing stage, we discovered that due to a variety of vendor, district, and state ordering issues, participating schools were not able to get equipment into their CTE classrooms until much later in the school year. This impacted teachers’ ability to implement the program properly. In addition, the CTE curriculum is very rich and comprehensive, which we realized requires students to do extensive homework and, ideally, to have a 90-minute class block. Finally, we discovered that many teachers who implemented early on were cherry picking projects to teach rather than covering the entire curriculum.

Once these factors were recognized and addressed, we could incorporate them into initial teacher professional development and the school MOU. Thus, program outcomes continue to be more positive each year. This speaks to the power of acknowledging, emphasizing and incorporating fidelity of implementation into program evaluations.
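
As a minimal illustration of one such check (the project names, delivery data, and threshold below are all hypothetical), a few lines of Python can compare the projects a teacher actually taught against the full curriculum, the kind of coverage gap that signals cherry picking:

    # Compare the projects a teacher actually delivered (from logs or
    # observations) with the full curriculum; low coverage can signal
    # cherry picking. All names and numbers here are hypothetical.
    curriculum = {"Project 1", "Project 2", "Project 3", "Project 4", "Project 5"}
    delivered = {"Project 1", "Project 3"}

    coverage = len(delivered & curriculum) / len(curriculum)
    missing = sorted(curriculum - delivered)

    print(f"Curriculum coverage: {coverage:.0%}")
    if coverage < 0.8:  # the threshold is an arbitrary illustration
        print(f"Possible cherry picking; units not delivered: {missing}")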

Rad Resource:  Century, Rudnick, & Freeman’s (2010) American Journal of Evaluation article on fidelity of implementation provides a comprehensive framework for understanding its different components.

The American Evaluation Association is celebrating Consortium for Research on Educational Assessment and Teaching (CREATE) week. The contributions all this week to aea365 come from members of CREATE. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello. I am Sean Owen, Associate Research Professor and Assessment Manager at the Research and Curriculum Unit (RCU) at Mississippi State University. Founded in 1965, the RCU contributes to Mississippi State University’s mission as a land-grant institution to better the lives of Mississippians with a focus on improving education. The RCU benefits K-12 and higher education by developing curricula and assessments, providing training and learning opportunities for educators, researching and evaluating programs, supporting and promoting career and technical education (CTE), and leading education innovations. I love my role at the RCU assisting our stakeholders to make well-informed decisions using research-based practices to improve student outcomes and opportunities.

Lessons Learned:

  • Districts understaff research and evaluation specialists. Although districts are expected to have personnel with strong backgrounds in program evaluation, we have found that is typically not the case in smaller, rural school districts. In a climate of tightening budgets, this is becoming the norm rather than the exception. Districts do assign staff to program evaluation, but those staff typically carry numerous other roles as well.
  • “Demystify” the art of program evaluation. We have found that translating program evaluation to CTE may be confounding to some partners. Training key stakeholders about the evaluation process not only assists with the success of the current evaluation but also builds intellectual capital for future studies performed by the district. Guide districts to create a transparent, effective evaluation of their CTE program that encompasses students, facilities, advisory committees, teachers, and administrative processes.
  • Foster strong relationships. Identifying which RCU staff interact best with the school districts wanting assistance in program evaluation is key. Interpersonal communication is crucial to ensure that all the necessary information is gathered and the steps in the evaluation process are followed. We have found that even a more skilled evaluator will not help a district achieve its goals if he or she lacks a strong relationship with the partner.

Rad Resources:

The American Evaluation Association is celebrating Consortium for Research on Educational Assessment and Teaching (CREATE) week. The contributions all this week to aea365 come from members of CREATE. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


This is John Fischetti, Dean of Education/Head of School, at the University of Newcastle in Australia. We are one of Australia’s largest providers of new teachers and postgraduate degrees for current educators. We are committed to equity and social justice as pillars of practice, particularly in evaluation and assessment.

Hot Tips: We are in a climate of alternative evaluation facts and high-stakes assessment schemes based on psychometric models not designed for their current use.

We need learning centers, not testing centers.

In too many schools, for months prior to testing dates, teachers, under strong pressure from leaders, guide their students in monotonous and ineffective repetition of key content, numbing those who have mastered the material and disenfranchising those who still need to be taught. Continuous test preparation minimizes teaching time and becomes a self-fulfilling destiny for children who are poor or who learn differently. And many of our most talented students are bored with school and not maximizing their potential. As John Dewey once noted:

“Were all instructors to realize that the quality of mental process, not the production of correct answers, is the measure of educative growth, something hardly less than a revolution in teaching would be worked.” (Dewey, 2012, p. 169)

The great work of Tom Guskey can guide us in this area. As assessment specialists, we should be pushing back on the alternative facts that permeate the data world, where tools such as value-added measures are used inappropriately and conclusions about teacher quality are drawn without merit.

Failed testing regimens.

The failed testing regimens that swept the UK and US show mostly negative results, particularly for those who learn differently, are gifted, have special needs, face economic hardship, or come from minority groups.

What we know from research on the UK and US models, after 20 years of failed policy, is that children who are poor and who attend schools with other children who are poor are less likely to do as well on state or national tests as children who are wealthy and who go to school with other wealthy kids.

It is time for evaluation experts to stop capitulating to state and federal policy makers, to call out failed assessment schemes, and to work for research-informed, equity-based models that succeed in providing formative data that guides instruction, improves differentiation, and gives school leaders evidence for directing resources to support learning. We need to stop using evaluation models that inspect and punish teachers, particularly those in the most challenging situations. We need to triangulate multiple data sources not only to inform instruction but also to aid food distribution, health care, housing, adult education, and the many social policy initiatives that support the social fabric of basic human needs and create hope for children and the future.

Rad Resources:  Thomas Guskey’s work on Assessment for Learning (for example, his 2003 article, How Classroom Assessments Improve Learning). Also see Benjamin Bloom’s classic work on Mastery Learning, which reminds us of the importance and nature of differentiated instruction.

The American Evaluation Association is celebrating Consortium for Research on Educational Assessment and Teaching (CREATE) week. The contributions all this week to aea365 come from members of CREATE. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

