AEA365 | A Tip-a-Day by and for Evaluators

My name is Donna M. Mertens and I am an independent consultant based in Washington, DC; my work is both domestic and international. I had the honor of being the keynote speaker at the Minnesota Evaluation Studies Institute (MESI) in March 2015. The MESI theme was Social Justice amidst Standards and Accountability: The Challenge for Evaluation. The concept of social justice in the context of evaluation implies that evaluators can play a role in addressing those wicked problems that persist in society, such as violence, lack of access to quality education for all, poverty, substance abuse, and environmental pollution.

Lesson Learned: Wicked Problems and Social Justice. Evaluators are concerned with, and involved in, contributing to the solution of wicked problems. They also recognize the importance of bringing a social justice lens to this work. Michael Harnar conducted a survey of 1,187 evaluators and reported that 69% (n=819) either strongly or somewhat agreed with this statement: Evaluation should focus on bringing about social justice.

Rad Resource: Mertens, D. M. (2015). Mixed methods and wicked problems [Editorial]. Journal of Mixed Methods Research, 9(1), 3–6. Abstract: http://mmr.sagepub.com/content/9/1/3.extract

Harnar, M. (2014). Developing criteria to identify transformative participatory evaluators. Journal of MultiDisciplinary Evaluation. http://journals.sfu.ca/jmde/index.php/jmde_1/article/view/383

Lesson Learned: A Social Justice Lens Leads to Different Evaluation Questions. Evaluators who work with a social justice lens are concerned with program effectiveness and with answering the impact question: Did “it” work? But they are also interested in asking other types of questions:

  • Was “it” the right thing?
  • Was “it” chosen and/or developed and implemented in culturally responsive ways?
  • Were contextual issues addressed, such as culture, race/ethnicity, gender, disability, deafness, religion, language, immigrant or refugee status, age, and other dimensions of diversity that are used as a basis for discrimination and oppression?
  • How were issues of power addressed?
  • Do we want to continue to spend money on things that don’t work?

Rad Resource: The Native American Center for Excellence published Steps for Conducting Research and Evaluation in Native Communities, which provides a specific context in which a social justice lens is applied in evaluation.

Lessons Learned: Social Justice Criteria for Evaluators. Evaluators who work with a social justice lens consider the following criteria to be indicators of the quality of the evaluation:

  • Emphasizes human rights and social justice
  • Analyzes asymmetric power relations
  • Advocates culturally competent relations between the evaluator and community members
  • Employs culturally appropriate mixed methods tied to social action
  • Applies critical theory, queer theory, disability and deafness rights theories, feminist theory, critical race theory, and/or postcolonial and indigenous theories

Rad Resource: Reyes, J., Kelcey, J., & Diaz Varela, A. (2014). Transformative resilience guide: Gender, violence and education. Washington, DC: World Bank.

The American Evaluation Association is celebrating MESI Spring Training Week. The contributions all this week to aea365 come from evaluators who presented at or attended the Minnesota Evaluation Studies Institute Spring Training. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · ·

My name is Melissa (Chapman) Haynes from the University of Minnesota’s Minnesota Evaluation Studies Institute (MESI). At MESI we have a strong interest in building evaluation capacity within the university and the community through university-community partnerships. We are trying to build this capacity sustainably, in a way that builds upon the practice of professional evaluators and creates scholarship on the teaching and professionalization of program evaluation. One of our signature activities is the spring evaluation training that MESI has hosted for the past 20 years. This week of posts will highlight some of the key learning, resources, and tools presented at our 2015 event!

Lesson Learned: Creating an inclusive community of evaluators is essential, but we are an incredibly diverse field – what brings us together? Throughout the week of MESI, the Program Evaluation Standards (Yarbrough et al., 2010) and the AEA Guiding Principles were used in various contexts: as a frame of reference when deciding which evaluation projects we will engage in, as a guide for navigating and negotiating situations where ethics are in question, and as a way to elevate the profession of evaluation. We can and should continue to use these guiding documents and explore how they can further the professionalization of our field.

Hot Tip: Donna Mertens provided some wonderful examples of the art and power of questioning during her workshop and keynote address. During her workshop she gave some examples of how she uses questioning to negotiate with clients. For example, if a potential client asked you to frame an evaluation in a manner that did not jibe with the Program Evaluation Standards or AEA Guiding Principles, one might tell the client something like “I will not do X, but let’s talk about how we might frame an evaluation that will continue to serve the population of interest.”

Rad Resource: Some of the presenters have opted to share the information they presented on our website: http://www.cehd.umn.edu/OLPD/MESI/spring/2015/default.html

Rad Resource: A fun highlight of MESI is the annual “Top Ten” competition. For those new to MESI, Jean King develops a Top Ten prompt – this year it was “How is program evaluation like interstellar space travel?” We had over 50 entries from MESI attendees – the Top Ten is located here. My favorite is #2 – “You’ve got to remember that YOU are the alien here.”


Venturing into social media can be a daunting task since the various platforms are growing so quickly. Developing a checklist can be an easy way to get started in social media and organize your social strategy and routine.

I have outlined a few ways you can start developing your social media checklist.

Hot Tip: Define your audience

Identifying your target audience on social media is important. It’s easy to say that you want to target anyone or everyone who is willing to give you a like or retweet, but is this really aiding your social media goals or purpose and is your content being used effectively?

By identifying who you want to target—whether that group is students, evaluation professionals, non-profit workers, or those focused on data—you can create targeted content that will be more valuable for your followers and result in a higher return on investment for your social strategy. You can start with basic demographic questions: age, occupation, and education. Then you can identify their interests.

Hot Tip: Develop a content strategy

It’s important to develop a content strategy when venturing into social media so you can stay relevant with your audience. This helps you stay on track and keeps you from sharing anything and everything. Once you have identified what your audience is looking for, you can develop posts that match their needs. Important questions to ask yourself when developing content are:

What is important to your audience?

What are their questions or concerns?

What do they want to learn more about?

Hot Tip: Set up your checklist for each channel

Once you are ready to start posting, you can set up your personal checklist and scheduling guide, which will help you reach your activity goals. Below are a few examples:

Facebook

  • Publish 1 post each day
  • Dedicate two days each week to blog content from evaluation sources
  • Monitor and respond to comments once a week
  • Review insights at the end of every month

Twitter

  • Publish twice daily
  • Retweet relevant content to your followers twice a week
  • Follow 15 new and relevant users or organizations each week
  • Follow industry hashtags once a week

These are just a few examples. You can create a checklist that works with your schedule and social goals.
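One way to keep a routine like this manageable is to store the checklist as simple structured data and query it by frequency. The sketch below is purely illustrative; the channel names, tasks, and frequencies are placeholders modeled on the examples above, not recommendations from this post:

```python
# A minimal sketch of a social media checklist kept as structured data.
# Channels, tasks, and frequencies here are illustrative placeholders.

CHECKLIST = {
    "Facebook": [
        ("Publish a post", "daily"),
        ("Share blog content from evaluation sources", "weekly"),
        ("Monitor and respond to comments", "weekly"),
        ("Review insights", "monthly"),
    ],
    "Twitter": [
        ("Publish two tweets", "daily"),
        ("Retweet relevant content", "weekly"),
        ("Follow 15 new users or organizations", "weekly"),
        ("Follow industry hashtags", "weekly"),
    ],
}

def tasks_due(frequency):
    """Return (channel, task) pairs scheduled at the given frequency."""
    return [
        (channel, task)
        for channel, tasks in CHECKLIST.items()
        for task, freq in tasks
        if freq == frequency
    ]

# Print today's tasks across all channels.
for channel, task in tasks_due("daily"):
    print(f"{channel}: {task}")
```

Keeping the schedule as data rather than a memorized routine makes it easy to adjust frequencies as your social goals change.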


· · ·

Hi, my name is Martha Meacham. I am a librarian at the Lamar Soutter Library, University of Massachusetts Medical School. We are always happy to help answer the many questions we receive about copyright. While this can be a complicated issue, it shouldn’t be scary. A little background understanding and due diligence will help guide you while navigating copyright.

Copyright is a set of exclusive legal rights granted to the creators of works that allows them to control the copying, reuse, redistribution, creation of derivatives, and performance of their works. While copyright allows creators to benefit from their works, particularly financially, it also has some important limitations that benefit the public. Just because something is copyrighted doesn’t mean it can’t be used; the proper steps just need to be taken.

Hot Tips: It can be a challenge to determine if copyright needs to be taken into consideration. The Copyright Flow Chart below can help guide you through some questions to ask when considering if copyright is applicable.

[Copyright Flow Chart]

Lessons Learned: You may need to do further investigation in areas like Creative Commons Licenses or fair use. Remember, you can always ask for permission. However, don’t wait until the last minute to start thinking about copyright. Finding answers and seeking permissions can take time. Avoid the temptation to ignore the issue, or use something questionable because time has run out to take the proper steps.
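For illustration only, the order of questions described here and in the resources below (public domain first, then an explicit license, then fair use, then asking permission) might be sketched as a tiny decision function. The function name and inputs are hypothetical simplifications, and this is of course not legal advice:

```python
def copyright_next_step(public_domain, licensed_for_reuse, fair_use_applies):
    """A simplified, illustrative sequence of copyright questions.

    Rough sketch of the decision order described in the post:
    public domain -> explicit license -> fair use -> ask permission.
    The real analysis involves more nuance; this is not legal advice.
    """
    if public_domain:
        return "Use freely; no copyright applies (credit is still good practice)."
    if licensed_for_reuse:
        return "Use under the license terms (e.g., Creative Commons); give credit."
    if fair_use_applies:
        return "Document your fair use analysis, then use with attribution."
    return "Ask the rights holder for permission, and start early."

# Example: an image with a Creative Commons license.
print(copyright_next_step(public_domain=False,
                          licensed_for_reuse=True,
                          fair_use_applies=False))
```

The point of the ordering is the "due diligence" the post describes: each question is cheaper to answer than the one after it, and asking permission, the slowest step, is the fallback rather than the default.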

Rad Resources: There are many great ways to find materials where copyright is not an issue or has been explicitly addressed.

Works produced by the U.S. federal government exist in the public domain (something belonging or being available to the public as a whole, and therefore not subject to copyright). For example, the NIH Photo Galleries, the CDC Public Health Image Library, and a database of U.S. Government Photos all provide materials that exist in the public domain.

Other resources contain images that have Creative Commons licenses. Sites like Flickr allow you to search by usage (specific licenses) or restrict results to Creative Commons galleries. Additionally, almost all images found in Wikimedia Commons have some sort of license that allows for their use. Regardless of the resource, it is wise to double-check the specific license for a specific image, and always give credit to the source.

Finally, the Lamar Soutter Library offers some great resources about copyright here. Also check out the Columbia University Libraries’ Copyright Advisory Office and Fair Use checklist and the Copyright and Fair Use page from Stanford University.

When in doubt ask for help. Copyright can be tricky but there are many guides. With a little practice, your copyright journey can be smooth sailing.


Hi, my name is Catherine Nameth, and I’m the Education Coordinator for an NSF- and EPA-funded research center at the University of California- Los Angeles. As Education Coordinator, my primary job is not evaluation, so I have to act creatively in order to integrate evaluation into my work and balance the need for internal evaluation with my other administrative and research responsibilities.

Hot Tip: Be an active learner and an active listener. Get to know your colleagues and their areas of expertise. Go to meetings, listen, and be open to learning about your colleagues and what they do. Your understanding of them and their work will inform your understanding of your organization as well as its people and programs/research. This understanding can then inform how you design surveys and collect evaluation data. People who know you are more likely to respond to your surveys and other “official” evaluation requests, and when they respond, you get the information you need!

Rad Resource: Map it out! Use Community Solutions’ map for “How Traditional Planning and Evaluation Interact.” This map displays how an evaluation logic model (inputs-activities-outputs-outcomes), situated horizontally, interacts with program planning (goals-objectives-activities-time frame & budget), which is modeled vertically. In using this map, you’ll see that the “activities” of each model intersect, and this cohesive visual aid also serves as a reminder that program planning goals and evaluation outcomes should (and can) inform one another. Use this map to keep yourself focused, which is really important when your primary responsibilities include many aspects other than evaluation, and to help you show your organization’s leadership what you are doing and why you are doing it.

Hot Tip: Have an elevator pitch at the ready. When your work includes evaluation but is not entirely about evaluation, you need to be able to explain quickly and concisely what you are evaluating, why you are evaluating it, what information you need, and how your colleagues can help you by providing this needed information . . . which they will be more willing to do if they know you!



Hello, I am Rupu Gupta, Analyst at New Knowledge Organization Ltd. and Co-Chair of AEA’s Environmental Program Evaluation Topical Interest Group. My evaluation work focuses on learning about the environment and conservation in informal settings. As we celebrate Earth Day, I would like to share some reflections on evaluating these experiences.

Lessons Learned: Informal learning settings are critical places for learning about the environment and about actions to protect it. Informal learning settings offer opportunities for “free-choice” learning, where learners choose and control what they learn. They are typically institutions such as zoos, botanic gardens, aquariums, and museums, distinct from formal educational settings like schools. With hundreds of millions of visits to these institutions annually, they are prime settings for engaging the public in thinking about the environment. Conservation education is often a key aspect of these institutions’ programming, where visitors can learn about different forms of nature (e.g., animals, natural habitats), the threats they face (e.g., climate change), and actions to address them (e.g., reducing energy use). Educational experiences here are often referred to as informal science learning for their connection with understanding natural systems.

Learning about the environment in informal learning settings can happen through a variety of experiences. Informal learning is socially constructed through a complex process that includes oneself, close others (friends, family), and more distant others (institution staff). Specific experiences, like animal encounters, hands-on interactions with flora in botanic gardens, or media-based elements (e.g., touch screens), enable visitors to engage with information about nature and the environment. Docents play an important role in helping visitors ‘interpret’ the messages embedded in the experiences and exhibits. Evaluators assessing the impact of the different experiences in informal settings need to be mindful of the multiple pathways for visitors to engage with environmental information.

Informal learning manifests broadly. Learning experiences in informal settings encompass outcomes beyond the learning traditionally associated with school-based education. In the process of making meaning of these various experiences, learning is tied to multiple aspects of the human experience. Outcomes can be cognitive (e.g., gaining knowledge about climate change impacts), attitudinal (e.g., appreciating native landscapes), emotional (e.g., fostering empathy towards animals), or behavioral (e.g., signing a petition for an environmental cause). A mix of qualitative and quantitative methods is best for capturing these complex learning experiences. By considering the range of learning possibilities, evaluators can design and conduct effective evaluations to understand how people engage with the multi-faceted topic of the environment.

Rad Resources: The following are great to get acquainted with evaluation in informal learning settings:


Greetings! My name is Catherine Cooper, and I am the Faculty Director of the Educational Partnership Center and Professor of Psychology at the University of California, Santa Cruz. I invite you to explore and use the resources from the Bridging Multiple Worlds Alliance (BMWA).

The BMWA is a growing network of researchers, educators, and policy makers – including evaluators – in the U.S. and other nations who work with P-20 (preschool through graduate school) partnerships to support low-income, immigrant, and ethnic minority youth.  These partnerships support youth in building pathways from childhood to college and careers without giving up ties to their families and cultural communities. We work in collaboration with alliance partners, including youth themselves and evaluators of programs and partnerships.

Rad Resource: In the BMWA, we offer three resources that evaluators tell us are especially useful:

  • Aligning models and measures to build a common language among partners.
  • Tools for research, policies, and practice, including formative and summative evaluation.
  • Longitudinal data tools for qualitative and quantitative evaluation and research.

The Bridging Multiple Worlds (BMW) Model (shown below) taps five dimensions for opening pathways:

  • Demographics—students’ age, gender, national origins, race/ethnicities, languages, and parents’ education and occupation
  • Students’ aspirations and identity pathways in college, careers, and cultural domains
  • Students’ math and language academic pathways through school
  • Resources and challenges across students’ cultural worlds of families, peers, schools, community programs, sports, and religious activities, among others
  • Partnerships that reach across nations, ethnicities, social class, and gender to open pathways from preschool through graduate school (P-20)

[Bridging Multiple Worlds Model]

Rad Resource: Bridging Multiple Worlds Tools include:

  • Survey measures of these five dimensions for middle/high school and college students
  • Activities for middle and high school students for building pathways to college and careers, with pre- and post-activity surveys (in English and Spanish)
  • Logic model template for programs and alliances among programs
  • Longitudinal case study templates

Rad Resource: I invite you to join BMWA partners – students, families, schools, community programs, and universities – in using these tools to ask your own questions and build common ground among evaluators, researchers, educators, and policymakers. The tools and other resources are available at www.bridgingworlds.org.

Rad Resource: Bridging Multiple Worlds: Cultures, Identities, and Pathways to College (Cooper, 2011) describes BMW and related models, supporting evidence, tools, and applications in P-20 research, practice, and policy work.

Hot Tip: Healthy partnerships are learning communities where “everyone gets to be smart.” Focus on questions and indicators partners are interested in, and display data in clear and meaningful formats. This increases enthusiasm, engagement, and cooperation. Examples of such questions, indicators, and formats are on our website.


· · ·

Hi! I’m Madhawa “Mads” Palihapitiya, Associate Director at the Massachusetts Office of Public Collaboration at UMass Boston. We recently concluded the first phase of a statewide municipal conflict resolution needs assessment study commissioned by the state Legislature.

Hot Tip: The term “need” can mean many things to many people. For needs assessment purposes, needs are defined as Gaps in Results.

Organizations don’t often think of aligning their institutional needs with the societal bottom line, but working towards this alignment is crucial. We needed to investigate whether our institutional mission, which is to help government and other entities address public conflict, was aligned with the needs of Massachusetts municipalities and their constituents. This alignment was particularly important to us as a statutory state agency and would add measurable societal value. Over time, this alignment can also strengthen the institutional bottom line.

Rad Resource: Roger Kaufman’s Needs Assessment for Organizational Success. See also Bethany Pearsons’ talk at Evaluation 2014 on the Triple Bottom Line.

People don’t usually talk about societal results when they talk about organizational needs. How do we define societal results? We first developed an Ideal Vision that contained a series of societal results and indicators to measure them.

Lesson Learned: We had to resist the impulse to focus on immediate institutional needs like organizational inputs and processes. Imagining an ideal future or vision can tell us where the journey should end.

Rad Resource: Kaufman’s Ideal Vision.

Cool Trick: To help the organization and others being engaged understand the difference between different results, consider developing a visualization like the DoView chart below.

[DoView chart]

Assessing societal results alongside the institutional bottom line requires access to valuable data both within and outside of your organization. A Needs Assessment Committee (NAC) was established as the ‘public face’ of the process and to provide advice and guidance on assessment design, participant selection, and related decisions.

Cool Trick: Set-up a website to communicate the purpose of your needs assessment. Use social media whenever possible.

There are 351 cities and towns in Massachusetts. Multiple organizations were involved. We had limited resources to collect the data we needed. We had to get creative! A series of regional focus groups and telephone interviews were held. To reach the rest, we launched an online survey.

Hot Tip: Online surveys are a great way to involve more people. Keep survey questions closed-ended and completion time to 10-15 minutes. Plan ahead so that you can keep the survey open for as long as possible.

Cool Trick: Get creative with survey dissemination by using contact databases, newsletters, listservs, Facebook, and Twitter. Ask people you know to invite others to take the survey.


· · ·

I’m Kylie Hutchinson, independent evaluation consultant and trainer with Community Solutions Planning & Evaluation. I also tweet regularly at @EvaluationMaven.

Have you ever wondered what evaluation recommendations and SpongeBob SquarePants have in common? Well, in my opinion, a lot.

Think about why we make recommendations. We want stakeholders to take action on our evaluation findings. But we all know this doesn’t happen by magic. And it doesn’t occur as soon as we submit our final report either. In fact it can be months or years before managers and policy-makers are actually in a position to make decisions based on our findings.

In order for utilization to happen, I think recommendations need to be three things:

  • easily absorbed (at the time of first reading)
  • sticky (so they stay in the minds of decision-makers)
  • have ‘legs’ (so they prompt action).

Hmmm…now think…what has good absorption, is sticky, and has legs? Exactly! SpongeBob SquarePants!

Rad Resource: Here’s a tip sheet on Recommendations That Rock!

Hot Tip: Well-written recommendations don’t have to tick every box, but they do deserve significant attention. Don’t leave them to the end or the last minute. Instead, keep a running list of your initial ideas as they occur, even if it’s at the beginning of the evaluation. And always run them by your stakeholders to increase ownership and the chances of implementation. Better yet, develop them collaboratively during a data party.

Rad Resource: You can find a Pinterest page with other resources for writing better recommendations here.


Hi there, Liz Zadnik here, bringing you another Saturday post focused on practitioner experiences and approaches. Today I’m going to focus on my personal journey to stay up-to-date and relevant in all things evaluation.

I was not formally trained as an evaluator – everything I know has been learned through on-the-job, hands-on experience and mentorship (I’m very lucky to have been able to work with a few brilliant evaluators and researchers!). Self-study, reading, and ongoing training have been intentionally incorporated into my personal and professional schedule.

Rad Resource: Coursera is an excellent resource for online learning. You can even get certifications in concentrations after completing a set of courses in sequence. They have a number of courses around data analysis and data science!

Rad Resource: iVersity coordinates and archives some really interesting and innovative massive open online courses (MOOCs). The “Future of Storytelling” course gave me a number of ideas and skills for crafting accessible and engaging trainings and resources, as well as some insights for capturing stories for program evaluation. Recent and future courses focus on idea generation methods and gamification theory.

Lesson Learned: Follow your gut! At first I thought I needed to select courses, books, and resources that were explicitly “evaluation-y,” but I found that the courses that made me say “Oooh! That looks interesting!” were the ones that helped me think creatively and find ways to enhance my evaluation and program development skills.

Rad Resource: MIT Open Courseware is much more structured and academic, as these are courses held at MIT. These require – for me – a bit more organization and scheduling.

Rad Resource: edX is another great chance to engage in online courses and MOOCs. Right now they have two courses on my “to-take” list: Evaluating Social Problems and The Science of Everyday Thinking.

Are there other online course providers or resources you rely on to stay current? How do you stay up-to-date and innovative as you balance other obligations and projects?

