AEA365 | A Tip-a-Day by and for Evaluators


My name is Sharon Wasco, and I am a community psychologist and independent consultant. I describe here a recent shift in my language that underscores, I think, important trends in evaluation:

  • I used to pitch evaluation as a way that organizations could “get ahead of” an increasing demand for evidence-based practice (EBP);
  • Now I sell evaluation as an opportunity for organizations to use practice-based evidence (PBE) to increase impact.

I’d like evaluators to seek a better understanding of EBP and PBE in order to actively span the perceived boundaries of these two approaches.

Most formulations of EBP require researcher-driven activity, such as randomized controlled trials (RCTs), and clinical experts to answer questions like: “Is the right person doing the right thing, at the right time, in the right place in the right way, with the right result?” (credit: Anne Payne)

In an editorial introduction to a volume on PBE, Anne K. Swisher offers this contrast:

“In the concept of practice-based evidence, the real, messy, complicated world is not controlled. Instead, real world practice is documented and measured, just as it occurs, ‘warts’ and all.

“It is the process of measurement and tracking that matters, not controlling how practice is delivered. This allows us to answer a different, but no less important, question than ‘does X cause Y?’ This question is: ‘how does adding X intervention alter the complex personalized system of patient Y before me?’”

Advocates of PBE make a good case that “evidence supporting the utility, value, or worth of an intervention…can emerge from the practices, experiences, and expertise of family members, youth, consumers, professionals and members of the community.”

Further exploration should convince you that EBP and PBE are complementary, and that evaluators can be transformative in melding the two approaches. Within our field, forces driving the uptake of PBE include the growing number of internal evaluators, a shared value for culturally competent evaluation, a range of models for participatory evaluation, and interest in collaborative inquiry as a process to support professional learning.

Lessons Learned: How we see “science-practice gaps,” and what we do in those spaces, provide unique opportunities for evaluators to make a difference. Metaphorically, EBP is a bridge and PBE is a Midway.

Further elaboration of this metaphor and more of what I’ve learned about PBE can be found in my speaker presentation materials from Penn State’s Third Annual Conference on Child Protection and Well-Being (scroll to the end of the page — I “closed” the event).

Rad Resource: I have used Chris Lysy’s cartoons to encourage others to look beyond the RCT for credible evidence and useful evaluation.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · ·

Hello! I am Liz Zadnik, Capacity Building Specialist at the New Jersey Coalition Against Sexual Assault. I’m also a new member of the aea365 curating team and first-time Saturday contributor!  Over the past five years I have been working within the anti-sexual violence movement at both the state and national levels to share my enthusiasm for evaluation and support innovative community-based programs doing tremendous social change work.

Over the past five years I have been honored to work with talented evaluators and social change agents in the sexual violence prevention movement. A large part of my work has been de-mystifying evaluation and data for community-based organizations and professionals with limited academic evaluation experience.

Rad Resources: Some of my resources have come from the field of domestic and sexual violence intervention and prevention, as well as this blog! I prefer resources that offer practical application guidance and are accessible to a variety of learning styles and comfort levels. A partnership between the Resource Sharing Project and National Sexual Violence Resource Center has resulted in a fabulous toolkit looking at assessing community needs and assets. I’m a big fan of the Community Tool Box and their Evaluating the Initiative Toolkit as it offers step-by-step guidance for community-based organizations. Very similar to this is The Ohio Domestic Violence Network’s Primary Prevention of Sexual and Intimate Partner Violence Empowerment Evaluation Toolkit, which incorporates the values of the anti-sexual violence movement into prevention evaluation efforts.

Lesson Learned: Be yourself! Don’t stifle your passion or enthusiasm for evaluation and data. I made the mistake early in my technical assistance and training career of trying to fit into a role or mold I created in my head. Activists of all interests are needed to bring about social change and community wellness. Once I let my passion for evaluation show – in publications, trainings, and technical assistance – I began to see marked changes in the professionals I was working with (and myself!). I have seen myself grow as an evaluator by leaps and bounds since I made this change – so don’t be afraid to let your love of spreadsheets, interview protocols, theories of change, or anything else show!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· ·

Hello, my name is Jayne Corso and I work with Dan McDonnell as a Community Manager for AEA. As a frequent social media user, and one of the voices behind @aeaweb, I am always searching for new tools that can organize my social media feeds and help me stay up-to-date on the latest conversations, topics, and hashtags surrounding the evaluation community.

HootSuite is my primary tool for monitoring industry news and evaluating our social media posts. The ease of access to industry information that this tool provides makes research much more effective – and easy!

Rad Resource: Manage your Social Media Accounts Through the HootSuite Dashboard

Each HootSuite user has a personal dashboard, which can be customized to fit posting or research needs. The dashboard can manage multiple platforms, including Twitter, Facebook, LinkedIn, and WordPress, creating separate tabs for each platform. Each tab can be customized with ‘streams’ (feeds, keyword searches, lists, etc.) so you can curate the most relevant information on one screen.

This is a great way to see how the evaluation community is engaging with @aeaweb’s daily Tweets. The different streams help better identify good times to share posts, what content is most popular, and the best ways to present information. Using these insights, AEA seeks to better connect with the evaluation community on Twitter and other social media channels.

Here is a quick-and-easy guide to adding tabs and streams to your dashboard.

Rad Resource: Using Hashtags and Keywords to Follow the Conversation

HootSuite is an excellent resource for staying connected with other evaluators on social media and joining evaluation-related conversations. Add streams to your dashboard that follow keywords or hashtags and HootSuite will search social platforms for the most recent and relevant posts. This is where you come in – jump in, and say hello! Offer your thoughts, insights, and experience to add value to one of the many conversations that are happening. You may just meet some new friends!

Choosing your hashtags depends on the topics you are interested in, be it evaluation (#eval), data visualization (#dataviz), or even helpful Excel tips (#exceltip). Hashtags also allow you to follow along with industry events like AEA’s Evaluation 2014. By adding #Eval14 as a stream to your dashboard, you’ll see the most recent tweets about the event and what other evaluators are saying about it.
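To picture what a keyword or hashtag stream is doing behind the scenes, here is a minimal, hypothetical Python sketch (the posts and the stream function are invented for illustration; HootSuite handles this filtering for you inside the dashboard):

```python
# Illustrative only: a toy "stream" that mimics what a HootSuite keyword/hashtag
# stream does conceptually. The post data here is made up.

posts = [
    {"author": "@aeaweb", "text": "Join us at Evaluation 2014! #Eval14 #eval"},
    {"author": "@dataviz_fan", "text": "New chart redesign tips #dataviz"},
    {"author": "@evaluator_jo", "text": "Counting down to #Eval14 workshops"},
]

def stream(posts, hashtag):
    """Return the posts mentioning a hashtag, preserving their original order."""
    tag = hashtag.lower()
    return [p for p in posts if tag in p["text"].lower()]

for post in stream(posts, "#Eval14"):
    print(post["author"], "-", post["text"])
```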

Want to learn more? Here’s a helpful resource from Fresh View Concepts on how to set up your HootSuite Dashboard.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · · ·

My name is Neha Karkara, and I’m an independent consultant working with UN agencies. I specialize in developing and implementing human-rights based advocacy, knowledge management and communications strategies. Recently I worked with EvalPartners to develop a resource that will help civil society organizations (CSOs), Voluntary Organizations for Professional Evaluation (VOPEs), governments and other development partners to:

  • Learn how strategic advocacy can be leveraged to increase the demand for evaluation.
  • Acquire essential skills to become an effective advocate for building an enabling environment for evaluation.
  • Devise a long-term advocacy strategy to develop and implement equity and gender sensitive national evaluation policies and systems.
  • Respond quickly to seize any unplanned advocacy opportunity to build a culture of evaluation.

Lesson Learned: Evaluations are a means to support good governance and informed policy-making. Evaluations increase the accountability of governments to their citizens and development partners; bring transparency to the use of resources and their results; and help in learning from experience. By helping to make better decisions based on facts, evaluations can lead to more effective and efficient use of public benefits.

Rad Resource: The new publication “Advocating for Evaluation: A toolkit to develop advocacy strategies to strengthen an enabling environment for evaluation” has been jointly developed by EvalPartners, IOCE, and UN Women in partnership with UNEG, the OECD Development Assistance Committee Network on Development Evaluation, UNICEF, the Ministry for Foreign Affairs of Finland, and USAID.

The PDF is now available for free download here. An e-version of the toolkit is also available for free download here.

While technical evaluation capacities (the so-called supply side) are paramount to produce high-quality evaluative evidence, an enabling environment for evaluation is necessary to ensure it is actually used for decision-making. The toolkit contains guidance and tools on how to plan, design, implement, monitor and evaluate advocacy strategies for national evaluation policies and systems that are equity-focused and gender-responsive. The toolkit is especially designed for stakeholders with varying levels of experience, capacities and skills in strategic advocacy.

Hot Tip: Increased capacity for strategic advocacy for evaluation is especially relevant given key opportunities such as the 2015 International Year of Evaluation (EvalYear). EvalYear presents a unique opportunity to advocate for and promote evaluation and evidence-based policy-making at international, regional, national and local levels. Use this toolkit to develop an advocacy strategy for evaluation in your own organization and country, and share it with your key partners.

For more information, please contact:

  • EvalPartners Co-Chairs, Marco Segone (marco.segone@unwomen.org) and Natalia Kosheleva (nkochele@yandex.ru)
  • Advocacy Specialist, Neha Karkara (karkaraneha@gmail.com)
  • EvalYear Secretariat, Asela Kalugampitiya (aselakalugampitiya@yahoo.ie)

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello! I’m Michelle Baron, an Independent Evaluation Strategist. In my work in higher education, I’ve encountered a mixture of evaluation champions and critics. Today I’d like to address the importance of strategic planning in building a culture of evaluation.

Strategic planning is considered by many to be an organizational road map: it outlines the organizational vision and mission, establishes clear and attainable objectives and goals, and then develops processes for how to achieve them. Strategic planning and evaluation go hand in hand in moving the organization and its programs forward to benefit its stakeholders. Strategic planning is simply crucial to the evaluation process: without a road map of criteria, standards, and goals, it’s almost impossible to achieve desired success.

Evaluators have a unique role in helping organizations with both ends of the spectrum: creating a foundation through strategic planning, and then conducting evaluations to examine and monitor progress.

Hot Tip #1: Start at the top. Buy-in from top management for strategic planning is of the utmost importance for its success.

Hot Tip #2: Conduct a SWOT analysis (Strengths, Weaknesses, Opportunities, and Threats) of the entity or its programs/services. Doing so not only enlightens people to a variety of ideas and questions to consider, but can also indicate the level of support for those topics.
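If it helps to picture what such a session might produce, below is a minimal, hypothetical Python sketch of capturing and reviewing SWOT items (the entries are invented for illustration; a real analysis would come from your stakeholders):

```python
# A toy SWOT capture: four lists keyed by category, filled in during the session.
# The example entries are invented; they are not a recommended template.

swot = {
    "Strengths": ["Experienced program staff", "Strong community partnerships"],
    "Weaknesses": ["No shared data system", "Limited evaluation budget"],
    "Opportunities": ["New state grant cycle", "Interest from a local university"],
    "Threats": ["Staff turnover", "Shifting funder priorities"],
}

# A quick readout also hints at the level of support or attention each topic drew:
# lopsided counts can signal areas the group avoided or ran out of time for.
for category, items in swot.items():
    print(f"{category} ({len(items)} items)")
    for item in items:
        print(f"  - {item}")
```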

Cool Trick: Brainstorming sessions are often an excellent starting point for the organization itself or a smaller group within it. The evaluator or a designated member of the organization can facilitate the discussion by developing questions beforehand that may serve as prompts, such as those dealing with objectives, goals, and resources.

Rad Resource #1: Strategic Planning for Public & Nonprofit Organizations by John Bryson, and related books by the same author, provide the groundwork and tools necessary for organizations to develop and sustain their strategic planning process.

Rad Resource #2: The Fifth Discipline: The Art and Practice of the Learning Organization by Peter Senge helps leaders establish the foundation and philosophy behind strategic planning, and helps them develop their long-term thinking for organizational growth.

With these tools and resources, evaluators may be better prepared to assist organizations with strategic planning, and to strengthen both support for and the effectiveness of the evaluations that follow.

The American Evaluation Association is celebrating Business, Leadership and Performance (BLP) TIG Week with our colleagues in the BLP AEA Topical Interest Group. The contributions all this week to aea365 come from our BLP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

· · · ·

My name is Sean McKitrick, Vice President with the Middle States Commission on Higher Education.

In higher education settings, “assessment” can refer both to institutional research and to the assessment of student learning. It usually encompasses institutional efforts to provide accurate data and reports to oversight bodies such as federal and state governments or system offices, efforts to evaluate overall institutional effectiveness, and efforts to assess student learning. In recent years, pressure to assess has come from state and federal governments, accreditors, and a public demanding more accessible information for prospective applicants.

With regard to assessment in higher education settings, the following points, among others, appear salient:

  1. Accountability demands will only increase, but a debate is brewing about whether these demands should focus on reporting or institutional improvement. Some parties argue that accreditors should not be required to link assessment of student learning and other measures with recommendations regarding an institution’s future eligibility to dispense federal funds, while others argue that measures such as graduation rates and student salary information (in aggregate) are sufficient measures of institutional quality.
  2. Support for requiring institutions to report additional data, such as the aggregate salaries of students, engenders further debate regarding the reliability of such information. Some important questions to ask include: How effectively might institutions be able to contact students for salary information? Should the government be allowed to link federal databases in order to find such information independent of institutional involvement?
  3. The validity of assessment information continues to be debated. Graduation and retention rates are important measures of institutional effectiveness, and some argue that they can serve as proxy measures of student learning. Others argue that these measures do not directly evaluate student learning and that other measures should be used to do so, although this increases the reporting burden on institutions.
  4. Pressures to assess student learning continue. However, given a lack of a common core of learning outcomes from institution to institution, it appears that the current trend is to focus on how institutions are using assessment processes (and evaluation information) to manage and improve student learning rather than to focus solely on the measurement of outcomes.

Hot Tip: Assessment and evaluation in higher education are here to stay, but expectations are changing, both about the methods of evaluation and assessment and about what information governments and accrediting organizations expect institutions to report and use.

Rad Resource: The College Navigator site, sponsored by the National Center for Education Statistics, is the primary site where institutional data required by the U.S. Department of Education can be found: http://nces.ed.gov/collegenavigator.

The American Evaluation Association is celebrating Assessment in Higher Education (AHE) TIG Week. The contributions all this week to aea365 come from AHE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · ·

Our names are Mwarumba Mwavita, Katye Perry, and Sarah Wilkey. We are faculty and evaluators in the Center for Educational Research and Evaluation at Oklahoma State University.

Higher education is constantly engaged in developing, revamping, and implementing programs. Often these changes result in reorganizations of existing programs and contribute to a dynamic, shifting ecology, which creates a need for evaluation to determine whether outcomes are congruent or discrepant with intent. While some stakeholders are anxious about evaluation and its use, others are unaware of how it could benefit them and demonstrate the program’s overall impact. Evaluating such programs requires evaluators to assume different roles and to begin building evaluation capacity with program personnel. The challenge is HOW?

Hot Tip 1: Understand that you are engaging in a discussion about evaluation with those who may not understand evaluation.

Speak in language that is not intimidating and is clear enough to explain what evaluation is. Introduce yourself and explain your role; do what you can to build rapport. Help those you are working with understand that the goal of evaluation is to gather information that will lead to sound decision-making, not to punish or find fault.

Hot Tip 2: Determine the unique contribution the service/program you are evaluating makes to the institution.

Let the program personnel know that you understand the university environment is dynamic and their program may be in flux. Talk with stakeholders and program personnel to identify the goals of the program being evaluated; this will help you understand how the program fits into the university at large. Take time and care to look for discrepancies between words and actions, understanding the difference between what program personnel and patrons say they do and what they actually do. Determine the hierarchy and structure of the program, and ask yourself, ‘Who is really in charge?’

Hot Tip 3: Determine what information the program personnel/stakeholders expect the evaluation to yield AND when they expect a final write up of findings.

Knowing what is expected of the evaluation will help you determine who needs to be on your evaluation team; be sure to include people with the needed skills and expertise in both evaluation and the institution. Understand who the critical stakeholders of the program are and the roles they play. This will also help you understand the best way to collect and present information.

Rad Resources: We have found the books Evaluative Inquiry for Learning in Organizations and Program Evaluation: Alternative Approaches and Practical Guidelines to be very helpful in our work.

The American Evaluation Association is celebrating Assessment in Higher Education (AHE) TIG Week. The contributions all this week to aea365 come from AHE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

· ·

My name is John LaVelle, and I am the Director of Operations and External Affairs for the School of Behavioral & Organizational Sciences and the School of Politics & Economics at Claremont Graduate University. I’m also a PhD candidate specializing in Evaluation and Applied Research Methods, and a past curator of the aea365 blog. Today I’m going to share three Rad Resources about organizations whose work overlaps with evaluation and that maintain active listservs as a way of sharing ideas and resources.

Rad Resource: The Society for Community Research and Action. Several posts have been written about the Society for Community Research and Action, also known as Division 27 of the American Psychological Association. It is dedicated to advancing theory, research, and social action, and much of its work overlaps with evaluation. SCRA maintains a very active listserv (similar to EvalTalk) and regularly shares information about conferences and professional development, research and work opportunities, as well as professional challenges and questions.

Rad Resource: ARNOVA is the Association for Research on Non-Profit Organizations and Voluntary Action. Many evaluators work in a not-for-profit context or with organizations trying to increase volunteer and philanthropic activity, and ARNOVA is a good resource. Similar to AEA’s TIG structure, ARNOVA has areas of specialization, such as community grassroots, social entrepreneurship, and even a section for individuals working at the intersection of practice and academics (they call this intersection “pracademics”). ARNOVA’s listserv and webpages have much information about resources, conferences and meetings, calls for papers, etc.

Rad Resource: CBPR stands for Community Based Participatory Research (see Laura Myerchin Skarloff’s excellent post) and the University of Washington hosts a very useful listserv for sharing ideas and problem-solving.  They even offer a parallel listserv just for sharing work and research opportunities.  Membership information is available here.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

·

Hello! Sheila B. Robinson here, guest posting for Susan Kistler, our regular Saturday contributor. I work in PK12 education at Greece Central School District, and in higher education at the University of Rochester’s Warner School of Education. As aea365’s current Lead Volunteer Curator, I’ve had the pleasure of working with a number of groups – American Evaluation Association Topical Interest Groups (AEA TIGs), AEA Affiliates, and other groups that are united by evaluation practice in various contexts.

Hot Tip: Leave no stone unturned! In other words, don’t skip entire weeks. You can learn a lot even when a sponsored week’s group name doesn’t resonate with you. During sponsored weeks, you can read about how evaluators in different contexts from your own have grappled with evaluation challenges, learned something from working in diverse communities, or tried new technologies to enhance their evaluation practice and are now willing to share their experiences with all of us.

Hot Tip: Dig for enticing artifacts! Look for posts with content that transcends the focus of the sponsored week. For example, while I am not an environmental program evaluator, nor do I evaluate extension education programs, I found these two gems during sponsored weeks:

  • In this post, Sara El Choufi shared resources for learning Excel during the Environmental Program Evaluation (EPE TIG) sponsored week.
  • In this post, Melissa Cater shared information on creating a Community of Practice during Extension Education Evaluation (EEE TIG) week.

Lesson Learned: While our sponsored week authors may share evaluation foci with each other, they offer Hot Tips, Cool Tricks, Lessons Learned, and Rad Resources that appeal to and can be educative for a broad range of evaluators.

 

Cool Trick: Get your hands dirty! Sift through the archive and unearth your own gems in sponsored (and non-sponsored!) weeks.

Lesson Learned: Many sponsored weeks have themes that cut across evaluation contexts. In addition to TIG-sponsored weeks, we’ve hosted Cultural Competence Week, Innovative #Eval Week, Video in #Eval Week, AEA affiliate weeks, Bloggers Series Weeks, and Local Area Working Group Weeks, among others.

Rad Resource: History in the making: Check out aea365 and our archive for a list of over 1000 nuggets of evaluation wisdom from hundreds of authors. With about 70 sponsored weeks on aea365, there’s a lot to learn! So, get into comfortable clothes, get your virtual trowel, sieve, and brush and get your read on!

 

 

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

· ·

Hi! I’m Sheila B. Robinson, AEA365’s Lead Curator. I’m also an educator with Greece Central School District, and the University of Rochester’s Warner School of Education.

Today, I’ll share lessons learned about evaluation planning and a fabulous way to get ready for summer (learning about evaluation, of course!).

Rudyard Kipling wrote, “I keep six honest serving-men (They taught me all I knew); Their names are What and Why and When and How and Where and Who.”

The “5 Ws and an H” have been used by journalists, researchers, police investigators, and teachers (among many others, I’m sure) to understand and analyze a process, problem, or project. Evaluators can use them to frame evaluation planning as well.

Lesson Learned: Use these questions to create an outline of an evaluation plan:

What: What is your evaluand and what is the focus of the evaluation? What aspects of the program (or policy) will and will NOT be evaluated at this time? What programmatic (or policy) decisions might be made based on these evaluation results? What evaluation approach(es) will be used?

Why: Why is the evaluation being conducted? Why now?

When: When will the evaluation begin and end? When will data be collected? When are interim and final reports (or other deliverables) due?

How: How will the evaluation be conducted? How will data be collected and analyzed? How will reports (or other deliverables) be formatted (i.e. formal reports, slides, podcasts, etc.) and how will these (and other information) be disseminated?

Where: Where is the program located (not only geographic location, but also where in terms of contexts – political, social, economic, etc.)?

Who: Who is the program’s target population? Who are your clients, stakeholders, and audience? Who will be part of the evaluation team? Who will locate or develop measurement instruments? Who will provide data? Who will collect and analyze data and prepare deliverables? Who are the primary intended users of the evaluation? Who will potentially make decisions based on these evaluation results?
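For readers who like working from a template, here is a minimal, hypothetical Python sketch of one way to capture these questions as a reusable plan outline (the field names and example entries are mine, for illustration only, not a standard template):

```python
# A toy evaluation-plan outline built from the 5 Ws and an H.
# Field names and example entries are illustrative, not a prescribed format.

from dataclasses import dataclass, field

@dataclass
class EvaluationPlanOutline:
    what: list[str] = field(default_factory=list)   # evaluand, focus, approach
    why: list[str] = field(default_factory=list)    # purpose, timing
    when: list[str] = field(default_factory=list)   # start/end, data collection, deliverables
    how: list[str] = field(default_factory=list)    # methods, analysis, reporting formats
    where: list[str] = field(default_factory=list)  # geographic and contextual location
    who: list[str] = field(default_factory=list)    # stakeholders, team, intended users

plan = EvaluationPlanOutline(
    what=["After-school literacy program; focus on implementation fidelity"],
    why=["Inform next year’s funding decision"],
    when=["Data collection in fall; interim report in January"],
    how=["Teacher interviews and attendance records; slide deck for the board"],
    where=["Three elementary schools; district under budget pressure"],
    who=["Program director and school principals as primary intended users"],
)

# Print the outline, flagging any question that still needs an answer.
for question, answers in vars(plan).items():
    print(f"{question.upper()}: {'; '.join(answers) or 'TBD'}")
```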

Can you think of other questions? I’m sure there are many more! Please add them in the comments.

Hot Tip: Register for the American Evaluation Association’s Summer Evaluation Institute June 2-5, 2013 in Atlanta, GA to learn more about 20+ evaluation-related topics.

 


Hot Tip: Want to learn more about evaluation planning? Take my Summer Institute course It’s not the plan, it’s the planning (read the description here).

Rad Resource: Susan Kistler highlighted a few institute offerings here.

Rad Resource: The course Every Picture Tells a Story: Flow Charts, Logic Models, LogFrames, Etc. What They Are and When to Use Them, taught by Thomas Chapel, Chief Evaluation Officer at the Centers for Disease Control and Prevention, sounds exciting to me. Read the description here.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

·
