AEA365 | A Tip-a-Day by and for Evaluators


Hi, I am Jennifer Johnson. I am the Director of the Division of Public Health Statistics and Performance Management for the Florida Department of Health. I want to discuss how improving stakeholder relationships can improve data collection.

In most evaluations, collection of quantitative and qualitative data forms a critical aspect of stakeholder engagement and relationships. Methods for collecting both types of data can include structured interviews, surveys, and file reviews. Evaluators also analyze data sets that vary in the number and types of variables and in format.

Ultimately, however, key stakeholders provide the data. Thus, effective relationships with key stakeholders can be the lifeline to the data upon which a strong evaluation depends.

Whether participation is voluntary or contractually required, evaluators can adopt practices throughout evaluations that enhance stakeholder engagement specific to data collection. These practices foster effective and clear communication and help evaluators to establish trust.

Hot Tips:

  1. Communicate with Leadership. Initiate engagement with the executive leadership of stakeholder organizations, unless the evaluator has already identified specific individuals. Give stakeholder leadership the opportunity to establish parameters and requests for communication throughout the evaluation. These parameters should identify the individuals or groups to keep informed at all times. Follow up by clarifying the rules of engagement, and ensure that members of the evaluation team follow this agreement.
  2. Communicate Early. Be forthcoming and transparent from the beginning. Clearly communicate the evaluation scope at initial meetings. Specify the data and data collection methods that the evaluator may request from stakeholders. Inform stakeholders at this stage whether they will have an opportunity to review and discuss preliminary results and conclusions based on their data.
  3. Communicate Specifics. Develop clear and thorough processes for collecting data. Develop and submit data requests that clearly articulate and specify the requested data and information. Include specific variables when requesting databases, and provide specific, clear instructions for submitting data. Offer an easy and convenient method for feedback and questions. Set reasonable deadlines and consider stakeholder organizational factors, such as crunch times, staffing, and workload issues. If possible, modify data requests based on extenuating circumstances or to ease the burden on the stakeholder.
  4. Communicate Strategically. Data exchange goes in both directions. Identify opportunities to answer stakeholder questions or provide information. Share results and information that could benefit stakeholders, but only if that sharing does not compromise the evaluation or consume additional resources. This could include information that helps stakeholders address organizational problems or improve performance.

 

The American Evaluation Association is celebrating Southeast Evaluation Association (SEA) Affiliate Professional Development Week with our colleagues in the SEA Affiliate. The contributions all this week to aea365 come from SEA Affiliate members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· ·

Greetings, AEA365 Readers! I am Dr. Nancy Bridier, Senior Doctoral Adjunct at Grand Canyon University, Public Sector Representative, and Board Member for the Southeast Evaluation Association (SEA). I am also an Independent Consultant based in the Florida panhandle. Communication with our clients is part of our practice, but are we communicating effectively? I would like to share tips for effective stakeholder communication.

Rad Resource: Stakeholders are not just those who contract our services, but may also include those affected by the program. This may depend on their relationship to and interest in the program. Explore the University of Kansas Community Toolbox checklist for identifying stakeholders.

Hot Tips:

  • What to communicate and why: Effective communication is not just about the technology we use, but its purpose. I have emailed written reports and presented PowerPoint slides to communicate findings. While these are commonly used tools, they are not always effective for every stakeholder. Understand the type of information stakeholders want and how they prefer to receive it. It may be text, numbers, graphics (charts, tables), visuals, or a combination. If your stakeholders are in a different area, a web conferencing tool, such as Zoom or WebEx, is a great interactive way to communicate. These tools allow stakeholders to ask questions and receive immediate answers, and they give you the opportunity to observe stakeholder reactions.
  • When to communicate: Effective communication begins with the initial meeting. Establish a clear outline of the stakeholders’ purpose, questions, timelines, and communication processes. Communicate throughout the project to ensure nothing has changed. Engage stakeholders in decision-making and inform them of progress. BetterEvaluation.org offers some great tips, tools, and methods for communicating findings to stakeholders after the evaluation is completed.
  • Considerations: Some evaluators invite stakeholders to review a draft report as part of their communicating and reporting strategy. Before engaging in this practice, consider the costs and ethical implications of accepting a stakeholder’s revisions to a draft evaluation report.
  • Communicating findings: Share the procedures and lessons learned. Know your stakeholders to convey information effectively. Define terminology. Avoid using jargon. Demonstrate results and accountability. Focus on success and improvement. Outline changes to the program to improve outcomes.

Lessons Learned:

On my first program evaluation, I failed to establish communication guidelines with the primary stakeholder. During an eight-week parent education program, the stakeholder changed the assessment instrument based on responses to the pretest. Needless to say, we had to complete more than one cycle of the program to establish a baseline for comparison. Let your stakeholders know communication is a collaborative process. Inform them about the type of information that you need and the steps of the evaluation process.


·

Hello! We are Libby Smith and Levi Roth. We are both Project Managers at the Applied Research Center at the University of Wisconsin-Stout. We recently wrapped up work on the largest project of our professional lives. The INTERFACE Project included all 16 Wisconsin Technical Colleges, and as you can guess, each college had a diverse project team with different communication needs. This 4-year grant required constant communication with project teams on a variety of levels, whether we were communicating data requirements, project updates, or documentation requests. We learned a lot about effective communication with a large stakeholder community, and we want to share some hot tips with you!

Hot Tips:

  • Webinars and conference calls are your friends. With stakeholders in every corner of the state, holding regular meetings virtually was critical to ensuring everyone was on the same page. We met face-to-face twice yearly, but taking the time to communicate information “in person” via webinars and conference calls helped build relationships and ensured everyone had an opportunity to ask questions about complicated data-gathering guidelines. This way we could answer questions for the entire group in a concise manner, plus we were able to record and archive our conversations.
  • FAQs and standardized templates are powerful tools. We quickly realized how important it was to create a standardized way of doing things that everyone involved adopts. When working with a variety of stakeholders, they may be reporting the same information but collecting it in different ways. Templates were a necessity, and FAQs cleared up questions quickly. A common document that walks through questions or issues helps alleviate confusion without filling up your email inbox.
  • Build relationships with your clients. This idea may seem obvious, but amid busy schedules it can often be overlooked. In conjunction with our regularly scheduled webinars and conference calls, we made it a priority to also meet with our stakeholders at their college twice a year. These meetings were crucial to building and maintaining healthy client relationships over the course of the project. They improved buy-in and commitment. Your clients might also begin to view your webinars and conference calls as welcome constructive conversations instead of nuisances.

These are just a few hot tips we wanted to share to help improve others’ client communications or, at the very least, get you thinking about how effectively you communicate with your clients. Let us know if you have any additional hot tips for effective communication!

The American Evaluation Association is celebrating ¡Milwaukee Evaluation! Week with our colleagues in the Wisconsin statewide AEA Affiliate. The contributions all this week to aea365 come from our ¡Milwaukee Evaluation! members.

We’re Ningqin Wu and Amy Chen, both coordinators at the Asia-Pacific Finance and Development Institute (AFDI) in Shanghai, China. AFDI is a member of the CLEAR Initiative (Centers for Learning on Evaluation and Results) and hosts the East Asia CLEAR Center. CLEAR promotes evaluation capacity building in regional centers across the globe. This week’s blogs are by CLEAR members.

Much of the work at our center involves training, with participants coming from across the globe, but especially from China and other parts of Asia. We’d been looking for an easy way to stay in touch with participants before, during, and after courses. We turned to a popular instant messaging service – in our case WeChat – to serve as our main connecting tool with course participants. Below we share more about how we use it.

WeChat – like many other similar apps – is a powerful mobile communication tool to connect users across the globe. It supports sending voice, video, photo, and text messages. We can chat in Chinese with our Chinese participants, and in English with our international participants. We mainly use it to build “mobile learning communities” with members of each of our courses, such as our annual course, SHIPDET – the Shanghai International Program for Development Evaluation Training.

  • Before courses, we send detailed instructions on how to install the app and invite participants to join. We send logistics details and reminders on deadlines. If participants have any questions, they are able to connect to us directly – and the group can see responses which can be helpful for all to read.
  • During the class, we and the instructors share files and other relevant information in our groups. This supports their learning after the training is over. The participants use it to plan social outings and share community info. We also share end-of-course evaluation links through the app so participants can complete course surveys.
  • After the courses and when participants return to work, we use WeChat to stay connected and promote upcoming courses among those alumni. We share resources – such as links to new publications or conferences – with the participants. We’ve found that if instructors are active users, the groups will tend to stay more connected.

Hot Tip:

  • Remember that not everyone has a smartphone or feels comfortable connecting in a group. So make provisions – such as sending information via email – for those who prefer not to participate through instant messaging.

Rad Resources:

  • Different apps are more popular in some regions than others. So explore what people in your region might be using such as WhatsApp, iMessage and others.

The American Evaluation Association is celebrating Centers for Learning on Evaluation and Results (CLEAR) week. The contributions all this week to aea365 come from members of CLEAR. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· ·

Hi all! Liz Zadnik here, aea365 Outreach Coordinator and occasional Saturday Contributor. I wanted to share some insights and reflections I had as the result of a recent EVALTALK discussion thread. Last month, someone posed the following request:

I’m searching for a “Why Evaluate” article for parents/community members/stakeholders. An article that explains in clear and plain language why organizations evaluate (particularly schools) and evaluation’s potential benefits. Any suggestions?

Rad Resources: Others were kind enough to share resources, including this slideshare deck that moves through some language and reasoning for program evaluation and assessment, as well as book recommendations. There is also a very helpful list from PlainLanguage.gov offering possible replacements for commonly used words. (Even the headings – “Instead of…” and “Try…” – make the shift seem much more manageable.)

Lessons Learned: Making evaluation accessible and understandable requires tapping into an emotional and experiential core.

  • Think about never actually saying “evaluate” or “evaluation.”  It’s OK not to use phrases or terms if they are obstacles for engaging people in the evaluation process.  If “capturing impact,” “painting a picture,” “tracking progress” or any other combination of words works…use it!  It may be helpful to talk with interested or enthusiastic community members about what they think of evaluation and what it means to them.  This helps gain insight into relevant language and framing for future discussions.
  • Have the group brainstorm potential benefits, rather than listing them for them. Similar to engaging community members in discussing the “how,” ask them what they feel is the “why” of evaluation. I have heard the most amazing and insightful responses when I have done this with organizations and community members. Ask the group “What can we do with the information we get from this question/item/approach?” and see what happens!
  • Evaluation is about being responsible and accountable.  For me, program evaluation and assessment is about ethical practice and stewardship of resources.  I have found community members and colleagues receptive when I frame evaluation as a way to make sure we are doing what we say we’re doing – that we are being transparent, accountable, and clear on our expectations and use of funds.

We’d love to hear how others in the aea365 readership are engaging communities in accessible conversations about evaluation.  Share your tips and resources in the comments section!


· ·

Hello! I am Liz Zadnik, Capacity Building Specialist at the New Jersey Coalition Against Sexual Assault. I’m also a new member of the aea365 curating team and first-time Saturday contributor!  Over the past five years I have been working within the anti-sexual violence movement at both the state and national levels to share my enthusiasm for evaluation and support innovative community-based programs doing tremendous social change work.

Over the past five years I have been honored to work with talented evaluators and social change agents in the sexual violence prevention movement. A large part of my work has been de-mystifying evaluation and data for community-based organizations and professionals with limited academic evaluation experience.

Rad Resources: Some of my resources have come from the field of domestic and sexual violence intervention and prevention, as well as this blog! I prefer resources that offer practical application guidance and are accessible to a variety of learning styles and comfort levels. A partnership between the Resource Sharing Project and National Sexual Violence Resource Center has resulted in a fabulous toolkit looking at assessing community needs and assets. I’m a big fan of the Community Tool Box and their Evaluating the Initiative Toolkit as it offers step-by-step guidance for community-based organizations. Very similar to this is The Ohio Domestic Violence Network’s Primary Prevention of Sexual and Intimate Partner Violence Empowerment Evaluation Toolkit, which incorporates the values of the anti-sexual violence movement into prevention evaluation efforts.

Lesson Learned: Be yourself! Don’t stifle your passion or enthusiasm for evaluation and data. I made the mistake early in my technical assistance and training career of trying to fit into a role or mold I created in my head. Activists of all interests are needed to bring about social change and community wellness. Once I let my passion for evaluation show – in publications, trainings, and technical assistance – I began to see marked changes in the professionals I was working with (and myself!). I have seen myself grow as an evaluator by leaps and bounds since I made this change – so don’t be afraid to let your love of spreadsheets, interview protocols, theories of change, or anything else show!


· ·

Greetings colleagues. My moniker is Michael Quinn Patton and I do independent evaluation consulting under the name Utilization-Focused Evaluation, which just happens also to be the title of my main evaluation book, now in its 4th edition. I am a former AEA president. One of the challenges I’ve faced over the years, as many of us do, is making evaluation user-friendly, especially for non-research clients, stakeholders, and audiences. One approach that has worked well for me is using children’s stories. When people come to a meeting to work with or hear from an external evaluator, they may expect to be bored or spoken down to or frightened, but they don’t expect to be read a children’s story. It can be a great ice-breaker to set the tone for interaction.

Hot Tip: I first opened an evaluation meeting with a children’s story when facilitating a stakeholder involvement session with parents and staff for an early childhood/family education program evaluation. The trick is finding the right story for the group you’re working with and the issues that will need to be dealt with in the evaluation.

Rad Resource: Dr. Seuss stories are especially effective. The four short stories in The Sneetches and Other Stories are brief and loaded with evaluation metaphors. “What Was I Scared Of?” is about facing something alien and strange — like evaluation, or an EVALUATOR. “Too Many Daves” is about what happens when you don’t make distinctions and explains why we need to distinguish different types of evaluation. “The Zax” is about what happens when people get stuck in their own perspective and can’t see other points of view or negotiate differences. “The Sneetches” is about hierarchies and status, and can be used to open up discussions of cultural, gender, ethnic, and other stakeholder differences. I use it to tell the story, metaphorically, of the history of the qualitative-quantitative debate.

Hot Tip: Children’s stories are also great training and classroom materials to open up issues, ground those issues in a larger societal and cultural context, and stimulate creativity. Any children’s fairy tale has evaluation messages and implications.

Rad Resource: In the AEA eLibrary I’ve posted a poetic parody entitled “The Snow White Evaluation,” which opens a book I did years ago (1982) entitled Practical Evaluation (Sage, pp. 11-13). Download it here: http://ow.ly/1BgHk

Hot Tip: What we do as evaluators can be hard to explain. International evaluator Roger Miranda has written a children’s book in which a father and his daughter interact around what an evaluator does. Eva is distressed because she has trouble on career day at school describing what her dad, an evaluator, does. It’s beautifully illustrated and creatively written. I now give a copy to all my clients, and it opens up wonderful and fun dialogue about what evaluation is and what evaluators do.

Rad Resource: Eva the Evaluator by Roger Miranda. http://evatheevaluator.com/

The American Evaluation Association is celebrating Best of aea365 week. The contributions all this week are reposts of great aea365 blogs from our earlier years.

· ·

Greetings from the Last Frontier. I’m Alda Norris, webmaster for the Alaska Evaluation Network (AKEN) and evaluation specialist for the University of Alaska Fairbanks Cooperative Extension Service (CES).

The faculty and staff I work with at CES are experts in a variety of fields, from horticulture, entomology and forestry to economics, nutrition and child development. That adds up to quite an interdisciplinary organization! Our diversity makes for fantastic collaborations, as well as complicated syntheses. Lucky for me, my PhD is in interpersonal communication, which applies across the board.

Lessons Learned: Ask people to tell you the inspiration behind their projects. Every group has a story to tell. What common goals bring these people together? Inquiring about the “why” and not just the “what” of a program really benefits capacity building efforts. I got to know CES better while writing a Wikipedia entry. Hearing and reading about the contributions Extension has made in Alaska since the 1930s deepened my understanding of what led up to each of our programs’ current priorities and logic models.

  • Help yourself with history. Too often we are mired in a static view of where an organization is now, rather than having an appreciation for how it has changed, and continues to change, over time. Even in a “young” state like Alaska, there is rich historical data we can learn from.
  • Boost your evaluation planning by gathering information on your/the client organization’s “story” from a variety of sources. Talk to emeritus professors, compare the org chart of today to past decades, and comb through newspaper archives. Becoming familiar with past waves of change is very helpful in understanding the meaning behind current missions, goals and structures (and people’s attachments to them).

Hot tip: Communicate about communication! Add a question about communication preferences to your next needs assessment. Don’t assume you know what level of technology and form(s) of interaction your colleagues and clients are comfortable with. Before you do a survey, figure out what modes of communication the target population values. For example, if oral history is a large part of a sample group’s culture, how well will a paper and pencil form be received?

Rad Resources:

  1. The National Communication Association (NCA) can help you step up your message design game. Take advantage of free advice from experts on verbal and nonverbal communication by reading NCA’s newsletter, Communication Currents.
  2. AnyMeeting is a free tool that you can use to reach a wider audience. With it, you can host online meetings and make instructional videos, both of which are really handy when working in a geographically diverse setting. AnyMeeting also has screen-share clarity in its recordings that Google Hangouts lacks.

The American Evaluation Association is celebrating Alaska Evaluation Network (AKEN) Affiliate Week. The contributions all this week to aea365 come from AKEN members.

· ·

I am Efrain Gutierrez and I work for FSG, a nonprofit consulting firm that helps foundations, nonprofits and corporations increase their social impact. Last year a friend started collaborating as an evaluator for a program that works with LGBTQ youth. Before starting his evaluation he wanted to talk about cultural competency when working with the LGBTQ community. As I prepared for the meeting, I reflected on the lessons that I think would be most useful for evaluators working with this community:                    

Lessons Learned: 1. A person’s sexuality is not the only thing affecting their life. The LGBTQ community replicates the patterns of sexism, racism, and classism prevalent in our society. Problems affecting women and other underrepresented groups are also affecting members of the community. Being queer creates a “multiplier effect,” making it even more challenging for queer people to overcome social barriers, stay healthy, get an education, make a decent wage, etc. A clear example of this “multiplier effect” appears in the study All Children Matter: How Legal and Social Inequalities Hurt LGBT Families. The document shows how children across races are more likely to live in poverty if they live with a same-sex couple compared to those living with a different-sex couple (see the graph from the report below). As evaluators, it is important to account for this “multiplier effect” and be open and prepared to discuss race, sexism, class, and other social issues when engaging with LGBTQ folks.

[Graph from All Children Matter: child poverty rates, by race, for children of same-sex vs. different-sex couples]

2. Account for a diversity of voices in your evaluation; tapping only the most visible LGBTQ members might not give you the diversity needed. Since the LGBTQ movement often reproduces patterns of racial and gender separation prevalent in our society, most intellectual and political circles in the community remain predominantly cisgender, male, and white. As you determine whom to include in your evaluation, look for a representative set of members of the LGBTQ community to provide a full picture of the issues affecting the recipients of the programs you are evaluating.

3. Don’t take for granted that you understand the political context for LGBTQ rights just by reading the headlines. Marriage equality is important, but there is a wide range of challenges affecting the community that the popular narrative is not focusing on: discrimination against transgender people, violence against queers living in rural areas, and inadequate access to resources for queers with special needs, to name just a few of the issues evaluators should consider as they work with the LGBTQ community.

Rad Resources:

A Fragile Union  – article on gay politics

Allan Bérubé’s work

We’re celebrating LGBT Evaluation week with our colleagues in AEA’s LGBT Topical Interest Group.

· ·

I’m Kathleen Tinworth and I co-chair the recently re-named Arts, Culture, and Audiences TIG of AEA with Don Glass, who began this week’s AEA365 series. I lead the Audience Insights department at the Denver Museum of Nature & Science and also consult via my alter ego, ExposeYourMuseum.

Lessons Learned

  • Don started this week with a truism about evaluation in arts and cultural settings: “outcomes and outputs…are sometimes inventive, innovative, and unpredictable.”
  • Jessica Sickler provided a great anecdote of exactly that, writing about interviewing while a child tied a stuffed snake around her legs!
  • The work lends itself to creative tools, instruments, and measures—for example, the timing and tracking method outlined in Amy Grack-Nelson’s post.
  •  That said, there are often real challenges associated with defining audience outcomes, gathering data in ever-moving, highly social environments, and promoting the value of evaluation to arts and culture organizations and stakeholders, as Joe Heimlich underscored.
  • “Performing arts organizations,” Jennifer Novak-Leonard reminded us, “are in the business of transforming individuals through arts experiences, but evaluation is rarely on their radars and box office receipts and the number of ‘butts in seats’ are used as proxies of how their art impacts and transforms individual people.”

To combat the challenges above, you might assume that arts, culture, and audience evaluators have mastered creativity and innovation when it comes to reporting, presenting, and dissemination – ensuring our communication is as vivid and inspiring as the environments in which we work. Here’s a secret: we haven’t. (Just ask Stephanie Evergreen, who critiqued more museum evaluations than any person should ever have to for her PhD dissertation.) As an evaluator in this sector, and as an AEA TIG co-chair and board member of the Visitor Studies Association, prioritizing good, clean, accessible evaluation communication tops my “OMG that’s gotta change NOW” list.

Rad Resources

Thanks for joining us this week and come visit ACA sometime soon.

The American Evaluation Association is celebrating Arts, Culture, and Audiences (ACA) TIG Week. The contributions all week come from ACA members.
