AEA365 | A Tip-a-Day by and for Evaluators


Hi, my name is Jayne Corso, and I am the Community Manager for AEA. Evaluation 2017 officially kicked off yesterday! And we had a great start with the opening plenary session from AEA president Kathy Newcomer. In her presentation, Kathy discussed the challenges evaluators are facing and how we can overcome those challenges to push evaluation and the evaluation profession forward.

Judging by the Twitter response on #Eval17, her message was heard loud and clear. I want to share just a few of the conversations that were taking place online.

We look forward to more great plenary sessions! Keep tweeting.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· ·

Hi, my name is Jayne Corso, and I am the Community Manager for AEA. There are many reasons to start blogging: to share your work and strategies for evaluations; to become an evaluation thought leader; to become a stronger writer and explain your thoughts—the reasons can be endless. I have compiled a few tips to help you create an effective blog that resonates with your followers.

Creating a Blog

Hot Tip: Content

First, identify themes, concepts, or trends that relate to your audience or other evaluators. What topics will you highlight in your blog and how will your blog stand out? For example, will your blog focus entirely on data visualization, or trends in evaluation? Once this is decided you can start working on the details.

Next, decide how often you are going to blog. Is your blog going to be a daily blog, weekly blog, or monthly blog? When making this decision, you must look at your content resources and your available time. What can you commit to, and how and from what sources are you going to gather your content?

Hot Tip: Writing

When writing a blog, you want to be aware of tone, length, and formatting. Write in a conversational tone, using personal pronouns whenever possible. You also don't want your posts to be too long; typically, a blog post is 1,000 words or fewer. In addition, break up long stretches of text. Try bullet points, numbered lists, or visuals to make your post more interesting.

Hot Tip: Call to Action

An important aspect of blogging is starting a conversation and obtaining your followers' feedback. Invite your followers to provide their opinions or questions in the comments. This gives your post a longer shelf life and helps you engage with other evaluators.

I look forward to reading your blogs on evaluation! Please share your tips or questions in the comments.


· · ·

Hi, my name is Jayne Corso, and I am the Community Manager for AEA. Facebook is a great tool for reaching other evaluation professionals. The platform makes it easy to share relevant articles, videos, and thoughts with your followers. However, it can be difficult to get your posts into your followers' newsfeeds, because Facebook shows any given post to only a fraction of your followers (often estimated at around 12%). You can increase this percentage by writing effective and engaging Facebook posts!

Hot Tip: Keep your post short

Facebook posts should be one to three sentences. If a post is too long, a "see more" button will appear, and nine times out of ten Facebook users will not click that button to read the rest of your content. Keeping your posts short and sweet can make a big difference in engagement.

Hot Tip: Ask your followers to interact

Asking your followers to comment encourages engagement and involvement. You can use this tactic by stating “like this post if you agree” or “share your thoughts in the comments”. Another way to encourage engagement is to ask your followers for advice. This tactic often starts a discussion on your page.

Hot Tip: Make your links compelling

When posting a link to an article on your Facebook page, make sure the link has a compelling photo and interesting title. These are editable fields, meaning you can customize how your link appears. Sometimes links can pull titles and pictures that are not relevant to your content.

Hot Tip: Use different types of posts

Mix up the content formats you are posting to your page. Use a mix of links, pictures, videos, and albums to make your page more interesting.

What are your favorite Facebook tips? Tell us in the comments!


· ·

Hi, this is Bill Fear, independent evaluator and freelancer.  Over the years my interest in evaluation has spanned a number of disciplines and sectors.  The common theme that continually emerges is one of change management and, more latterly, I have narrowed this down to organizational change.  This helps to focus the wide range of ideas, theories, and methods, and so provides a common language for senior stakeholders.

All evaluations are concerned with change, and all social interventions – at whatever level – involve some sort of organization.  All organizations have some degree of ‘management’.  Thus, learning the related language and assumptions about the change process and outcomes provides a means to frame any evaluation.

Hot Tip:  Learn at least the basics of change management/organizational change.  This will give you a vocabulary and a set of constructs that can be used with a wide range of international stakeholders.  It also provides a frame for any evaluation.

I also find it helpful to stay in touch with the day-to-day resources that senior managers will access, as these resources tend to present a credible view of social interventions.  For example, searching for ‘evaluation’ on the Financial Times website returns, among others: evaluations relating to aid for Haiti; calls for cost-benefit evaluations in the NHS (UK); evaluation by the International Rescue Committee; and the setting up of an independent evaluation unit in the Bank of England.

Hot Tip: Regularly scan the websites of reporting magazines and newspapers such as Forbes, The Economist, and the Financial Times.

Many disciplines contribute to, and practice, evaluation, and I have also found it helpful to peruse the websites of the relevant membership organizations.  Examples range from the Academy of Management through the American Psychological Association to the American Society of Anesthesiologists.

Hot Tip:  Find professional organization websites and look for ‘resources’, ‘toolkits’, ‘publications’, ‘news’, and search for ‘evaluation’ on the site.

Hot Tip:  Remember, evaluation is practiced within all disciplines, and all disciplines practice in the field.  Don’t assume that as evaluators we can just fly in, apply a method, analyze the data, and fly out with total objectivity and impunity.  We are not alone.

As I said earlier, I find it useful to frame these seemingly disparate approaches using either change management or organizational change.

Rad Resource: Burnes, B. (2014).  Managing Change.  Pearson Education; Tolbert, P. S. and Hall, R. H. (2008).  Organizations: Structures, Processes and Outcomes.  Pearson.

Rad Resource: For a UK slant http://www.businessballs.com/changemanagement.htm and for a more USA slant http://managementhelp.org/organizationalchange/.

Final Hot Tip: The problem with having an open mind is that people keep on wanting to put things into it.


· ·

Hi, I’m Jen Przewoznik, Director of Prevention and Evaluation at the North Carolina Coalition Against Sexual Assault. I have been working with and within lesbian, gay, bisexual, transgender, queer, and intersex (LGBTQI+) communities for 15 years. I’d like to share some thoughts about conducting research with and within LGBTQI+ communities that I have learned, using as an example a current study I am co-investigating.

Research with and within LGBTQI+ communities has happened for decades. More and more of this research is conducted by people who are well trained in data collection and analysis regarding people who claim non-normative sexual and gender identities. Unfortunately, a lot of this research still misses the mark. Some agenda-driven researchers “miss the mark” because they are actively trying to defame LGBTQI+ people. Most studies, however, seem to miss the mark due to fundamental design flaws.  There are still measurement tools being created (maybe right now?!?! Let’s hope not right now) that conflate sexual orientation and gender identity.

Hot Tip: Friends don’t let friends conflate sexual orientation and gender identity. I know you wouldn’t do this, but if you see a researcher doing this, please tell them to stop.

Hot Tip: Engage BOTH LGBTQI+ people and researchers in the process of creating instruments to better understand LGBTQI+ lives and experiences.  Juliette Grimmett, NC Sexual Violence Prevention Team member, and I are collaborating with Drs. Paige Hall Smith and Leanne Royster of UNC Greensboro on a study about LGBTQI+ peoples’ experiences with sexual violence on NC college campuses.  The results will help campuses create inclusive and affirming sexual violence prevention programming. We began by holding a daylong semi-structured qualitative discussion group to engage folks in conversations about sexual violence and LGBTQI+ communities. People were chosen for their experience in sexual violence or LGBTQI+ campus work, with an emphasis on inviting people we knew to be allies and/or themselves LGBTQI+-identified.

Lessons Learned: The output from the meeting heavily informed the survey, which includes questions about sexual violence without using normative terms for body parts and allows participants to choose “all that apply” for identity questions. Our colleagues reminded us that this work can’t be as neat and tidy as it sometimes seems researchers and statisticians would like.

When we exclude necessary research elements because we do not have the knowledge or are too concerned with whether the data will be publishable (statistical significance, the enemy of robust LGBTQI+ research. Kidding. Sort of.), we are left with results that are largely unreliable. While this shouldn’t hold us back from doing this work, it is incredibly important that we continue to explore ways to ask difficult questions and analyze complex responses in order to truly understand peoples’ lived experiences.


· · ·

Hi all, we’re blogging today from the National Resource Center on Domestic Violence. Cris Sullivan is NRCDV’s Senior Research Advisor, and Annika Gifford is Senior Director of Policy and Research. Together with CEO Anne Menard, one of our projects has focused on helping domestic violence organizations evaluate how their services impact domestic violence survivors and their children.

Domestic violence (DV) programs have been undergoing scrutiny to demonstrate that they are making a significant difference in the lives of those using their services. Increasingly, funders are expecting them to demonstrate that their efforts are resulting in positive outcomes for survivors.

In addition to the issues facing all nonprofits trying to evaluate their impact (e.g., little to no money, time or expertise), DV programs have the following additional factors to consider:

  • They are often working with people in crisis who may not be in a space to engage in program evaluation.
  • They have to consider safety and confidentiality of the people with whom they work (so, for example, cannot contact people later through mail).
  • Some funders expect DV programs to have unrealistic or even victim-blaming outcomes (e.g., “victims will leave the relationship”).
  • DV programs recognize that each survivor seeking help has their own individual needs, life experiences, and concerns. Services are tailored to each person, making program evaluation that much more difficult.

Rad Resource: To help domestic violence programs evaluate their work on their own terms — and with no extra money or time — we have created an online resource center that houses a great deal of free and accessible resources.

Among other things, The DV Evidence Project houses a theory of change that programs can use to demonstrate the process through which their services result in long-term benefits for survivors and their children. The site also provides brief summaries of the evidence behind shelters, advocacy, support groups and counseling (demonstrating that programs are engaged in “evidence-based practice”). Finally, evaluation tools are provided so that programs don’t need to re-invent the wheel.  These evaluation tools include client surveys, tips for engaging staff in evaluation, strategies for gathering the data in sensitive ways, and protocols for interpreting and using the findings. We hope these resources are helpful to those in the field doing this incredibly important work!


· ·

Hi there, Liz Zadnik here, new(ish) member of the aea365 curating team and sometimes Saturday poster. Last year Sheila posed the question: What is it that YOU would like to read about on this blog?

One of the responses resonated with me, as it represented my relationship with evaluation as a professional:

I would love to see a post, or series of posts about evaluation from the perspective of practitioners for whom their primary job is not evaluation. Perhaps tips on how to best integrate evaluation into the myriad of other, seemingly more pressing, tasks without pushing it to the back burner.

I work in the anti-sexual violence movement at a state coalition, focusing on prevention strategies, training, and making community-based rape crisis centers accessible to people with disabilities. These three areas are my priorities – there are deliverables and activities that don’t always include evaluation and assessment. Many times – given my love of evaluation – I am the sole voice at the table asking about an evaluation plan. Most of the time we can weave evaluation in from the ground floor; other times it happens a little late(r).

Hot Tip: Ask this (or a similar) question: “How will we know we’ve been successful?” This is the most effective way I have found to help get people thinking about evaluation. It has started some of the most engaging and enlightening conversations I’ve ever had, both about a project and the work of the movement.

Lesson Learned: Sometimes, evaluation takes a backseat to program implementation and grant deliverables. This can be disappointing (to say the least), but I do see a change. Funders are more frequently asking for research, “evidence,” or assessment findings, giving evaluation enthusiasts (like me) opportunities to engage our colleagues in this work.

Lesson Learned: Practice and challenge yourself, even if no one is ever going to see it. One of the ways I “integrate evaluation into the myriad of other, seemingly more pressing, tasks” is evaluating myself and my own performance. I regularly incorporate evaluative questions into training feedback forms, look for ways to assess the effectiveness of my technical assistance provision, and record my professional progress throughout the year. I sit in on as many AEA Coffee Break webinars and other learning opportunities as I can, always practicing the skills discussed and looking for ways to apply them to my work.

I would so appreciate hearing from other practitioners (and evaluators!) about their experiences infusing evaluation into their work. I’d also be happy to answer any questions you might have or write about specific projects in the future. Let me know – the aea365 team is here to please!


· ·

I am Anna Douglas, and I conduct evaluation and assessment research with Purdue University’s Institute for Precollege Engineering Research, also known as INSPIRE. This post is about finding and selecting assessments to use in evaluating engineering education programs.

Recent years have seen an increase in science, technology, engineering, and mathematics (STEM) education initiatives and emphasis on bringing engineering learning opportunities to students of all ages. However, in my experience, it can be difficult for evaluators to locate assessments related to learning or attitudes about engineering. When STEM assessment instruments are found, oftentimes they do not include anything specifically about engineering. Fortunately, there are some places devoted specifically to engineering education assessment and evaluation.

Rad Resource: INSPIRE has an Assessment Center website, which provides access to engineering education assessment instruments and makes the evidence for validity publicly available. In addition, INSPIRE has links to other assessment resources, such as Assessing Women and Men in Engineering, a program affiliated with Penn State University.

Rad Resource: ASSESS Engineering Education is a search engine for engineering education assessment instruments.

If you don’t find what you are looking for in the INSPIRE, AWE, or ASSESS databases, help may still be available.

Lesson Learned #1: If it is important enough to be measured for our project, someone has probably measured it (or something similar) before. Even though evaluators may not have access to engineering education or other educational journals, one place to search is Google Scholar, using keywords related to what you are looking for.  This helps to 1) locate research being conducted in a similar engineering education area (the researchers may have used some type of assessment) and 2) locate published instruments, which one would expect to have some degree of validity evidence.

Lesson Learned #2: People who develop surveys generally like others to use them. It’s a compliment. It is OK to contact the authors for permission to use the survey and the validity evidence collected, even if you cannot access the article.  At INSPIRE, we are constantly involved in the assessment development process. When someone contacts us to use an instrument, we view that as a “win-win”: the evaluator gets a tool, our instrument gets used, and with the sharing of data and/or results, we can get further information about how the instrument functions in different settings.

Lesson Learned #3: STEM evaluators are in this together. Another great way to locate assessment instruments is to post to the STEM RIG on LinkedIn, or to pose the question to the EvalTalk listserv. This goes back to Lesson Learned #1: most of the important outcomes are being measured by others.


· ·

I’m Jennifer Grove, Prevention Outreach Coordinator at the National Sexual Violence Resource Center (NSVRC), a technical assistance provider for anti-sexual violence programs throughout the country.  I’ve worked in this movement for nearly 17 years, but when it comes to evaluation work, I’m a newbie.  Evaluation has been an area of interest for programs for several years now, as many non-profit organizations are tasked with showing funders that sexual violence prevention work is valuable.  But how do you provide resources and training on a subject that you don’t quite understand yourself?  Here are a few of the lessons I’ve learned on my journey so far.

Lesson Learned: An organizational commitment to evaluation is vital.  I’ve seen programs that say they are committed to evaluation simply hire an evaluator to do the work.  This approach is shortsighted.  When an organization invests all of its time and energy in one person doing all of the work, what happens when that person leaves?  We like to think of evaluation as long-term and integrated into every aspect of an organization.  Here at the NSVRC, we developed a Core Evaluation Team made up of staff who care about or are responsible for evaluation. We contracted with an evaluator to provide training, guide us through hands-on evaluation projects, and advise the Team over the course of a few years.  We are now two years into the process, and while there have been some staffing changes that resulted in changes to the Team structure, efforts have continued without interruption.

Lesson Learned: Evaluation capacity-building takes time.  We received training on the various aspects of evaluation and engaged in an internal evaluation project (complete with logic model, interview protocol, coding, and final report).  According to the timeline we developed at the beginning of the process, this should have taken about eight months.  In reality, it took over twelve.  The lesson learned here is this: most organizations do not have the luxury of stopping operations so that staff can spend all of their time training and building their evaluation skills.  The capacity-building work happens in conjunction with all of the other work the organization is tasked with completing. Flexibility is key.

Hot Tip: Share what you’ve learned.  The most important part of this experience is being able to share what we are learning with others.  As we move through our evaluation trainings, we are capturing our lessons learned and collecting evaluation resources so that we can share them with others in the course of our technical assistance and resource provision.

Rad Resource: Check out an online learning course developed by the NSVRC, Evaluating Sexual Violence Prevention Programs: Steps and strategies for preventionists.


· ·

Hi all!  Liz Zadnik here, aea365 Outreach Coordinator and occasional Saturday contributor.  I wanted to share some insights and reflections that resulted from a recent EVALTALK discussion thread.  Last month, someone posed the following request:

I’m searching for a “Why Evaluate” article for parents/community members/stakeholders. An article that explains in clear and plain language why organizations evaluate (particularly schools) and evaluation’s potential benefits. Any suggestions?

Rad Resources: Others were kind enough to share resources, including a slideshare deck that walks through some language and reasoning for program evaluation and assessment, as well as book recommendations.  There is also a very helpful list from PlainLanguage.gov offering possible replacements for commonly used words.  (Even the headings – “Instead of…” and “Try…” – make the shift seem much more manageable.)

Lessons Learned: Making evaluation accessible and understandable requires tapping into an emotional and experiential core.

  • Think about never actually saying “evaluate” or “evaluation.”  It’s OK not to use phrases or terms if they are obstacles for engaging people in the evaluation process.  If “capturing impact,” “painting a picture,” “tracking progress” or any other combination of words works…use it!  It may be helpful to talk with interested or enthusiastic community members about what they think of evaluation and what it means to them.  This helps gain insight into relevant language and framing for future discussions.
  • Have the group brainstorm potential benefits, rather than listing them yourself.  Similar to engaging community members in a discussion of the “how,” ask them what they feel is the “why” of evaluation.  I have heard the most amazing and insightful responses when I have done this with organizations and community members.  Ask the group, “What can we do with the information we get from this question/item/approach?” and see what happens!
  • Evaluation is about being responsible and accountable.  For me, program evaluation and assessment is about ethical practice and stewardship of resources.  I have found community members and colleagues receptive when I frame evaluation as a way to make sure we are doing what we say we’re doing – that we are being transparent, accountable, and clear on our expectations and use of funds.

We’d love to hear how others in the aea365 readership are engaging communities in accessible conversations about evaluation.  Share your tips and resources in the comments section!


· ·
