AEA365 | A Tip-a-Day by and for Evaluators


This is John LaVelle, Louisiana State University, and Yuanjing Wilcox, EDucation EValuation EXchange, members of AEA’s Competencies Task Force. Task Force members recently shared the 2/24/16 draft AEA evaluator competencies in five domains: professional, methodology, context, management, and interpersonal. Feedback in coming months will enable us to finalize the set in preparation for two very important engagement activities: (1) a survey of all members to determine the extent to which they agree that these competencies are the right ones for AEA, and (2) a formal vote on the competencies, including a process for their routine revision, thereby making them an official AEA document.

Hot Tip: Keep your eyes open because the Task Force is working on creating professional development materials to enable evaluators, wherever they work, to use the competencies to reflect on their practice and to assess specific needs. We believe that it is in the reflection process that the explicit value of the competencies will shine as evaluators use them to shape effective practice. For example:

  • Novice evaluators, those entering the field who want to identify areas of strength and need for development
  • Accidental evaluators, people who may not have formal training, but who are responsible for conducting evaluations
  • Professionals in transition, such as those who may be experts in a particular field, but who want to become competent evaluators in that specific area
  • Experienced professional evaluators, who want to stay abreast of changes in the field’s practice and theory

We envision an individual assessment process similar to that used for the Essential Competencies for Program Evaluators (http://www.cehd.umn.edu/OLPD/MESI/resources/ECPESelfAssessmentInstrument709.pdf) and an interactive process that groups of evaluators (e.g., members of a firm, students in a cohort) could use to customize the competencies to their specific settings.

Lessons Learned: Feedback on the first draft of AEA competencies raised the question of the extent to which individual evaluators need to demonstrate each of the competencies, given that many evaluators work in collaborative groups. We added one competency (Interpersonal Domain 5.7) to address the fact that for many evaluators teamwork skills are essential. We believe that the question of whether the entire set of competencies should apply to individual evaluators versus teams is context-dependent; we invite people to use the competencies as suits their settings and practice.

Rad Resources: If you are interested in a quick orientation to the world of evaluator competencies, consider these four readings:

  • King, J. A., Stevahn, L., Ghere, G., & Minnema, J. (2001). Toward a taxonomy of essential evaluator competencies.  American Journal of Evaluation, 22(2), 229-247.
  • Russ-Eft, D., Bober, M. J., de la Teja, I., Foxon, M. J., & Koszalka, T. A. (2008). Evaluator competencies: Standards for the practice of evaluation in organizations.  San Francisco, CA: Jossey-Bass.
  • Wilcox, Y., & King, J. A. (2014). A professional grounding and history of the development and formal use of evaluator competencies. Canadian Journal of Program Evaluation, 28(3), 1-28.
  • Buchanan, H., & Kuji-Shikatani, K. (2014). Evaluator competencies: The Canadian experience. Canadian Journal of Program Evaluation, 28(3), 29-47.

Hot Tip: See you at #eval17 where we hope to unveil the final draft competencies!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi! This is Sheena Horton again, President-Elect and Board Member for the Southeast Evaluation Association (SEA). I wanted to close out SEA’s AEA365 week by providing you with a few tips for stimulating and maintaining your professional growth. Just as evaluation is an everyday activity, our own professional growth should be approached as an everyday opportunity. Isaac Asimov once said, “People think of education as something they can finish.” Learning is a life-long commitment.

The extent to which we seek growth opportunities should not be limited by our current positions, schedules, finances, networks, or fears and hesitations, but rather defined by the depth of our intellectual curiosity, aspirations, and commitment to evaluating and bettering ourselves.

Hot Tips:

  • Search YouTube regularly for quick tips or full lessons to develop your knowledge or skills in a specific area, such as in Excel. There are also many free virtual courses and trainings offered at Coursera, edX, MIT OpenCourseWare, FindLectures, and Udemy.
  • Follow the professional development strategy that George Grob suggested at a past AEA Conference: Every year, pick one hard skill and one soft skill to develop over the course of the year.
  • Choose a few bloggers to follow to pick up daily tips and stay up to date on the latest evaluation news. Take it a step further and volunteer to write for a blog or newsletter! AEA365 blog posts are short and allow you to perform a high-level review of a topic of interest or share your experiences and tips with others. SEA’s seasonal newsletter accepts a variety of submissions on evaluation and professional development topics, and article length can vary from a sidebar to a feature article.
  • Volunteer for AEA and SEA short- or long-term projects, or sign up for programs, conferences, and workshops. AEA’s next conference is scheduled for November 6th-11th, 2017 in Washington, DC. SEA will be holding its 2-day Annual Workshop on February 27th-28th, 2017 in Tallahassee, FL, and, in addition to its normal programming, will offer a secondary track featuring Essential Skills training sessions such as “Evaluation Planning and Design,” “Relating Costs and Results,” and “Effective Presentations.”


The American Evaluation Association is celebrating Southeast Evaluation Association (SEA) Affiliate Week with our colleagues in the SEA Affiliate. The contributions all this week to aea365 come from SEA Affiliate members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello, again! My name is Dr. Michelle Chandrasekhar and I serve on the Southeast Evaluation Association (SEA) Executive Board. Every year in February, SEA holds an Annual Workshop that offers attendees networking opportunities and a variety of presentations and panel discussions on evaluation issues. At SEA’s 2015 Annual Workshop, the Board facilitated a round table discussion and asked participants to discuss common challenges encountered in conducting evaluations.

Hot Tips: Below is a summary of the observations and tips from SEA’s Workshop round table discussions. Overall, attendees indicated that evaluators need to know how to do the following:

1. Talk about Evaluation.
  • Build buy-in and rapport – for example, use stories to explain numbers.
  • Create or find case studies or examples that help evaluators talk to others.
  • Communicate the value of evaluation to leadership.
  • Manage the politics – particularly in how data are presented and how sensitive data are analyzed.

2. Plan for Good Evaluation.

  • Demonstrate cultural competence – this means going beyond language barriers.
  • Develop good logic models and get them validated up front.
  • Establish relationships among key people in the client’s organization, as well as among fellow evaluators who can help you problem solve.
  • Include front-line people in the conversation to find problems and solutions, or to review reports.
  • Make recommendations that use Return on Investment concepts.
  • Work within the confines of a grant rather than what the evaluator or client may want to do.
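As a quick illustration of the Return on Investment arithmetic behind the recommendation tip above, here is a minimal sketch in Python. All figures are hypothetical, chosen only to show the calculation; a real ROI recommendation would rest on the program's actual monetized costs and benefits.

```python
def roi(benefit, cost):
    """Return on Investment as a percentage: (benefit - cost) / cost * 100."""
    return (benefit - cost) / cost * 100

# Hypothetical figures for illustration only
program_cost = 50_000     # what the program (or intervention) cost
program_benefit = 65_000  # monetized benefit attributed to the program

print(f"ROI: {roi(program_benefit, program_cost):.1f}%")  # ROI: 30.0%
```

Framing a recommendation this way ("every dollar invested returned $1.30") can make findings concrete for leadership, though monetizing benefits is often the hard part.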

3. Manage Evaluations.

  • Manage multiple projects in various stages – use project management tools and update items in your toolbox (reports, communication protocols, client capacity building information).
  • Manage time and people to stay on track – understand the amount of effort needed for a project and that it isn’t practical to make it perfect.
  • Work within the budget (estimate the billable hours, time frame, and amount to charge) and include the client in the process.


 


My name is Dr. Moya Alfonso, MSPH. I’m an Associate Professor at the Jiann-Ping Hsu College of Public Health at Georgia Southern University and University Sector Representative and Board Member for the Southeast Evaluation Association (SEA). I would like to offer you a few tips on engaging stakeholders in participatory evaluation based on my 16 years of experience engaging stakeholders in community health research and evaluation.

Participatory evaluation is an approach that engages stakeholders in each step of the process. Rather than the trained evaluator solely directing the evaluation, participatory evaluation requires a collaborative approach. Evaluators work alongside stakeholders in developing research questions, deciding upon an evaluation design, designing instruments, selecting methods, gathering and analyzing data, and disseminating results. Participatory evaluation results in stronger evaluation designs and greater external validity because community members have a high level of input in the entire process. It also strengthens buy-in to the results and promotes greater use of the evaluation products.

Rad Resource: Explore the University of Kansas Community Tool Box for introductory information on participatory evaluation.

Hot Tips: Here are a few tips for engaging stakeholders:

  • Establish a diverse stakeholder advisory group: Community stakeholders have a range of skills that can contribute to the evaluation process. For example, I worked with 8th grade youth on a participatory research project and assumed that I would need to conduct the statistical analysis of survey data.  To my surprise, one of the youths had considerable expertise and was able to conduct the analysis with little assistance. With training and support, community stakeholders can contribute and exceed your expectations.
  • Keep stakeholders busy: A common problem in working with advisory groups is attrition. Keep community stakeholders engaged with evaluation tasks that use their unique skill sets. Matching assignments to existing skill sets empowers community stakeholders and results in increased buy-in and engagement.
  • Celebrate successes: Celebrating successes over the course of the evaluation is a proven strategy for keeping stakeholders engaged. Rather than waiting until the end of the evaluation, reward stakeholders regularly for the completion of evaluation steps.
  • Keep your ego in check: Some highly trained evaluators might find handing over the reins to community stakeholders challenging because they’re used to running the show. Participatory evaluation requires evaluators to share control and collaborate with community stakeholders. Try to keep an open mind and trust in the abilities of community stakeholders to participate in the evaluation process with your support and guidance.  You’ll be amazed at what you can achieve when stakeholders are fully engaged in evaluation research! 



Hi, we’re Southeast Evaluation Association (SEA) members Taylor Ellis, a doctoral student and lead evaluator, and Dr. Debra Nelson-Gardell, an Associate Professor, providing consultation at the School of Social Work at The University of Alabama. We form a team tasked with evaluating a program providing community-based, family-inclusive intervention for youth with sexual behavior problems (youngsters whom lay people might call juvenile sex offenders). This post focuses on our lessons learned regarding our approach to resistance in program evaluation.

Taut and Alkin (2002) reported that people stereotypically view program evaluation as “being judged…that the evaluation is used to ‘get me’, that it is not going to be used to assist me but is perceived to be negative and punitive in its nature” (p. 43). Our program evaluation faced derailment because the program had never been evaluated before, or perhaps because of the inevitability of resistance to evaluation. Accepting the resistance as normal, we tried addressing it. But our efforts didn’t work as we had hoped. Below are the hard lessons learned through “hard knocks.”

Lessons Learned:

  • The Importance of Stakeholder Input: Stakeholders need to believe evaluators will listen to them. Early in the evaluation process, stakeholders were interviewed and asked about their ideas for program improvement to promote engagement in the process. What the interviews lacked was a greater emphasis on showing stakeholders how their input affected the evaluation.
  • Remember and (Emphatically) Remind Stakeholders of the Evaluation’s Purpose/Goals: During the evaluation, the purpose of the evaluation was lost in that stakeholders were not reminded of the evaluation’s purpose. Project updates to stakeholders should have been more intentional about movement towards the purpose. We lost sight of the forest as we negotiated the trees. This lack of constant visioning led to many stakeholders viewing the evaluation implementation as an unnecessary hassle.
  • The Illusion of Control: Easily said, not easily done: Don’t (always) take it personally. Despite our efforts, a great deal of resistance, pushback, and dissatisfaction remained. After weeks of feeling at fault, we found out that things were happening behind the scenes over which we had no control, but that directly affected the evaluation.

Knowing these lessons earlier could have made a difference, and we intend to find out.  Our biggest lesson learned:  Resist being discouraged by (likely inevitable) resistance, try to learn from it, and know that you are not alone.


 

Hi all! My name is Sheena Horton, President-Elect and Board Member for the Southeast Evaluation Association (SEA). As I have been learning more about the traits of great leaders and how leaders mobilize others, I have found one element that is frequently mentioned: a leader’s influence.

Influence may seem like an obvious determinant of a leader’s success; you’re not a leader if no one will follow you. Think about a colleague for whom you would work hard without hesitation, and then think about a colleague for whom you would not. Why do you want to help the first colleague, but avoid the second? What makes some leaders more effective than others? How do leaders influence others?

Hot Tips:

  • Ask. Show interest in your colleagues. Ask about their day, goals, and challenges. Build rapport and be people-focused instead of task-focused. Understanding their needs will help you convey to them the benefits of listening to you.
  • Listen. Effective leaders take the time to listen. There is a difference between leading and simply managing. Managers command action while leaders inspire it. Leading is to be focused on others – not yourself.
  • Visualize the other side. Try to understand the other person’s perspective and motivations. By doing so, you will be in a better position to address their concerns, tap into their motivations, and utilize their strengths and interests to build a more effective and mutually beneficial working relationship.
  • Be proactive. Identify, monitor, and manage risks to your team’s success. Ask your team what they need to complete their tasks, and make sure they have what they need to get things done. Address issues quickly and directly.
  • Build credibility through your actions. Consistency is key; unpredictability weakens your ability to influence and lead. Build trust and credibility by following through on what you say. Be the person that others seek out for solutions. Provide reasons for the actions you want taken.
  • Show appreciation. A simple “thank you” or “good job” can go a long way. Express your interest and investment in your team’s growth and success by providing constructive feedback. This feedback provides valuable insight, builds trust, and is an opportunity to motivate. Be supportive by mentoring or providing training or assistance.

Remember: Leadership is not about you. It’s about them. Leadership is about influencing others so they will want to help you.





I hope your new year’s celebrations were filled with laughter and rest – getting you ready for another year of projects and adventures!  

Last year I offered some ideas for when creative block strikes, as well as some ideas for generating meaningful online content (other than a blog post). I thought I would revisit those ideas, especially for folks interested in contributing to this blog in the coming year.

Rad Resources: As the saying goes, sometimes images speak volumes more than text. There are some beautiful free stock photo sites out there, and even more free and user-friendly design sites. Can you convey some of your information via an infographic or graph? This may free up some space for you to dive deeper into a concept or offer background on a project. Images also help create white space (a good thing!) and a more readable screen.

Hot Tip: For those brave souls, try getting in front of a camera!  Vlogs (or video blogs) are a fantastic way to share your knowledge and expertise with readers or followers.  Videos don’t have to be long and can include visual aids and graphics to make them even more appealing.  There are a number of affordable video editing apps – I’ve used iMovie for personal projects and it could not be easier to use.  Videos can be hosted on sites like YouTube or Vimeo and then embedded in blog posts or on websites.  

Lesson Learned: Did you (or will you) host a Twitter chat or hashtag campaign?  Share your insights without having to revisit every tweet using curating tools like Storify.  You can pull together the highlights and evolution of an online conversation, offering you a chance to have a reference point for synthesis and historical perspectives.  

Creating engaging content is not all about getting more page views or Likes or Retweets (although that’s a part) – it’s also about getting out of your comfort zone in order to share your perspective with the world.  People learn and absorb information in so many ways.  Sometimes reading an evaluation report isn’t feasible, but listening to or watching you talk about the project is!  Different types of content connect with different types of people.   

How have you experimented with different media?  Or do you have a goal this year to try something new?



Hello Loyal aea365 readers! I’m Sheila B Robinson, aea365’s Lead Curator and sometimes Saturday contributor with one question for you: What is it that YOU would like to read about on this blog?

I first posted this blog article two years ago and we received some excellent responses from readers. I then shared these responses in a subsequent post and we received blog articles on some of the suggested topics from authors willing to answer the call. Here we go again with some minor updates to reflect the times:

Lesson Learned: AEA365 has been going steadily since January 1, 2010 with 2500+ contributions (Wow!) from hundreds of evaluators across the globe. We accept individual submissions at aea365@eval.org on a rolling basis, along with inquiries about sponsored or themed weeks. Posts are about any and all evaluation-related topics, and anyone with something to share with fellow evaluators is welcome to contribute! If you are interested in sharing a tip, please be sure to check out our *updated* contribution guidelines here.

As a key learning tool for evaluation, aea365 can also be a fabulous vehicle for promoting evaluation and evidence-based policy. With that in mind, we would like to include your voice as we head into the new year as our aea365 team considers inviting authors and groups to contribute.

Hot Tip: Let’s crowdsource some ideas for aea365 in 2017 and make it the best year ever.

Please let us know what you would like to see in aea365 by responding to these questions in the comments:

1. What do YOU want to read or learn more about on aea365 in 2017?

2. Who do YOU want to hear from on this blog?

Cool Trick: Did you notice the flipped headline today? With more of our subscribers reading blog articles from smaller devices, we have come to realize (and heard from readers!) that people who scroll through multiple emails a day want to see the content first, before the authors’ names, especially for articles with multiple authors. With that in mind, starting today, you will see the title of the article followed by the authors’ names.

Thanks very much for your input and your loyal readership. Happy New Year!




We are the 2016 AEA Minority Serving Institutions (MSI) Fellows: Cirecie West-Olatunji (counselor education-Xavier University of Louisiana), Jeiru Bai (social work-University of Nebraska at Omaha), Kate Cartwright (health administration-University of New Mexico), Smita Shukla Mehta (special education-University of North Texas), and Chandra Story (public health-Oklahoma State University).

It seems that it was only a few weeks ago that we shared our biographical and personal goal statements and listened to our evaluation mentor, Art Hernandez, share a (long!) list of reading resources. Hailing from diverse academic disciplines, we wondered how we would integrate seemingly disparate ideas and philosophies to jointly construct a presentation for the annual conference. After 12 months of biweekly telephone conference calls, the week-long AEA Summer Institute, a joint AEA conference presentation, and life changes (e.g., Smita was promoted to Full Professor rank and Jieru had a beautiful 7 lb, 8 oz. baby boy), we now share key lessons learned from our multidisciplinary thinking.


Lessons Learned:

#1: Set Aside Time to Read

We are often too busy to set aside time for reading, reflection, and dialogue with others. Being involved in this fellowship, I found it critical to schedule time to acquire knowledge that I could integrate into my existing skill set.

#2: Evaluation can be Creative

Prior to this fellowship, I thought that data collection methods for culturally responsive evaluation were limited. My learning experiences through the AEA conference and the summer institute have changed my paradigm! There are many creative approaches to evaluation, including ripple effects mapping.  These approaches provide proper context for evaluation while honoring communities.

#3: Transcend Disciplinary Boundaries

As a relatively new evaluator, I learned to always remember that evaluation theory and practice transcend disciplinary boundaries. When planning an evaluation, I now look beyond practices in any one discipline. A good starting place is the AEA website!

#4: Distinguish Research Methods from Program Evaluation

While I acknowledged a difference between research methods and program evaluation, the distinction became clearer after the summer institute and AEA conference. Evaluation design requires much more technical skill in mixed-methods data collection and analysis. Conducting an evaluation also requires social skills (e.g., trust, compassion, connection, communication, facilitation) to connect with stakeholders.

We are grateful to the AEA community for creating the MSI Fellowship program. Thanks to you, we can continue crystallizing our evaluation identity and competence.

The American Evaluation Association is celebrating AEA Minority Serving Institution (MSI) Fellowship Experience week. The contributions all this week to aea365 come from AEA’s MSI Fellows. For more information on the MSI fellowship, see this webpage: http://www.eval.org/p/cm/ld/fid=230. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Chandra Story. I was a 2016 AEA Minority Serving Institution Fellow and am a faculty member at Oklahoma State University in Health Education and Promotion/Public Health. My background includes project management and evaluation for federally funded projects, along with non-profit organizations. I have had the opportunity to partner with amazing community members across the country to describe and define what evidence means. The purpose of this blog is to explore and share a few tips on evidence based practice and practice based evidence.

A few definitions and thoughts:

Evidence based practice (EBP) is considered the foundation of public health practice. As a scholar, I am aware of the importance of evidence as a framework. However, as a culturally responsive evaluator, I need to allow community members to compare current EBP with their culture and definitions of health. By engaging community members, we are adding to the evidence base.

Practice based evidence (PBE) is the result of meaningful partnerships between academia and communities to identify and develop appropriate evaluation strategies. Due to cultural nuances, evidence may be defined in different ways. For example, increases in self-esteem among youth due to participation in cultural practices can be considered evidence of program success in some communities.

Hot Tip:

In closing, I feel that both EBP and PBE are needed for effective and culturally responsive evaluation.  As an evaluator, I am responsible for investigating how success is defined by the community. With the right conversations, evaluators and communities can partner for better health outcomes.

