AEA365 | A Tip-a-Day by and for Evaluators

Hi, we’re Osman Özturgut, assistant professor, University of the Incarnate Word and Cindy Crusto, associate professor, Yale University School of Medicine. We are members of the AEA Public Statement on Cultural Competence in Evaluation Dissemination Working Group. We are writing to inform you of our forthcoming professional development workshop at Evaluation 2014 in Denver. Since the last meeting in Washington, we have been “learning, unlearning, and relearning” with several groups and workshop participants with respect to cultural competence. We wanted to share some of our learning experiences.

Our workshop is entitled "Acknowledging the ‘Self’ in Developing Cultural Competency." We developed the workshop to highlight key concepts from the AEA Public Statement that focus on the evaluator hirself and what the evaluator hirself can do to better engage in work across cultures. As the AEA's Public Statement explains, "Cultural competence requires awareness of self, reflection on one’s own cultural position." Cultural competence begins with awareness and an understanding of one's own viewpoints (learning). Once we become aware of, reflect on, and critically analyze our existing knowledge and viewpoints, we may need to reevaluate some of our assumptions (unlearning). Only then can we reformulate our knowledge to accommodate and adapt to new situations (relearning). This process of learning, unlearning, and relearning is the foundation of becoming a more culturally competent evaluator.

We learned that evaluators really want a safe place to talk about culture, human diversity, and issues of equity. In our session, we provide this safe place and allow for learning. Participants can explore their "half-baked ideas," as one of our previous workshop participants put it. This is the idea that we don't always have the right words or fully formulated thoughts regarding issues of culture, diversity, and inclusion. We believe it is crucial to provide a safe place to share ideas, even if they are "half-baked."

Lessons Learned: We learned that the use of humor is critically important when discussing sensitive topics and communicating across cultures. It reduces anxiety and tension.

Providing a safe place for discussion is crucial, especially with audiences with diverse cultural backgrounds and viewpoints. Be open to unlearning and relearning – Remember, culture is fluid and there is always room for improvement. Get out of your comfort zone to realize the “self”.

Rad Resource: AEA’s Public Statement on Cultural Competence in Evaluation

Also, see Dunaway, Morrow & Porter’s (2012) Development and validation of the cultural competence of program evaluators (CCPE) self-report scale.

Want to learn more? Register for Acknowledging the “Self” in Developing Cultural Competency at Evaluation 2014.

This week, we’re featuring posts by people who will be presenting Professional Development workshops at Evaluation 2014 in Denver, CO. Click here for a complete listing of Professional Development workshops offered at Evaluation 2014. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi! We are M. H. Clark and Haiyan Bai from the University of Central Florida in Orlando, Florida. Over the last several years, propensity score adjustments (PSAs) have become increasingly popular; however, many evaluators are unsure of when to use them. A propensity score is the predicted probability that a participant selects into a treatment program, based on several covariates. These scores are used to make statistical adjustments (e.g., matching, weighting, or stratification) to data from quasi-experiments in order to reduce selection bias.
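To make the idea concrete, here is a minimal sketch of estimating propensity scores with logistic regression and making one simple adjustment (nearest-neighbor matching). It is not the authors' method, just an illustration; it assumes Python with pandas and scikit-learn, and the file name and column names are hypothetical.

```python
# Minimal sketch (not the authors' method): estimating propensity scores
# with logistic regression. File and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("quasi_experiment.csv")            # hypothetical data set
covariates = ["age", "pretest_score", "income"]     # hypothetical covariates

# Predict the probability of selecting into treatment from the covariates.
model = LogisticRegression(max_iter=1000).fit(df[covariates], df["treated"])
df["pscore"] = model.predict_proba(df[covariates])[:, 1]

# One simple adjustment: match each treated case to the control case with
# the nearest propensity score (matching with replacement, no caliper).
treated = df[df["treated"] == 1].sort_values("pscore")
control = df[df["treated"] == 0].sort_values("pscore")
matched = pd.merge_asof(treated, control, on="pscore",
                        direction="nearest", suffixes=("_t", "_c"))
```

Dedicated tools (for example, the MatchIt package in R) implement matching, weighting, and stratification with far more care than this sketch.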

Lesson Learned:

PSAs are not the magic bullet we had hoped they would be. Never underestimate the importance of a good design. Many researchers assume that they can fix poor designs with statistical adjustments (either with individual covariates or propensity scores). However, if you are able to randomly assign participants to treatment conditions or test several variations of your intervention, try that first. Propensity scores are meant to reduce selection bias due to non-random assignment, but can only do so much.

Hot Tip:

Plan ahead! If you know that you cannot randomly assign participants to conditions and you MUST use a quasi-experiment with propensity score adjustments, be sure to measure covariates (individual characteristics) that are related to both the dependent variable and treatment choice. Ideally, you want to include in your propensity score model all variables that may contribute to selection bias. Many evaluators consider propensity score adjustments only after they have collected data and can no longer account for some critical factors that cause selection bias. In that case, treatment effects may remain biased even after PSAs are applied.

Hot Tip:

Consider whether you need propensity scores to make your adjustments. If participants did not self-select into a treatment program, but were placed there because they met a certain criterion (e.g., a test score above the 80th percentile), a traditional analysis of covariance used with a regression discontinuity design may be more efficient than PSAs. Likewise, if your participants are randomly assigned by pre-existing groups (such as classrooms), a mixed-model analysis of variance might be preferable. On the other hand, sometimes random assignment does not achieve its goal of balancing all covariates between groups. If you find that some of your covariates (e.g., average age) differ between treatment conditions even after randomly assigning your participants, PSAs may be a useful way of achieving the balance that random assignment failed to provide.
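One common way to check whether a covariate is balanced across conditions, and therefore whether a PSA is even needed, is the standardized mean difference. Below is a rough sketch; the column names and the 0.1 rule-of-thumb threshold are assumptions for illustration, not part of the original post.

```python
# Rough sketch: standardized mean difference (SMD) as a covariate balance
# check. Column names are hypothetical; |SMD| > 0.1 is a common (but not
# universal) rule of thumb for flagging imbalance between groups.
import numpy as np
import pandas as pd

def smd(df: pd.DataFrame, covariate: str, group_col: str = "treated") -> float:
    treated = df.loc[df[group_col] == 1, covariate]
    control = df.loc[df[group_col] == 0, covariate]
    pooled_sd = np.sqrt((treated.var(ddof=1) + control.var(ddof=1)) / 2)
    return (treated.mean() - control.mean()) / pooled_sd

# Example: list covariates that remain imbalanced after assignment.
# imbalanced = [c for c in ["age", "pretest_score"] if abs(smd(df, c)) > 0.1]
```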

Rad Resource:

William Holmes recently published a great introduction to using propensity scores, and Haiyan Bai and Wei Pan have a book that will be published next year.

Want to learn more? Register for Propensity Score Matching: Theories and Applications at Evaluation 2014.

This week, we’re featuring posts by people who will be presenting Professional Development workshops at Evaluation 2014 in Denver, CO. Click here for a complete listing of Professional Development workshops offered at Evaluation 2014. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

Hello, my name is Dan McDonnell and I am a Community Manager at the American Evaluation Association (AEA). If you’re a frequent Twitter user, you’re probably familiar with Twitter’s ‘Who to Follow’ feature – a widget in the right sidebar that ‘suggests’ Twitter users for you to follow, based on your profile and following list. If you’re like me, you use this feature often, sometimes feel as if you’ve exhausted the suggestions Twitter provides, and are interested in digging a bit deeper. Enter: Followerwonk!

Followerwonk

Hot Tip: Search Twitter Bios

For starters, Followerwonk offers a robust Twitter bio/profile search feature. When you search a keyword like ‘evaluation’, Followerwonk will return a full list of results with several sortable criteria: social authority, followers, following, and account age. The really cool part, however, is the Filters option. You can narrow the results to only people with whom you have a relationship (they follow you or you follow them), to reciprocal followers, or to only those with whom you are not currently connected, which is a great way to find interesting new people to follow.

Hot Tip: Learn More About Your Followers

Using the ‘Analyze Followers’ tab, you can search for a Twitter handle and find some really interesting details about your network of followers (or the folks you follow). Like Twitonomy, Followerwonk will map out the location of your followers and the hours they are most actively Tweeting (great for identifying optimal times to post!). In addition, you’ll see demographic details, Tweet frequency information, and even a nifty word cloud of the most frequently Tweeted keywords.

Hot Tip: Compare Followers/Following

Now here’s where Followerwonk really shines. Let’s say I want to see how many followers of @aeaweb also follow my personal Twitter account, @Dan_McD. Or maybe you’re a data visualization geek, and want to see what accounts both Stephanie Evergreen (@evalu8r) and AEA (@aeaweb) are following to find some new, interesting Twitter users to follow. The Compare Users tab allows you to see what followers certain accounts have in common and add them to your network!

Using Followerwonk can give you a better overall view of your Twitter community, whether it be identifying interesting connections between your followers or surfacing new users to follow by comparing followers of those you trust. Many of the features of Followerwonk (including some I didn’t cover today) are available for free – and for those that aren’t, a 30-day free trial is all you need. What are you waiting for?

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Dan McDonnell on Google+ | Dan McDonnell on Twitter

 


I’m Alice Walters, a member of AEA’s Graduate Student and New Evaluator TIG. I am a doctoral student in human services and work as a non-profit consultant in fund development, marketing, and evaluation. Here, I explore potential pitfalls and share recommendations for new evaluators, based on my experience working with stakeholders.

Hot Tip 1:  Stakeholders are central to evaluation – include them in every step of the process.

This may be Evaluation 101, but it bears emphasizing. Identify, include, and inform stakeholders. Think carefully and critically about all parties involved in evaluation outcomes. Leaving out key stakeholders can lead to a poor-quality evaluation that misses important perspectives. Key decision-making stakeholders should be engaged throughout the evaluation process to keep the evaluation relevant.

Rad Resource: Engaging Stakeholders. This CDC guide has a worksheet for identifying and including stakeholders in evaluation.

Hot Tip 2: Be proactive in frequent and ongoing communication with stakeholders.

Don’t assume that the perspectives shared in initial evaluation conversations remain unchanged. Frequent communication with stakeholders will alert you to any changes in their perspectives toward the evaluation. Ongoing communication also keeps the lines of communication open and informs stakeholders of evaluation progress.

Rad Resource: A Practical Guide for Engaging Stakeholders in Developing Evaluation Questions. This 48-page resource from the Robert Wood Johnson Foundation covers engaging stakeholders throughout the evaluation process. It provides worksheets and a range of useful communication strategies.

Hot Tip 3: Take the time to consider stakeholders’ views at every stage of the evaluation.

Stakeholders may be unclear about the evaluation process, its steps, and the methods used. Be sure to explain and continue to inform at every stage of the evaluation. As a new evaluator, I made the faulty assumption that stakeholder views would not change after the initial evaluation meetings. I also missed opportunities to communicate during later evaluation stages, opportunities that might have revealed changing circumstances through stakeholder responses. Evaluators should be cautious about assuming that evaluation environments and stakeholder views are static.

Rad Resource: Who Wants to Know? A 4-page tip sheet from Wilder Research on stakeholder involvement. Evaluators’ expertise sometimes requires working at a distance from direct stakeholder contact, particularly with key decision-making stakeholders. Yet the relevance of an evaluation depends on ongoing stakeholder input, and successful evaluation requires keeping communication channels open with stakeholders.

AEA is celebrating GSNE Week with our colleagues in the Graduate Student and New Evaluators AEA Topical Interest Group. The contributions all this week to aea365 come from our GSNE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Greetings, I am Laura Pryor. In addition to being a GEDI alumna, I am a student at UC Berkeley’s Graduate School of Education in the Quantitative Methods and Evaluation program. As part of my graduate evaluation work, I have been exploring the recent trend of using multiple measures to evaluate teachers. As part of this trend, many policymakers and district leaders are combining multiple measures into a summative composite score, often for high-stakes decision making (such as decisions about salary and personnel).

As a graduate student evaluator, I have been exploring two questions:
1) Is it necessary and/or purposeful to create a composite score?
2) If so, how should an evaluator combine multiple measures into a single composite score?

I hope this post provides insight into these questions so that evaluators can more easily navigate the increasingly popular context of high-stakes teacher evaluations.

Hot Tip 1: The purpose of the evaluation should determine whether a composite score is needed. While it may be a current trend, not all multiple-measure evaluation systems are used for personnel or salary decisions. For many districts and schools, the evaluation system is used to help teachers and staff identify areas for improvement; in this case, a composite score is not always necessary. If the evaluation system is intended for multiple purposes, prioritize those purposes with stakeholders and discuss whether it is feasible for the system to serve multiple uses.

Hot Tip 2: If creating a composite score, select a model that is most appropriate for the evaluation:
a. The conjunctive approach: A pass/fail score is given; individuals must score at a specified passing level on every measure.
b. The disjunctive approach: A pass/fail score is given; individuals are only required to score at a passing level on one of the measures.
c. The compensatory approach: Individuals are given a continuum of scores; low scores on certain measures are compensated for by high scores on other measures.

Hot Tip 3: When using a compensatory approach, decide how to combine the measures:
a. Clinically: Evaluation stakeholders decide how to weight each measure; this is often called the ‘eyeballing’ approach.
b. Statistically: Select a criterion target and use regression methods to determine the weights for each measure empirically; this approach is generally considered more accurate than the clinical approach. (A small sketch of these approaches follows below.)
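To illustrate the difference between the approaches above, here is a small sketch in Python. The measure names, cut scores, clinical weights, and the criterion used for the regression weights are all hypothetical, and this is only one way such composites might be computed.

```python
# Small sketch of conjunctive, disjunctive, and compensatory composites.
# Measure names, cut scores, weights, and the criterion are hypothetical.
import numpy as np
import pandas as pd

measures = ["observation", "student_growth", "survey"]
cuts = {"observation": 2.5, "student_growth": 40, "survey": 3.0}
clinical_weights = {"observation": 0.5, "student_growth": 0.3, "survey": 0.2}

def conjunctive(row) -> bool:
    # Pass only if every measure meets its cut score.
    return all(row[m] >= cuts[m] for m in measures)

def disjunctive(row) -> bool:
    # Pass if any single measure meets its cut score.
    return any(row[m] >= cuts[m] for m in measures)

def compensatory_clinical(row) -> float:
    # Weighted sum: strength on one measure can offset weakness on another.
    return sum(clinical_weights[m] * row[m] for m in measures)

def statistical_weights(df: pd.DataFrame, criterion: str) -> np.ndarray:
    # Regress a criterion target on the measures; the fitted coefficients
    # become empirically derived weights for a compensatory composite.
    X = np.column_stack([np.ones(len(df))] + [df[m].to_numpy() for m in measures])
    coefs, *_ = np.linalg.lstsq(X, df[criterion].to_numpy(), rcond=None)
    return coefs[1:]  # drop the intercept; one weight per measure

# Usage (hypothetical data frame of teacher scores):
# df["passes"] = df.apply(conjunctive, axis=1)
# df["composite"] = df.apply(compensatory_clinical, axis=1)
```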

Rad Resources:

AEA is celebrating GSNE Week with our colleagues in the Graduate Student and New Evaluators AEA Topical Interest Group. The contributions all this week to aea365 come from our GSNE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello, evalusphere! I am Jenna LaChenaye from the University of Alabama at Birmingham. As an evaluation practitioner transitioning into the world of academia, I have found myself at the epicenter of the learning curve that separates these two vital yet divergent arenas of evaluation. As an evaluator and lover of social research inquiry, I reveled in the pursuit of solving real-world issues, completing utilization-focused reporting and training, and moving on to the next challenging project. My goal was (and continues to be) to complete rigorous and professional work that addresses local issues through the tools of evaluation. I prized spending time on activities that I deemed immediately and visibly impactful.

Transitioning into the world of academia, however, has put me in a position of re-socialization. I must not only continue to produce useful work that is rooted in real problems, but must also generate products that build on the academic community’s current work and the university/department’s mission (which can often seem like two very different conversations). However, academia provides many benefits that I did not find as an independent evaluator, such as access to immense resources, funding, and an impressive community of practice. Furthermore, I have come to see the evaluator-to-academic role as even more of a service to our profession because of the value of bringing practical experience and a focus on action into the academic sphere.

Hot Tip 1: Evaluation is often misunderstood by more traditional faculty. Share your knowledge of evaluation and you will often find colleagues who have a need for your action-based skill set.

Hot Tip 2: Many universities have centers that conduct evaluation work for the school and community. Seek out and connect with these groups as a way to seamlessly transition to the academic world.

Hot Tip 3: Many universities offer mentoring and development programs. Contact your faculty development center and/or department for more information.

Hot Tip 4: Academia and the next generation of scholars can benefit immensely from your knowledge and experience. If you work strictly as a practitioner, consider teaching an online or adjunct course.

Lessons Learned:

  • Like any other shift in work, moving to academia comes with a learning curve as you re-socialize into the role.
  • Academia is more a translation of practitioner evaluation work than the radically divergent jump it may seem to be.
  • Colleagues are more than happy to provide support if asked.

Rad Resources:

  • Tanya Golash-Boza, Ph.D. writes a clear, simple blog on navigating the academic world and maintaining a work/life balance; it's a great resource for those of us who want a jump start
  • Translating evaluation reporting to a journal format can be tough. Search Eval.org for resources addressing this transition

AEA is celebrating GSNE Week with our colleagues in the Graduate Student and New Evaluators AEA Topical Interest Group. The contributions all this week to aea365 come from our GSNE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings! We are Kate Westaby and Valerie Moody, new evaluators from two Clinical and Translational Science Award (CTSA) institutes. Kate is an Evaluation Research Specialist at the University of Wisconsin-Madison Institute for Clinical and Translational Research and Valerie is the Evaluation Coordinator at the University of Iowa Institute for Clinical and Translational Science. At the 62 CTSA institutes nationwide, program evaluation operates in a complex, dynamic, and unpredictable environment; it is mandated by NIH but implemented in a wide variety of ways by evaluators with diverse backgrounds.

Having worked to adapt to these complicated surroundings ourselves, we wanted to know whether there were best practices for new evaluators to orient themselves to their workplaces. Last year, we interviewed 16 new evaluators from 14 CTSA institutes to gather the most helpful strategies for learning about evaluation, thus allowing new evaluators to hit the ground running.

“I felt it was like putting together a 1,000-piece puzzle, but nobody gave you the cover.” — quote from a new CTSA evaluator.

Hot Tip 1: Learn the history of evaluation efforts at your workplace. New evaluators found this to be the most helpful strategy. Many suggested using programmatic documents (e.g., grant proposals, strategic goal documents) to find useful historical information. They were better able to understand evaluation needs and to review progress toward those needs in a short period of time.

Hot Tip 2: Attend face-to-face meetings (or a conference) with evaluators who are doing similar work. This setting allowed new evaluators to hear what strategies others are using, what their struggles have been, and how they turned their struggles into successes. It also allowed them to establish face-to-face networks for future communication.

Hot Tip 3: Ask questions! Supervisors or colleagues can provide insight into program history, politics, and help you avoid reinventing the wheel. Don’t be afraid to speak up!

Rad Resource: For more tips on how to get comfortable in your new workplace or to look into which strategies were least helpful to our interviewees, check out our AEA 2013 poster below (or download a larger version from AEA’s public elibrary here).

GSNE poster

AEA is celebrating GSNE Week with our colleagues in the Graduate Student and New Evaluators AEA Topical Interest Group. The contributions all this week to aea365 come from our GSNE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m Alice Walters, a member of AEA’s Graduate Student and New Evaluator TIG.  I am a doctoral student in human services and work as a non-profit consultant in fund development, marketing, and evaluation.  I share some networking tips, below.

Networking is needed at every career stage.  Review tips and resources to increase your effectiveness.  Enjoy using your networking skills as both art and science to see what serendipitous outcomes transpire!

Hot Tip 1:  Networking is developing informal connections with other professionals.

Building informal connections can occur any time you meet other professionals.  Don’t exclude those outside your usual networks who can be a source of unexpected developments.

Rad Resource: Developing a Strong Professional Network” by the Penn State Alumni Association 

Hot Tip 2: Networking is about more than just a job hunt. Networking is often associated with job-hunting success, but it can be much more than that. Networking can lead you down new avenues, help you develop new collaborations, and bring attention to your work in new venues.

Rad Resources: “Tips for Successful Business Networking” and “10 Advantages of Business Networking” by Susan M. Heathfield

Hot Tip 3:  Networking is not really an “activity,” it is a lifestyle. Networking is not an isolated activity you add to your calendar.  Instead, it is really a process, approach, and outlook on professional relationships.

Rad Resource: “Cheat Sheet: 9 Professional Networking Tips” by Jillian Kurvers

Hot Tip 4: Networking for the shy is easier when you don’t think of it as “networking.” Even the most outgoing people can struggle with the pressure to force a professional connection. Instead, explore relationships by asking questions that occur naturally to you.

Rad Resource: “How to Network: 12 Tips for Shy People” by Meridith Levinson

Hot Tip 5:  Networking is an art.  It’s creative, flexible, and individualistic. Use your strengths to network.  Just as art appeals differently to individuals, networking can accommodate a variety of styles.

Hot Tip 6: Networking is a science. It deserves study and analysis. Networking is thoughtful: it seeks to connect seemingly random dots and requires analyzing the information you gather. It’s not an oxymoron to look for serendipity. Serendipity is finding something valuable that you were not seeking; still, if you are looking for connections and value, you will be more likely to find them.

Hot Tip 7: AEA is a great resource for networking. AEA is the hub for evaluation professionals. The AEA Topical Interest Groups, conferences, and local affiliates are a great place to start. On the AEA home page (third tab to the right), go to Read > Links of Interest > Professional Groups: http://www.eval.org/p/cm/ld/fid=69

AEA is celebrating GSNE Week with our colleagues in the Graduate Student and New Evaluators AEA Topical Interest Group. The contributions all this week to aea365 come from our GSNE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello, I am Ayesha Tillman, and I have all but deposited my dissertation for a Ph.D. in Educational Psychology with an evaluation specialization from the University of Illinois at Urbana-Champaign (Illinois). Along with Rae Clementz, Sarah Wilkey-Gordon, Tiffany Smith, Pat Barlow, and Nora Gannon-Slater, I am a mentor in the Graduate Student and New Evaluator (GSNE) TIG Peer-Mentorship program. I have five mentees located in Louisiana, California, Michigan, Texas, and the Dominican Republic. So far, my participation as a mentor has been an incredibly rewarding and worthwhile endeavor.

Lessons Learned: All of my mentees joined the GSNE TIG peer-mentoring program because they were looking for someone to bounce ideas off of, share experiences with, and get tips and advice from. Below is advice I have shared.

Hot Tip 1: Presenting. AEA, the American Educational Research Association (AERA), and the Center for Culturally Responsive Evaluation and Assessment (CREA) host three conferences to which evaluators can submit presentation proposals. If you are uncomfortable submitting a paper, start with roundtable and poster presentations.

Hot Tip 2: Publishing. Publishing in evaluation can be tricky. Evaluation journals (and conferences) generally do not want submissions that simply report the results of an evaluation. Research on evaluation and reflections on evaluation practice are well suited for publication. For example, the American Journal of Evaluation “explores decisions and challenges related to conceptualizing, designing and conducting evaluations.”

Hot Tip 3: Capacity building. The workshops at the AEA conference, the AEA summer evaluation institute, and AEA eStudies are great professional development opportunities for evaluators. The AEA Graduate Education Diversity Internship Program is an awesome opportunity for graduate students of color and from other under-represented groups who would like to extend their research capacities to evaluation.

Rad Resources:

  • GSNE mentorship program mentees. If you are interested in being a mentee, make sure you are a member of the GSNE TIG. You will receive an email once a quarter with the opportunity to become a mentee. If you are interested in being paired with a GSNE mentor, please send an email to Kristin Woods.
  • GSNE mentorship program mentors. If you are interested in being a mentor, you should have been an AEA member for two or more years and have attended at least one annual conference. If you are interested and willing to be a GSNE mentor, please send an email to Kristin Woods.
  • GSNE TIG Facebook page. The GSNE Facebook group is a great place to connect with other graduate student and new evaluators. We share resources, opinions, advice, and network on Facebook.

AEA is celebrating GSNE Week with our colleagues in the Graduate Student and New Evaluators AEA Topical Interest Group. The contributions all this week to aea365 come from our GSNE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Good morning! I’m Sheila B Robinson, aea365’s Lead Curator and sometimes Saturday contributor. I love to share evaluation news and, ever the teacher, I look for opportunities to educate aea365 readers. Today’s lesson is about EvalYear, the International Year of Evaluation. If you haven’t yet heard about this, it’s time to get reading!

In October 2013, at the Third International Conference on National Evaluation Capacities in São Paulo, Brazil, it was announced that 2015 would be the International Year of Evaluation (EvalYear). EvalPartners, the global movement to strengthen national evaluation capacities, is behind the effort, and it’s a big effort! Two leading partners and 47 core partners (of which AEA is one), along with 1,580 evaluators and activists, have joined or expressed interest in the declaration of EvalYear.

“The aim of designating 2015 as the International Year of Evaluation is to advocate and promote evaluation and evidence-based policy making at international, regional, national and local levels.”

Image credit: Hans Watson via Flickr

Lesson Learned: When you visit the EvalYear website and start reading, you will come across no fewer than 15 acronyms! Most are spelled out, but not all of them (or not on every page), so to prepare, have a little taste of alphabet soup:

  • IOCE – International Organization for Cooperation in Evaluation
  • IEG – Independent Evaluation Group
  • OECD/DAC – Organization for Economic Cooperation and Development / Development Assistance Committee
  • VOPE – Voluntary Organization of Professional Evaluators
  • MDGs – Millennium Development Goals
  • SDGs – Sustainable Development Goals
  • UNEG – United Nations Evaluation Group
  • ECG – Evaluation Cooperation Group
  • ALNAP – Active Learning Network for Accountability and Performance 
  • TF – Task Force
  • NECD – National Evaluation Capacity Development
  • QCPR – Quadrennial Comprehensive Policy Review
  • CSO – Civil Society Organization
  • ECD – Evaluation Capacity Development
  • EFGR – Equity Focused and Gender Responsive

“EvalYear will position evaluation in the policy arena, by raising awareness of the importance of embedding monitoring and evaluation systems in the development and implementation of the forthcoming Sustainable Development Goals, and all other critical local contextualized goals, at the international and national levels. EvalYear is about taking mutual responsibility for policies and social action through greater understanding, transparency, and constructive dialogue.”

Hot Tip: Visit EvalYear to learn more and consider how you will get involved!

Rad Resources: Check out the resource center for presentations and updates. The EvalYear logo and brochure are currently available in 18 languages and are being translated into many more.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

