AEA365 | A Tip-a-Day by and for Evaluators


Natalie DeHart here, AEA’s Membership Programs Coordinator. As National Volunteer Week closes, I wanted to take a moment to touch on why volunteering for AEA is so great.

Volunteering allows you, our members, to make meaningful contributions to AEA’s governance, operations, and membership programming. It provides the opportunity for more members to get involved in a leadership capacity and to network with fellow evaluators on AEA initiatives they are passionate about.

To volunteer for a Working Group, please complete the contact form here and return it to AEA via fax or email. While you may submit an application for most Working Groups, those whose membership status indicates ‘Open’ are actively seeking members on a rolling basis and will receive top priority in placement.

AEA would not be able to provide outstanding membership programming without the help of our volunteers, who provide valuable member input on current programs and help staff brainstorm new ones. Volunteering for AEA takes a great deal of time, energy, dedication, and a strong commitment to the organization’s end goals. To our volunteers who have helped make this association truly great – thank you – we would not be where we are without your efforts!

Visit our website to volunteer today! For any questions, don’t hesitate to reach out to me, Zachary Grays, or Milos Popovic at info@eval.org.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello Evaluation Learners! I’m Sheila B. Robinson, aea365’s Lead Curator and sometimes Saturday contributor. Today, I’m writing about AEA’s Summer Evaluation Institute, a perennially fabulous learning opportunity. Anyone who knows me knows that I love learning and meeting with evaluation colleagues, and this is the perfect opportunity for both.

Registration is now open for the 2018 AEA Summer Evaluation Institute – June 17–20 in Atlanta, GA. Presenters will be ready with 34 high-quality courses on a wide variety of topics. Please visit the site for complete descriptions of each workshop.

Hot Tip: If you attend, plan on showing up for the plenary sessions. This year features what is sure to be an engaging and dynamic keynote by Kylie Hutchinson of Community Solutions Planning and Evaluation – Effective Evaluation Reporting: Making Your Key Messages Sticky. Kylie’s book, A Short Primer on Innovative Evaluation Reporting, is a wonderful compilation of creative ideas for non-traditional evaluation reporting.

Rad Resources: 

The 2018 Summer Evaluation Institute offers two full-day pre-institute workshops:

  1. An Interactive and Case-Centered Primer on Evaluation Approaches with Bianca Montrosse-Moorhead from the University of Connecticut:

All evaluation rests on an evaluation approach (i.e., theory or model). Sometimes this approach is explicit and sometimes it is implicit. Either way, evaluation approaches guide the reasoning for doing an evaluation, how it will be done, who will be involved and how, and what will be done with results and by whom. This interactive, case-centered workshop covers historical and contemporary evaluation approaches in diverse national and international contexts.

2. Introduction to Evaluation with Thomas Chapel from the Centers for Disease Control

This workshop will provide an overview of program evaluation for Institute participants with some, but not extensive, prior background in program evaluation. The foundations of this workshop will be organized around the Centers for Disease Control and Prevention’s (CDC) six-step Framework for Program Evaluation in Public Health as well as the four sets of evaluation standards from the Joint Committee on Standards for Educational Evaluation.

Hot Tip: Check out the other 32 half-day workshops. You can choose up to 5 courses during the institute!

Hot Tip: Can’t wait to get your evaluation learning on? Act fast to register for the 2018 AEA Summer Evaluation Institute and take advantage of early bird prices (until May 24)! Courses do fill up!



Hello Everyone! I’m Sheila B Robinson, AEA365’s Lead Curator and sometimes Saturday contributor with some pointers on adding photos, icons, graphics or other visuals to your blog article. It’s important to abide by the rules when we’re publishing original content here or on our own blogs and websites.

Want to submit a draft article to AEA365 with images attached?

Hot Tips: We can only use images under one of these conditions:

1.) They are original, created by the author of the post, and unpublished elsewhere.

2.) They are in the public domain. Images in the public domain have no copyright restrictions and are free to use.

3.) They are Creative Commons licensed. About a year ago, I published Mind Your Manners When it Comes to Visuals! by Sheila B Robinson, an article focused mostly on understanding Creative Commons. If you’re not yet familiar, I’d recommend reading it, or better yet, head right over to the Creative Commons site and learn from the source! Hotter Tip: You’ll need to know which CC license is associated with your image. Some require attribution, and some allow you to make changes to the image, while others do not.

4.) They are copyrighted, but you have express written permission from the author/owner of the image to use it with your blog article.

Lesson Learned: Royalty-free does not mean the same as free! Royalty-free means that once you pay for a license to use the image, you can then use it many times without paying additional fees. If you see a watermark on a photo (e.g. you can see the word “Shutterstock” or another company name behind the image), you don’t have a license to use it and it is restricted.

Cool Trick: We’ve featured a number of articles on using and creating images. Try these two:



Greetings! We are Larry Dershem (Senior Advisor, Research & Evaluation), Ashley Bishop (Monitoring, Evaluation and Learning Advisor), and Brad Kerner (Senior Director, Sponsorship Program) working for Save the Children (SC), which is an international charitable relief and development organization that seeks to ensure children survive, learn, and are protected internationally and in the US.

A retrospective impact evaluation (RIE) is an ex post evaluation of an evaluand to assess its value, worth, and merit, with a special focus on examining sustainability of intended results as well as unintended impacts. However, due to resource constraints, international development donors and organizations cannot afford to conduct many RIEs, limiting our ability to truly understand longer-term outcomes and impacts after the closure of a program. Save the Children’s sponsorship programs are currently investigating the feasibility of conducting RIEs in order to optimize learning from scarce resources.

With invaluable professional assistance from E. Jane Davidson (RealEvaluation) and Thomaz Chianca (independent consultant), we developed an RIE Evaluability Scoping Guide to assess the feasibility of conducting an RIE. The RIE Scoping Guide assesses 24 issues categorized into the following four dimensions: 1) Internal Stakeholder Support, 2) External Stakeholder Support, 3) Availability of Evidence & Documentation, and 4) Context. After program staff review, discuss, and score each of the 24 issues, a tally of the scores indicates which of the four feasibility categories best describes the feasibility of an RIE.

RIE Feasibility Categories diagram (click for larger image)

To date, five sponsorship programs, each implemented for 10 years and ended 5 to 9 years ago in Bangladesh, Nepal, and Ethiopia, have been assessed. All five programs are either “Adequately Evaluation Ready” or “Fully Evaluation Ready”; therefore, in the next year SC plans to commission at least one RIE.
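The tally step described above can be sketched in a few lines of code. This is a hypothetical illustration only: the 0–2 scoring scale, the two lower category names, and all cut-off thresholds are invented here, since the actual Scoping Guide defines its own issues, scales, and cut-offs.

```python
# Hypothetical sketch of the RIE Scoping Guide tally step.
# The scoring scale, the two lower category names, and the thresholds
# are illustrative only -- the Guide defines the real cut-offs.

def tally_feasibility(scores):
    """scores: dict mapping issue name -> score (0 = absent, 1 = partial, 2 = strong)."""
    total = sum(scores.values())
    maximum = 2 * len(scores)  # best possible tally across all issues
    ratio = total / maximum
    # Hypothetical cut-offs for the four feasibility categories.
    if ratio >= 0.85:
        return "Fully Evaluation Ready"
    elif ratio >= 0.65:
        return "Adequately Evaluation Ready"
    elif ratio >= 0.40:
        return "Partially Evaluation Ready"
    return "Not Evaluation Ready"

# Example: 24 issues, mostly scored 2 with a few scored 1.
example = {f"issue_{i}": 2 for i in range(1, 25)}
example.update({"issue_3": 1, "issue_11": 1, "issue_20": 1})
print(tally_feasibility(example))  # -> "Fully Evaluation Ready"
```

The point of expressing the tally this way is that the thresholds become explicit and discussable with program staff, rather than buried in a spreadsheet.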

Lessons Learned:

  • To avoid confusion, be clear with program staff that an evaluability assessment of a program is NOT an evaluation of a program.
  • For program staff to clearly assess the evaluability issues under the four dimensions, each issue should have a short description and a set of questions to be answered.

Hot Tips:

  • Having specific dimensions and issues that are critical for an RIE to be feasible after a program has ended allows staff to incorporate these issues into the program from the beginning.
  • SC has established an optimal window of 5-10 years after completion of a program to conduct an RIE, which allows longer-term impacts to emerge but is not so long that confounding factors proliferate.

Rad Resources:


I’m Isabelle Collins and I was the Principal Analyst, Monitoring and Evaluation at Superu (the Social Policy Evaluation and Research Unit) in New Zealand.


Rad Resource:

My colleagues at Superu, and its predecessor the Families Commission, have published the Bridging Cultural Perspectives approach. This approach acknowledges and respects the value of all knowledge streams, and provides spaces for dialogue between them. This is a new way of collaborating that requires researchers, policy makers, planners, and decision-makers to go beyond their previous conceptual boundaries. You can download the paper here.

Bridging Cultural Perspectives is made up of two models. One, He Awa Whiria – Braided Rivers, was developed by Angus Macfarlane as part of his work in the Advisory Group on Conduct Problems. The model is dynamic. It allows for different cultural knowledge systems to function separately or together, just as the streams of a braided river flow apart or together in their journey to the sea.


The other model, Negotiated Spaces, was developed by researchers in the Te Hau Mihi Ata project. It applies the traditional concept of wānanga to the modern context. The wānanga are designed to facilitate conversation between mātauranga Māori experts and Māori scientists.

The two models work together well – He Awa Whiria – Braided Rivers provides a conceptual model and Negotiated Spaces provides the dialogue space and the means of application.

Hot Tip: The Superu website has links to a range of work in this area, including Family Wellbeing and Whānau Rangatiratanga Frameworks. http://www.superu.govt.nz/current-projects/families-and-wh-nau-wellbeing-research-programme. This page includes latest contact details for key researchers on the programme, as Superu itself is in the process of being disestablished. Grab it while you can!

Hot Tip: If you haven’t been to New Zealand to see our amazing braided rivers, you really should.



Hello! I’m Rick Davies, Evaluation consultant, from Cambridge, UK.

Predictive analytics is the use of algorithms to find patterns in data (e.g. clusters and association rules) by inductive means, rather than by theory-led hypothesis testing. I can recommend three free programs: RapidMiner Studio, BigML, and EvalC3. My main use of these has been to develop prediction models, i.e. to find sets of attributes that are associated with an outcome of interest.

Here are some situations where I think prediction modelling can be useful, when looking at international development aid programs:

  1. During project selection:
    • To identify what attributes of project proposals are the best predictors of whether a project will be chosen for funding, or not
    • To identify how well a project proposal appraisal and screening process predicts the subsequent success of projects in achieving their objectives
  2. During project implementation:
    • Participants’ specific and overall experiences with workshops and training events
    • Donors’ and grantees’ specific and overall experiences of their working relationships with each other
  3. During a project evaluation:
    • “Causes of effects” analysis: To identify what combination(s) of project activities (and their contexts) were associated with a significant improvement in beneficiaries’ lives.
    • “Effects of causes” analysis: To identify what combinations of improvements in beneficiaries’ lives were associated with a specific project activity (or combination thereof)
    • To identify “positive deviants” – cases where success is being achieved when failure is the most common outcome.

BigML and RapidMiner have more capacities than I needed. So, I developed EvalC3, an Excel app available here, where a set of tools is organised into a workflow:

In the Input and Select stages, choices are made about what case attributes and outcomes are to be analysed. In the Design and Evaluate stages, users can manually test prediction models of their own design, or they can use four different algorithms to find the best performing models. Different measures are available to evaluate model performance. All models can be saved, and the case coverage of any two or more models can be compared. The case membership of any one model can also be examined in more detail. This last step is important because it enables the transition from cross-case analysis to within-case analysis. The latter is necessary to identify whether there is any causal mechanism underlying the association described by the prediction model.

The workflow design assumes that “Association is a necessary but insufficient basis for a causal claim,” which is more useful than simply saying “Correlation does not equal causation.”
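To make the "find the best performing model" step concrete, here is a minimal sketch in Python (not EvalC3 itself, which is an Excel app): enumerate subsets of case attributes, treat "all attributes in the subset present" as a prediction rule for the outcome, and score every rule against the observed cases. The data and attribute names below are invented for illustration.

```python
# Illustrative brute-force search for prediction models, in the spirit of
# an exhaustive search over attribute subsets. Data and names are invented.
from itertools import combinations

# Each case: binary attributes plus a binary outcome of interest.
cases = [
    {"training": 1, "follow_up": 1, "local_partner": 0, "outcome": 1},
    {"training": 1, "follow_up": 1, "local_partner": 1, "outcome": 1},
    {"training": 1, "follow_up": 0, "local_partner": 1, "outcome": 0},
    {"training": 0, "follow_up": 1, "local_partner": 0, "outcome": 0},
    {"training": 1, "follow_up": 1, "local_partner": 0, "outcome": 1},
    {"training": 0, "follow_up": 0, "local_partner": 1, "outcome": 0},
]
attributes = ["training", "follow_up", "local_partner"]

def accuracy(attrs):
    """Rule: predict outcome=1 when every attribute in `attrs` is present."""
    hits = sum(
        all(c[a] == 1 for a in attrs) == bool(c["outcome"]) for c in cases
    )
    return hits / len(cases)

# Evaluate every non-empty combination of attributes and keep the best.
models = {
    attrs: accuracy(attrs)
    for r in range(1, len(attributes) + 1)
    for attrs in combinations(attributes, r)
}
best = max(models, key=models.get)
print(best, models[best])  # ('training', 'follow_up') 1.0
```

Here accuracy stands in for the range of confusion-matrix measures mentioned above, and the winning rule ("training AND follow_up") is exactly the kind of cross-case association that then needs within-case checking for a causal mechanism.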

Lessons Learned:

Hot Tip:



I’m Bronwyn Mauldin, Director of Research and Evaluation at the Los Angeles County Arts Commission. I’m going to share the informal peer review process we use to improve the quality of our work.

Even if you’re not writing for an academic journal you want to make sure your methods are rigorous, your findings watertight, your final report lucid and clear. How can you get an objective assessment prior to print if your report doesn’t go through peer review? Ask an external colleague who works in the same field or uses similar methods to read it and give you feedback. In fact, ask two or three of them. Here at the LA County Arts Commission we’ve established a practice of doing this for every research or evaluation report we publish. It’s a simple idea we’ve found to be remarkably beneficial.

This practice is especially useful for those of us who work in that area some call “gray literature” published by nonprofits, foundations, government or other non-academic institutions. While we may have the advantage of working closely with practitioners and subject-matter experts, we have less access to the kind of meticulous critique available in the academy.

Rad Resource: Your colleagues. Identify three or four experts outside of your organization, then ask them to review your report and comment on it. Provide guiding questions so they’ll pay attention to your key issues, but be open to whatever else they find. Be sure to credit your reviewers in the final report.

Lesson Learned: People can be remarkably generous with their time and expertise. We’ve sent reports to reviewers that run to 70 pages or more, and others that were loaded with charts and graphs. Most people we’ve asked delivered thoughtful, thorough feedback.

Lesson Learned: Timing and communication are critical. Reach out to potential reviewers to get their commitment early in the writing phase. Send them the finished report when the text and charts are complete (but before the design phase). Give reviewers enough time for their review based on the length and complexity of the report, and a clear deadline. It might take a reminder or two, but most people eventually come through.

Cool Trick: Don’t limit yourself to colleagues you know. Contact the top experts in your field – both academics and others. This is also a great way to raise your profile with experts you’d like to get to know.

Independent evaluators who want to use informal peer review will probably need to let the institution they’re working for know what they’re planning in advance. Invite the institution to recommend experts to serve as reviewers.


Hi, I’m Sara Vaca, independent consultant (www.SaraVaca.com), AEA365 Outreach Coordinator and Creative Advisor, and contributor to Saturday’s posts. As a former program manager, evaluator, and meta-evaluator, I have come across maaaany reports, probably more than you have.

Lesson Learned: Reports (inception reports, evaluation reports, program reports, you-name-it reports) continue to be one of the most common ways, almost the unit in which, we package information and knowledge to be shared. And even though alternative formats exist, it will probably be years (decades?) before we move on from them.

Hot Tip: Since we continue producing reports for others to read, let’s strive to make them look more appealing than the average we are used to reading. Including data visualization elements to complement and illuminate the text, and/or using graphic design services to improve their looks, can help us spice them up. But there are other underexplored alternatives.

Cool Trick: Instead of opening a blank Word document to create your report, open a PowerPoint file to write it. What?! Yes, a PowerPoint document can equally be converted to a PDF, after which it becomes a report regardless of the program it was created in. A PowerPoint file, in either portrait or landscape orientation, can be a much better friend for making your product look more innovative.

Here are a couple of examples of reports I have done recently, exploring PowerPoint’s possibilities:

Lessons Learned: PowerPoint offers a different dynamic than Word. These are the advantages (and disadvantages) I have observed when comparing the two for creating a document:

  1. Advantages: PowerPoint allows me to think much more openly about the layout of the report. It is much easier to create layouts with columns of different widths and to alternate paragraphs and visuals in a dynamic way. It also allows you to customize each page in its own particular manner. I find it makes you craft your sections in a more concise style, as you feel somewhat limited in space (though in reality you can use as much space as in Word if you wish). Finally, using visuals, icons, and tables is far less painful than in Word.
  2. Disadvantages: But not all is good. Handling the text (for proofreading, translation, etc.) is more manual, and therefore slower. Each page also acts as a self-contained unit: changes on one page have to be carried over manually to subsequent pages (text does not flow automatically to the next page). Finally, on rare occasions this approach may be too innovative and may not always be a good idea (the rest of the time, it’s safe, encouraged, and even begged for!)

Finally, I also use PowerPoint for other purposes; for example, I am working on the latest version of my visual CV in it.



Hi, I am Laura DeMaria and I am the Senior Manager of AEA’s Operations department.

When I joined the AEA staff in September 2017, I had two main priorities: one was to learn as much about our organization and staff as I could, and the other was to learn as much about evaluation as possible. Within just a few weeks, I had a great opportunity to do both in real time at Evaluation 2017, AEA’s annual meeting, and our most attended meeting to date.

What impressed me most about my fellow staff was their absolute professionalism and dedication to customer service. Successfully running a 4,000+ person meeting in the labyrinthine Marriott Wardman Park hotel is an incredible feat! And to do so with such grace and positivity was a delight to see. From my station at the Information Desk, I was amazed by how much the staff accomplished, from the behind-the-scenes details to the very public movement of tables, auction items, livestreamed sessions, the lights and sound, and even, at one point, a drum circle. AEA members, you have a highly competent staff at your service.

This January a few of us joined the Board of Directors for our first meeting of the year in this year’s conference location, Cleveland, OH. First, let me say this about Cleveland: what a city! The feeling I got was one of coming home. Warm people (even in the cold weather), the beautiful line of the edge of the lake against the city, the urban architecture, and the passion of the residents for their home town made us all agree we are genuinely looking forward to returning this fall.

During the Board meeting, I was struck by how devoted to listening evaluators are. Discussion is always lively, and the Board, and the members I have met, truly are committed to their value of inclusivity. This is one of my favorite things about learning about evaluators and the profession of evaluation: no matter what your perspective is, there is room for you at the table.

I look forward to meeting more of you in the coming months, and especially at Evaluation 2018 in Cleveland!


I am Jenny McCullough Cosgrove @EvalNerd, Twitter convert, and member of the #EvalTwitter community. It is fascinating to watch our online community form, and we are excited to introduce our #EvalTwitter monthly chat topics. This will be an easy way to connect with #EvalTwitter and start conversations.

Hot Tip: Get connected around the #EvalTwitter hashtag.

Step 1. Log in to Twitter

Step 2. Type #EvalTwitter into the search bar to pull up all the tweets using that hashtag

Step 3. Connect and have fun!

Hot Tip: Join the #EvalTwitter community for #eval chats the fourth Thursday of each month at 8:30 PM Eastern (5:30 Pacific), starting NEXT Thursday, March 22! Each month we will connect around a specific hashtag, starting in March.

Tweet out with the #EvalTwitter hashtag each month to get involved!

  • March 2018: Follow Friday with a Twist (The Twist is that it’s Thursday)
    • Get to know the #EvalTwitter community and connect with new evaluators.
    • Share out your #eval loves and targets of your admiration to connect them with #EvalTwitter.
    • No idea what a Follow Friday is? Check out this longer (and funnier) explanation by The Oatmeal.
  • April 2018: Methods Madness
    • Share a method or approach that you are just crazy about!
    • What about this method or approach excites you?
  • May 2018: AEA LAC connections
    • Promote your local AEA area collaborative (LAC).
    • Connect with someone new in your area.
  • June 2018: Fail Forward
    • How have you failed forward in your #eval work?
    • What have you learned?
  • July 2018: Toot Your Evaluator Horn
    • What great #eval thing have you done so far in 2018?
    • Compliment a fellow  #EvalTwitter-er on their great work.
  • August 2018: Professional Development
    • What has been the best professional development you have attended this year?
    • What has been the best professional development you have ever attended? Why?
  • September 2018: #eval18 Connects
    • Connect with an #EvalTwitter fav and schedule a coffee break with them at #Eval18.
  • October 2018: #eval18 Excitement
    • What are you looking forward to?
    • Need recommendations for a certain topic?
    • Share out your presentation!
    • Live tweet your conference attendance.
  • November 2018: Stuck On #eval18: Reflections and Takeaways
    • What has stuck with you from the conference?
    • What do you find yourself continuing to consider?
  • December 2018: New Year, Renewed You
    • How are you going to renew your #eval practice for 2019?
    • Where do you hope to grow in your practice in the new year?

Get Involved: Digital Goes Analog

#EvalTwitter is planning a live-tweeted get-together at #eval18. Join us for an in-person get-together in Cleveland. Follow #EvalTwitter to stay in the loop.

Following a live Twitter chat can be difficult, so we recommend using Hootsuite or a similar program to better engage in the chat! Hootsuite and similar applications can be used to curate your own social media experience and keep up with a live Twitter chat. Check out this Hootsuite kickstart guide.

The American Evaluation Association is celebrating #EvalTwitter week. All posts this week are contributed by evaluators engaging, networking, and collaborating through Twitter.

