AEA365 | A Tip-a-Day by and for Evaluators

Hi, I’m Kamilah Henderson, Evaluation Fellow at Skillman Foundation in Detroit. I work with Foundation staff and partners to create learning opportunities that inform the work of improving conditions for Detroit kids.

Skillman provided a social innovation grant to the Detroit Bus Company to develop the Youth Transit Alliance (YTA), creating a long-term transportation solution for youth in Southwest Detroit. YTA’s work has required nimbleness and creative agility to respond to shifts in the volatile ecosystem in which the project is embedded. As an internal evaluator, I used rapid learning to complement the spirit and energy of the YTA’s work to 1) highlight and track tangible changes in program strategy, 2) develop a rigorous data collection system, and 3) surface solutions in a way that fosters continued mutual responsiveness and collaboration.

Lesson Learned:

Social innovators work fast to solve seemingly intractable problems. Rapid learning allows foundations to match the pace of social innovators, who need data to inform their swift responses to systems-level changes.

Hot Tip #1: Demonstrate Values of Collaboration through Action. Developing evaluation relationships early in project planning ensures that rapid learning addresses the concerns of the grantee and Foundation. Starting with this value has made for stronger learning questions. As implementers of the work, YTA learned from the rapid learning cycles about moving key levers in systems change for kids, and Skillman’s Social Innovation team learned about providing technical assistance resources for core grantees.

Hot Tip #2: Use Tried and True Tools. Beverly Parsons developed a framework to assess program development as it moves toward sustainability and scaling. The framework helped me identify strategy changes the YTA employed during their pilot year. Parsons’ tool was beneficial in the absence of a logic model, which is sometimes the case with social innovation projects versus traditional nonprofit programs.

Hot Tip #3: Faster is Better. Instead of year-end reports, YTA has appreciated getting the results of data analyses within months so that they could more quickly shift the direction of their work toward better outcomes for kids. Skillman has valued learning as the work progresses rather than after a grant cycle has ended. Melanie Hwalek’s memo format is a helpful tool for presenting critical analyses without the long wait.

Rad Resource: Evaluating Social Innovation, by Preskill and Beer.

Rad Resource: The Real-Time Evaluation Memo, by Melanie Hwalek.

Rad Resource: Developing a Framework for Systems-Oriented Evaluation, by Beverly Parsons.

Get Involved: I would love to hear from others who are doing similar work. I will be presenting with a panel of colleagues at the AEA Conference. Please join Marie Colombo, Sara Plachta Elliott, Nancy Latham and me at Learning about Rapid Learning: Identifying Approaches that Increase Evaluation Use in System-Building.

The American Evaluation Association is celebrating Nonprofits and Foundations Topical Interest Group (NPFTIG) Week. All contributions to aea365 this week come from our NPFTIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


I am Patti Patrizi, an evaluation consultant working primarily with foundations, helping them develop evaluation and learning systems. After working at The Pew Charitable Trusts, I founded The Evaluation Roundtable. My tip is an approach I used to help a large foundation develop a learning system that fosters internal learning about its strategies, as an antidote to years of producing reports about results and outcomes.

Hot Tips:

  • Assessing the current reporting system: We used a modified “post-action review” (http://www.fireleadership.gov/documents/Learning_AAR.pdf) with a 16-person representative staff group, asking them to describe their experience with the current system (this included asking about audience, process, questions, actual use and by whom, gaps, and positives) and to describe their hopes. The process took two meetings of 1.5 hours each.
  • Providing quick feedback: We quickly compiled their comments onto a single Excel sheet and sent it back to them for review.
  • Plotting out the new system: Using this information, we generated a rough outline of the major elements of a new reporting system, which they reviewed in one group meeting and then via email. We then selected four members of the larger group to help detail the mechanics, rules, and flows of the new system.
  • The core of the process: The system builds exchange between officers and their directors on each strategy. The exchange is teed up by responses to a set of questions developed to stimulate thinking and discussion on the issues. Each officer writes a note; their director reads it, convenes the group of officers working on the strategy, and then writes his/her own note. Each note represents that person’s own perspective; there are no “corrections” in the process. The group then meets with their program VP to discuss implications.
  • Developing good learning questions: The old system focused on listing accomplishments. The new system centers on questions that challenge officers to think critically about the strategy and about why something happened or did not. Using data of some kind (qualitative or quantitative) is a requirement. For example:

“Are you finding that you need to rethink the assumptions behind your theory of change, including:

  • Time needed to achieve outcomes envisioned
  • The extent of partnership and interest delivered by key stakeholders
  • Availability or nature of resources needed to make a difference
  • Levels of interest from external stakeholders—such as policy makers, NGOs etc.
  • Unanticipated changes in policy
  • The level of capacity that exists within the relevant field(s) to carry out the work, or as it relates to the key approaches
  • Other assumptions that have not materialized as you hoped?”

Last thought: This process will be only as good as the thinking it produces in the organization.



Hi, I’m Gretchen Shanks with the Bill and Melinda Gates Foundation’s Strategy, Measurement and Evaluation (SM&E) team. Our team works to ensure the foundation’s leadership, program teams and partners have the necessary capacity, tools and support to measure progress, to make decisions and to learn what works best to achieve our goals.

Before joining the Foundation, I supported teams at non-profits that were eager to evaluate their projects in the field; however, financial resources were inevitably scarce. Now that I work for a grant-maker that prioritizes the generation of evidence and of lessons about what works, what doesn’t and why, I think about issues of resourcing measurement and evaluation a bit differently.

In particular, I think less about whether we have enough financial resources in our budget for M&E and more about whether we have “enough” technical resources for measurement available (both to our internal teams and to our partners), or “enough” appropriately targeted and utilized evaluations. Some of the questions I ask about grantee evaluation include:

  • Are we investing sufficient resources, both time and technical support, in our work with partners to articulate the logical framework of measurable results for a project?
  • Have we adequately planned for measurement of those results and any possible evaluation(s)?
  • Do we know if we really need an evaluation, and if so, towards what end?
  • Does the design of the evaluation appropriately match the purpose and audience?
  • Do we know how (and by whom) the evaluation results will be used?

Planning results and measurement up front, supporting M&E implementation, and facilitating the use of data and lessons learned from evaluation all require resourcing: some financial, some technical, and (perhaps most importantly) temporal, since the time needed from the relevant stakeholders (internal and external) is critical. As you likely know well from your own work, there are no magic solutions to these challenges. Here at the foundation we’re working on getting smarter about how to use scarce resources to support actionable measurement and evaluation.

Hot Tips: Here are a few examples of ways we’re tackling these challenges:

  • Check out this blog post by SM&E’s Director, Jodi Nelson. She introduces the foundation’s evaluation policy, which aims to “help staff and partners align on expectations and focus scarce evaluation resources where they are most likely to produce actionable evidence.”
  • Read this PowerPoint deck, which describes the foundation’s approach to designing grants with a focus on measurable results.
  • Listen to an Inside the Gates podcast to hear from NPR’s Kinsey Wilson and Dan Green, BMGF’s deputy director of strategic partnerships, as they discuss measurement in the field of media communications and some of the related challenges. (The segment runs from 8:55 to 15:38.)



Hello All! Sheila B. Robinson, aea365’s Lead Curator and sometimes Saturday contributor, here with even more good news about audience engagement! Last Saturday, I wrote this post introducing the Audience Engagement Workbook, the new Potent Presentations (p2i) tool featuring the WHY, WHAT, and HOW of audience engagement, along with 20 specific strategies any presenter can use with limited investment of time or money. Look for the workbook to be posted on the p2i site any minute now!

In just a moment, I’ll share another strategy from the book, but in the meantime, I want to let you know about another opportunity to learn about audience engagement. Are you excited? Raise your hand if you want to learn more! (Are you feeling engaged now?)

Hot Tip: Join me for an AEA Coffee Break Webinar*, Audience Engagement Strategies for Potent Presentations, on Thursday, October 9 at 2:00 p.m. EST, where I’ll preview several key strategies appropriate for a variety of presentation types. Click here to register.

Cool Trick: Try a quote mingle. This requires some preparation: gather quotes about a topic and print them out on cards, enough for each participant to have one (either print a few quotes on cardstock or paper, cut them apart, and paste them to index cards). Use this activity as an icebreaker for participants to introduce themselves, or during or at the end of the session to have them make a connection to your content. Distribute cards randomly, and ask each participant to stand and pair up with a partner. Partners take turns reading their quotes, saying briefly what the quotes mean to them, and then introducing themselves, answering your question, or relating the quote to their situation. Once the exchange is over, call time and ask partners to swap quotes and find a different partner. Do as many exchanges as time permits.

Quick tip: You don’t need to gather as many quotes as participants. You can repeat quotes two or three times to produce larger sets of cards.

Caution: You will need a microphone or loud projecting voice to be able to call time to switch partners and to call an end to the activity. This activity will likely be very challenging with a group larger than 60-70 people.

Image credit: Sean MacEntee via Flickr

Rad Resource: The p2i family of tools and resources to polish your presentation to perfection!

Hot Tip: Type “p2i” in the search box (just look to your right…see it?) and read some great aea365 posts from people who have used p2i tools to spice up their presentations.

*Coffee Break Webinars are free for AEA members. Not a member? Why not join now? Click here for more information.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello, my name is Mariana Enríquez and I am a Program Evaluation Consultant based in Denver, Colorado. My work focuses on the evaluation of education and public health programs across Colorado.

Immigration has been in the news recently because of the large number of unaccompanied children arriving in the country from Central America. Colorado, and especially the Denver metropolitan area, is home to a large number of immigrants from all corners of the world; many are refugees from Africa, Asia, and the Middle East. In fact, close to 10% of Colorado residents were born in countries other than the USA (source).

Although more than one-third of immigrants in Colorado are naturalized U.S. citizens (source), many maintain their own language and culture. For example, almost 17% of the Colorado population speaks a language other than English at home, and Denver Public School students collectively speak more than 120 languages. This diversity makes evaluation work very challenging when crossing languages and cultures to reach these communities. As AEA’s Statement on Cultural Competence in Evaluation indicates, “The diversity of cultures within the United States guarantees that virtually all evaluators will work outside familiar cultural contexts at some time in their careers.” Additionally, “Cultural competence is fluid. An evaluator who is well prepared to work with a particular community is not necessarily competent in another.”

Hot Tips:

  • Learn as much as possible about participants’ cultural identity and background.
  • Use cultural brokers, cultural translators, bridge builders, and interpreters to access and get to know your participants.
  • Do not assume that a shared language means shared worldviews. Language can be a barrier, but it is not the only one.
  • Adapt to the participants’ needs; do not expect them to adapt to yours.
  • Ensure that participants’ intentions are understood and their voices are heard.
  • Use advisory committees and involve representation from all stakeholders in all phases of the evaluation.

Rad Resources: Things to Do in Denver during Evaluation 2014

  • Get out at sunset and don’t miss Chihuly Nights, illuminated glass sculptures by renowned glass artist Dale Chihuly on display at the Denver Botanic Gardens.
  • Getting around downtown Denver is easy and FREE. Quickly reach downtown restaurants, museums, and shops on the FREE 16th Street Mall Ride, and get around downtown during rush hour on the FREE Metro Ride on 18th Street.
  • Visit the newly renovated Union Station, where you can connect by bus to destinations across the metro area.
  • Shop or eat at historic Larimer Square, five blocks west of the Colorado Convention Center.
  • Head west to Boulder: rent a car, visit the Celestial Seasonings tea factory, and take a hike among the famous Flatirons in beautiful Chautauqua Park.
  • Check Westword magazine for other ideas and activities.

All this week, we’re looking forward to October and the Evaluation 2014 annual conference with our colleagues in the Local Arrangements Working Group (LAWG). Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.


Hello, I am Maggie Miller, the principal of Maggie Miller Consulting. I conduct program evaluation for nonprofits in the Denver/Boulder area. Welcome to Colorado! We Coloradoans tend to be very friendly; when you meet us at Evaluation 2014, we will be very happy to share any information about Colorado that we can.

Coloradoans also like to learn about evaluation. When I’m not consulting, I teach various evaluation classes and workshops in the greater Metro Denver area. There are many opportunities for program staff in nonprofits (and the private sector) to learn about evaluation. These are a few organizations that I’ve taught for: the Colorado Nonprofit Association, the Nonprofit Cultivation Center, Mountain States Employers Council, the Nonprofit Management program at Regis University, and the Denver Evaluation Network (DEN), which is for Denver-area museums and cultural institutions. The staff at the Denver Public Library system were very receptive to a series of evaluation planning classes I gave, and once I even presented a logic modeling workshop for the HR department at New Belgium Brewery. Hey, everyone can benefit from thinking about outcomes!

(P.S.: While I’ve never taught for them, I should mention that there are some large evaluation firms in town that offer excellent training to our evaluation-oriented Coloradoans.)

Lessons Learned: Anyone can learn about evaluation and improve their skills. It’s important to keep these teaching tips in mind.

  • Assess where your students “are at” in terms of their experience and existing skills (which may include evaluation-related things like teaching, research, project management, and facilitation).
  • For any given teaching opportunity, figure out what’s most important to teach. Keep your lesson focused on a few important ideas which they will remember and use, rather than giving them an overwhelming smorgasbord.
  • Facilitate hands-on interactive activities to help people engage deeply with new ideas.
  • Use examples that are relevant to your students, and encourage them to apply what they learn to their professional (and even personal) lives.
  • Whenever possible, get them to review what they learned. This is easier in multi-session workshops or classes, but you can still do it before and after breaks in a one-time workshop.

Hot Tip: Some of the places I’ve taught are great resources for you when you are in town! Check out Denver’s wonderful DEN-participating museums and our fabulous public library, and taste some great New Belgium beer at many restaurants and bars in the Denver area.


 

Greetings from Colorado, home of the Southern Rocky Mountains and the edge of the Great Plains. My name is Helen Holmquist-Johnson and I am the Assistant Director of the Social Work Research Center at Colorado State University in Fort Collins, Colorado. Colorado is not only geographically diverse, but incredibly diverse in terms of the demographic characteristics of the communities and the individuals who live here. Some of my recent work focuses on evaluating evidence-based programs that strengthen families and keep children safe and healthy.

In Colorado, the Division of Child Welfare is a state-supervised, county-administered system. In fact, Colorado is one of only nine states in the U.S. where the administrative structure of child welfare can be described this way. Each county, while held to the same state and federal requirements, can individually decide how to operate and deliver child welfare services to families. This arrangement gives counties considerable autonomy over everything from which programs they implement to which models of leadership and supervision they use. In a state with 64 counties (I know some of you are already ahead of me here), this structure introduces some unique evaluation challenges.

This is where process evaluation becomes useful and important. What might be missed or overlooked in an outcomes evaluation can be captured by asking process evaluation questions. In general, these questions focus on who the program reaches and whether the program was carried out as planned. Because we are evaluating evidence-based models, our interest shifts somewhat from asking “Does the program work?” to asking how, why, and in what context or conditions the program works. These are important questions to ask when working across 64 different counties throughout the state. Answers to them will assist county administrators and other stakeholders in making policy and practice decisions that consider the contextual factors unique to their community and families.

Rad Resource: For specifics about how to design and conduct a process evaluation, read Steckler, A., and Linnan, L. (Eds.), Process Evaluation in Public Health Interventions and Research.

Hot Tip for Denver: If you have children or family coming with you to Evaluation 2014 in Denver, you might want to check out The Children’s Museum of Denver and The Butterfly Pavilion.



Hi – I’m Erik Mason, the Curator of Research at the Longmont Museum and Cultural Center, located in Longmont, Colorado, about 35 miles northwest of Downtown Denver. I am not an evaluator – in fact, the word “evaluation” does not appear in my job description.  I have come to believe, however, that evaluation is critical to the success of my work as a museum curator.  Much of that realization is the result of my participation in the Denver Evaluation Network (DEN), a collection of 15 museums across the Denver metro area that have made a commitment to learn about, and do, evaluation on a regular basis.

Only two members of DEN have full-time evaluators on staff. The rest of us are a mix of educators, exhibit developers, administrators, and curators. Our daily work is filled with school tours, fundraising, label writing, and all the other stuff that goes into making museums fun and interesting places to visit. As a result, evaluation can get short shrift. We fall back on anecdote and what we think we know.

Over the last two years, the members of DEN have been presenting at museum conferences about the work we are doing to bring evaluation to a broader community. It has been fascinating to watch people who always thought evaluation was something scary and hard, requiring a large supply of clipboards, realize that it can be done in many ways.

Within my workplace, I have been pleasantly surprised as we have begun incorporating evaluation into more and more of what we do. Data gathered from iPad surveys provides a baseline understanding of our audience demographics and allows us to compare the changes in our audience as our special exhibits change. Evaluation is now a part of the development of all our exhibits. In the course of doing evaluation, I’ve seen attitudes change from “Why are we wasting our time doing this?” to “When are we doing another evaluation?”

Rad Resource: Check out this video of testimonials from members of DEN.

Hot Tip for Evaluation 2014 Attendees: Denver really is the “Mile High City” and you can take home proof of this fact with a short jaunt and a camera. A free shuttle and brief walk away from the Colorado Convention Center is the Colorado State Capitol, a Neoclassical building that sits at the eastern end of Denver’s Civic Center Park. The Capitol sits exactly one mile above sea level, and the official marker can be found on the 13th step. The building is emerging from a multi-year restoration effort with a shiny new coat of gold on its dome, in honor of Colorado’s mining heritage. Free tours of the Colorado Capitol are offered Monday through Friday.


My name is Valerie Williams and I am Senior Program Evaluator at the University Corporation for Atmospheric Research (UCAR) located in Boulder, CO. UCAR’s Center for Science Education is currently developing a new climate exhibit for the Mesa Laboratory Visitor Center and I am providing front-end evaluation support to help them test concepts and ideas about climate change with different audiences.

Teachers and their K-12 students are a primary audience for this exhibit, so much of my work involves conducting focus groups with teachers. It can be difficult to access teachers during the summer months when most schools are closed. Yet this is often an opportune time for scheduling focus groups without having to squeeze time from their busy school day.

Lessons Learned: Professional development and skill-building workshops are a great way to identify local teachers who may be willing to participate in a focus group during the summer months. Boulder and the surrounding Front Range community are home to many universities and science research centers that host teacher workshops on climate-related topics during the summer. Working with local workshop coordinators can be an effective way of connecting with teachers.

Tokens of appreciation can go a long way toward expressing gratitude. Despite working within limited budgets, I always try to provide teachers with something they can use in their classrooms, such as posters or hands-on manipulatives, to let them know I value their time.

Rad Resources for Front-End Evaluation: Most of my experience is in evaluating formal science education programs, so moving to informal science and museum evaluation has been a bit challenging. However, I’ve found many resources that have helped to smooth this transition.

Hot Tip: Advice for Evaluation 2014 in Denver. Not surprisingly, my hot tip is to take a short trip to Boulder to decompress from Evaluation 2014. Only 25 miles from Denver, Boulder offers an amazing array of activities for connecting with nature. From nature hikes that give you a breathtaking view of the Flatirons, to people watching and all-around entertainment, a visit to Boulder is a great way to end an exciting and intellectually stimulating conference!


 

We are Sheila A. Arens, Stephanie Fuentes, and Antonio Olmos, members of the Colorado Evaluator’s Network (COEN) and the Local Arrangements Work Group (LAWG). We are honored to be hosting the upcoming 2014 AEA Conference in Denver! The LAWG is composed of members from the COEN community who are working on a number of subcommittees to make your conference professionally and personally gratifying. When you are in the registration area, stop by the information desk and say hello. Volunteers will be available to provide information about the area and advice about nearby restaurants (for fast Mexican, try Colorado’s own Chipotle), coffee shops (Dazbog is a local favorite), and the like. They may also share some fascinating Denver facts (Denver brews over 200 different beers daily, more than any other city in the nation) or warn you about “zombies” roaming downtown on October 18th, when Denver hosts the world’s largest Zombie Crawl (we couldn’t make this up if we tried!).

Hot Tip: Downtown Denver is teeming with places to stay and things to do. Check out the searchable Colorado Convention Center site for housing options. With the support of COEN, the LAWG created guides about the Denver area that provide information about Denver restaurants, public transportation, and suggestions for activities. Electronic versions can be accessed via the AEA 2014 Conference website or directly via the following links: Guide to Denver, Dining in Denver, Events in Denver, and Visiting Boulder. If you have questions at the conference, look for COEN members (and our local friends), identifiable by their “Denver” badges.

First Time Attendee Tip: In the registration area pick up a First Time Attendee ribbon for your name tag! This is a great way to meet veteran and fellow first-time attendees. And be on the lookout for evaluators wearing a “Conference Ambassador” ribbon who have volunteered to answer your questions.

International Attendee Tip: The LAWG International subcommittee is working with the International and Cross Cultural Evaluation (ICCE) TIG to connect evaluators from other parts of the globe with U.S.-based evaluators (buddies). There will be an informal opportunity for buddies to share a meal immediately after the ICCE TIG business meeting on Thursday, October 16th.

Need More Information? The AEA Conference website has other useful information about the conference, including a conference schedule, searchable program, information on pre-sessions, accommodations, and even more conference activities.

