AEA365 | A Tip-a-Day by and for Evaluators

Hello, you may know me, Cheryl Oros, best from the Policy Watch columns in the AEA Newsletter, as I have been the consultant supporting the Evaluation Policy Task Force over the past six years. I have also directed federal evaluation offices, served at the executive level overseeing broad programmatic efforts, and taught many evaluation courses.

Hot Tip: 

Metrics for both evaluation studies and performance management can be developed from a conceptual (logic) model of a program. The important questions about a program (related to inputs, outputs, outcomes, and impact) are developed from the model, and the metrics are designed to answer these questions via appropriate analyses.
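To make this concrete, here is a minimal sketch of that chain, pairing each logic model component with the question it raises and the metrics designed to answer it. The program, questions, and metric names are all hypothetical:

```python
# A hypothetical logic model: each component raises an evaluation question,
# and each question gets one or more metrics designed to answer it.
logic_model = {
    "inputs":   {"question": "Were the planned resources delivered?",
                 "metrics": ["staff hours", "budget spent vs. allocated"]},
    "outputs":  {"question": "Did the program produce what it promised?",
                 "metrics": ["sessions held", "participants served"]},
    "outcomes": {"question": "Did participants change as intended?",
                 "metrics": ["pre/post skill scores", "retention rate"]},
    "impact":   {"question": "Did the program cause longer-term change?",
                 "metrics": ["employment rate vs. comparison group"]},
}

# Print the question-and-metric plan for each component of the model.
for component, spec in logic_model.items():
    print(f"{component}: {spec['question']}")
    for metric in spec["metrics"]:
        print(f"  - {metric}")
```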

Cool Trick: 

You can blend learning from evaluation studies with performance metrics for decision makers to assist them in policy making and program adjustments.  Evaluation can also inform whether the targets chosen for performance metrics are reasonable.

Rad Resources:

Lessons Learned:

  • Evaluation studies are needed to determine the impact of programs and to understand why results occur (or not). When these studies also explore program processes, they can shed light on the features of the program over which managers have control, allowing them to influence program success.
  • Performance metrics are usually process-oriented, addressing the inner workings of programs that can influence desired impact. Metrics addressing impact should be used for performance management only if evaluation has validated their link to the program.
  • Combining evaluation and performance monitoring enables managers to make policy decisions based on an in-depth understanding of the program as well as the ability to monitor and analyze program functioning via performance metrics, possibly in real time.

The American Evaluation Association is celebrating Washington Evaluators (WE) Affiliate Week. The contributions all this week to aea365 come from WE Affiliate members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello, we are Shannon Williams, Research Manager at Rebuilding Together, Allison Schachter, Monitoring and Evaluation Advisor at Pathfinder International, and Tyler Spencer, Founder and President of The Grassroot Project. Together, we are the three inaugural recipients of the Washington Evaluators New Professional Scholarship program established in 2017.

The goal of this scholarship program is to strengthen the sustainability of the local evaluation community by supporting new professionals in integrating evaluation practices and approaches within their respective organizations. Through this scholarship opportunity, Washington Evaluators provides funding to cover the cost of annual memberships with the Washington Evaluators and the American Evaluation Association, as well as the registration fee for the annual American Evaluation Association conference (held in Washington, DC this past November).

Lessons Learned: Why the Washington Evaluators Professional Scholarship Works

  • Access to networking opportunities: For Tyler, the “stamp of approval” that came with the scholarship helped him get a foot in the door in terms of networking with other evaluators and advancing the monitoring and evaluation goals of the organization he runs. The day the scholarship was announced, he received several congratulatory emails from members of Washington Evaluators who were interested in talking about collaboration.
  • Introduction to new resources: Prior to attending the conference, Allison often found herself searching online for guidance and best practices in developing evaluation strategies. The conference provided new frameworks that she could apply immediately to her work at Pathfinder International, including a step-by-step guide to building a strategy.
  • Enhanced skillsets: For Shannon, being able to attend many of the sessions focused on data visualization helped her strengthen her reporting skills and ensure that her research findings reach their target audiences. The advice she received during a session on crafting effective one-pagers has proven particularly fruitful in increasing her colleagues’ awareness and use of her reports.

Rad Resources:


My name is Stephanie Cabell, and in my role as an evaluation advisor at the State Department, I have the pleasure of engaging with and learning from many smart people from a number of disciplines, including the behavioral and social sciences.

Behavioral science is a relatively young field, and governments have only recently begun using its insights to inform public policy. More than a dozen countries, including the U.S., have teams of behavioral scientists working with policy makers and government agencies to improve efficiency for their citizens. The goal: to improve access to information and programs so that citizens can make more informed decisions for their well-being, and to deliver better results at a lower cost for the American people. Federal agencies are in the nascent stages of developing strategies to apply social and behavioral science insights to programs and, where possible, to rigorously test and evaluate the impact of these insights.

Hot Tip: Behavioral science insights can be an effective design tool, both as a component of program logic models and in establishing theories of change. Whether you are designing a program that requires individuals to work through an online application process, or one where beneficiaries might have to travel far to obtain services, behavioral science insights can help discern how to optimize outcomes for individuals, information that is then factored into a program’s goals and objectives.

Cool Trick: You can blend behavioral science and evidence-based decision making to maximize the range of feedback or data collected and analyzed from programs.

Rad Resources:

  • Visit the National Science and Technology Council’s Social and Behavioral Sciences Team’s website for a primer on behavioral science insights and their application in the work of government agencies.
  • A counterpart to the United States’ Social and Behavioral Sciences Team is the United Kingdom’s Behavioural Insights Team. The U.K. team has had success using behavioral science insights to design and build scalable products and services that have social impact.
  • Numerous institutions of higher education throughout the United States offer graduate-level courses and programs in the social and behavioral sciences. A good place to research schools is the College Board’s website.

Lessons Learned:

  • Government agencies can use social and behavioral science insights to simplify the presentation of complex information in programs and, thus, bring more consistency to how individuals choose or make decisions.
  • A central question in social and behavioral science is how people respond to monetary and non-monetary incentives as a means of getting individuals to take specific actions; there is not yet consensus. Research to date suggests that people are more likely to take advantage of an incentive if they can benefit from it immediately rather than at a later date, as is the case with a tax credit. This is an area still ripe for research.
  • Federal, state, and local government agencies can incorporate social and behavioral science insights into broader evidence-based initiatives and embed them into the fabric of program and project design for better outcomes for people.


Hello! I’m Sheila B Robinson, aea365’s Lead Curator and sometimes Saturday contributor, and I have some questions for you today! Back in 2014, I published a similar post about AEA’s Conference History page, and decided it’s time again to ask a few more questions.

Hot Tip: Poke around AEA’s website (or go directly to this link under the “Events” menu) and you’ll find some fascinating trivia about our annual conference. Perhaps some of these burning questions have kept YOU up at night. For instance…

1.) Who was president of AEA in 1988?

2.) In what year was the AEA Annual Conference first held in Washington, DC?

3.) How many different US states and Canadian provinces have hosted the AEA conference?

4.) When was the conference theme: Evaluation and Social Justice?

5.) How many times has the AEA president’s first name been the same as that of a previous AEA president?

6.) Where will Evaluation 2020 be held?

The answers are all there!*

Cool Trick: Want to know about the sessions your favorite evaluator presented in any given year? Curious to see what the hot topics were when the conference theme was Evaluation Quality? Perhaps you have an idea about a new topic and wonder if anyone has presented on it before. Or, you’re just learning something new about evaluation and want to see who the thought leaders on that topic appear to have been over the past few years, so you can follow up on their work or even network with them. For all of this information, you can access conference programs for the last 15 years from the Conference History page. Most are even searchable online!

Cooler trick: Want to know how the conference was evaluated and how it performed in any given year? How many evaluators attended in 2003? What do we know about them? How many were students, researchers, professors, or consultants? How many conferences had they attended before? Did they consider themselves novice or expert evaluators? What were their reactions to the conference in that year? Evaluation data and reports are available for several conference years.

Rad Resource: AEA’s Conference History page: Everything you wanted to know about Evaluation 1986 – Evaluation 2018 (33 years!) but were afraid to ask. (Well, perhaps not afraid…)

Bonus Rad Resource: The page also includes links to the websites of the last 6 Summer Evaluation Institutes.

AEA Conferences 1986-2018 by Location


*Except for this one: #3. I counted 17 different states plus Washington, DC, and two Canadian provinces.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello! ¡Saludos! I’m Lisa Aponte-Soto, an AEA GEDI alumna, Latinx Responsive Evaluation Discourse (La RED) TIG chair, and CEA local affiliate member. I currently serve as Associate Director of Community Engaged Research at the University of Illinois at Chicago. La RED and CEA foster communities of practice to exchange ideas and enhance collective learning around evaluation practices. Specifically, La RED engages evaluators working to support multiple Latinx-specific contexts through culturally responsive evaluation practices, evaluation capacity building, and evaluation professional development. Similarly, CEA provides evaluators and students with career development activities.

Both La RED and CEA value partnerships that will increase membership networking and skill-building opportunities. Namely, members of La RED and CEA in the Chicagoland area have collaborated with the Latina Researchers Network (LRN) to maximize and share professional resources for Latinx researchers and evaluators in the Midwest and nationally. Founded by Dr. Silvia Mazzula at John Jay College of Criminal Justice, the LRN offers ongoing mentorship and career development resources to meet the diverse needs of Latinx professionals. The conceptual framing of the LRN can be found here. Since its inception in 2012, the LRN has provided leadership, research, and evaluation training to over 3000 scholars, researchers, evaluators, academic leaders, and junior investigators.

Most recently, Chicagoland members embarked on a yearlong LRN initiative along with members in Texas to regionalize the network. La RED and CEA members leading the planning steering committee include Leah C. Neubauer, Grisel Robles-Schrader, Diana Lemos, and myself. Opportunities will now be available to establish local LRN chapters and affiliates. Efforts also culminated in an LRN Chicago Chapter kick-off event in May 2018, Latina Women in Academia: Challenges and Strategies for Success, featuring keynote speaker Dr. Aida Giachello, a social hour, and World Café discussions to gauge membership interests and needs.

Lessons Learned:

From the World Café discourse, we found that Latinx professionals are seeking more opportunities to:

  1. Network internationally
  2. Engage in writing workshops
  3. Receive methodological skill-based training
  4. Participate in work-life balance activities
  5. Collaborate with interdisciplinary professional organizations and associations

Hot Tip: The LRN will be hosting its fourth biennial Latina Researchers Conference August 23-25 at John Jay College in New York, New York. Register here.

Rad Resources:

The American Evaluation Association is celebrating Chicagoland Evaluation Association (CEA) Affiliate Week. The contributions all this week to aea365 come from CEA members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


We are Grisel M. Robles-Schrader and Keith A. Herzog of the Northwestern University Clinical and Translational Sciences (NUCATS) Institute. The Center for Community Health (CCH), one of 10 centers and programs within NUCATS, is specifically charged with offering support and resources to catalyze meaningful community and academic engagement across the research spectrum, with the aim of improving health and health equity. Community engagement centers across the nationwide Clinical and Translational Science Awards (CTSA) consortium offer a similar range of programs and services.

We facilitated efforts within our institution to develop an evaluation infrastructure to better understand, improve, promote, and evaluate the community engagement support and services that CCH offers to investigators. By engaging key stakeholders within CCH and NUCATS more broadly, we concentrated our efforts on metrics and data collection tools relevant to our team’s work.

As part of our comprehensive evaluation plan, the CCH developed community engagement metrics covering six domains aimed at measuring engagement support and outcomes beyond publications and funding:

  • consultation services,
  • capacity building & education,
  • fiscal support,
  • partnership development,
  • institutional-level changes, and
  • community-level changes

Rad Resources:

  • REDCap (Research Electronic Data Capture) is a secure web application for building and managing online surveys and databases.

Phases of Development & Feedback Loops

Utilizing the REDCap data collection tool enabled us to refine and adapt our “dream list” of metrics, based on our comprehensive logic model. In close collaboration with key internal stakeholders, we implemented the CCH Engagement and Tracking project focused on consultation services. At the end of the 12-month pilot period, we connected again with these stakeholders to assess what was working well, what was not, and what needed to be revised. For example, we focused our review on categories that consistently had high rates of missing data and discussed whether those were still relevant questions to track. Moreover, we used the 12-month review as an opportunity to assess revisions to our logic model, based on insights informed by the consultation tracking form.
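As an illustration of that review step, here is a minimal sketch, assuming the tracking form is exported from REDCap to a CSV file; the file name and the 25% flagging threshold are hypothetical, not part of the CCH project:

```python
import pandas as pd

# Load a (hypothetical) CSV export of the consultation tracking form.
records = pd.read_csv("cch_consultation_tracking_export.csv")

# Compute the share of missing responses for each field, highest first.
missing_rates = records.isna().mean().sort_values(ascending=False)

# Flag fields missing more than 25% of the time (an illustrative cutoff)
# as candidates to revisit with stakeholders at the periodic review.
flagged = missing_rates[missing_rates > 0.25]
print("Fields to revisit with stakeholders:")
print(flagged.round(2))
```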

Hot Tips:

  • Engage key stakeholders throughout the evaluation development and implementation processes. This ensures you collect relevant data, utilizing strategies that are meaningful for your team.
  • Utilize the 80/20 rule to avoid data collection creep (i.e., trying to collect everything, all the time). Ask yourselves: “What do we consistently encounter, do, collect, and share 80% of the time?”
  • Pilot data collection tools using real-world data. Refine the tool. Revise and repeat (as necessary).
  • Establish strong project management skills to keep the group on task and to secure buy-in from key stakeholders.
  • Support standardization by developing manuals with succinct definitions and concrete examples. Include instructions and contextual links within REDCap so this guidance is available when your team enters data.

 


Hello, I am Matt Feldmann, the principal researcher and owner of Goshen Education Consulting, Inc. (http://www.gosheneducationconsulting.com). I am a member of the Chicagoland Evaluation Association and the President of the Evaluation Association of St. Louis (EASL).

I began my local affiliate experience as a member of the Chicagoland Evaluation Association because I had contracts in Northern Illinois and Chicago and because there wasn’t a local affiliate in St. Louis. I like to joke that I was the furthest suburb south of Chicago. (The really funny part is that there are so many suburbs of Chicago that most of the CEA members never got the joke and really thought that Edwardsville was a legitimate suburb.) Check out the following map.

Map showing Chicago and Edwardsville, IL

Hot Tip: Actively Engage Other Affiliates

EASL officially becomes a non-profit corporation in summer 2018. In many ways EASL has developed through my active participation with CEA and through assistance from CEA leaders. CEA actively encouraged me by featuring me as a speaker at its annual Jazzin’ at the Shedd event, which gave me exposure to many of the members. CEA further provided by-laws and support as we went through the development process.

Lesson Learned: Think Creatively about Programming

A great experience has been developing a shared regional speaker series among EASL, CEA, and the Indiana Evaluation Association. I have presented a short version of a pre-conference workshop on introductory independent evaluation skills in both Chicago and Indianapolis; in exchange, EASL has received a workshop from Asma Ali (President of CEA) on mentoring new and emerging evaluators. Leah Neubauer (former President of CEA) will present a workshop in September on culturally responsive evaluation, and we will host Mindy Hightower King (Indiana Evaluation Association) in late fall.

Rad Resource: Local Affiliate Collaborative (LAC)

Get to know the LAC by visiting this AEA LINK. The group holds a well-organized monthly conference call that promotes group development and has resulted in a pre-conference workshop for local affiliate leaders.


Hello, we are Amanda Lambie, Cara Karter, and Michelle Lopez, three members of the Research & Evaluation team at After School Matters. As internal evaluators for an out-of-school time provider, we highly value youth voice as an essential ingredient in positive youth development. So we love that this year’s AEA theme is Speak Truth to Power.

Why promote youth voices in your work? Because when young people are placed in a position to be heard, that opportunity…

  • Builds their self-determination and motivation, which can affect academic, social, and technical skill development [1].
  • Improves their sense of ownership and belonging, which can increase program engagement and participation [1].
  • Positions them in a place of influence, challenging traditional societal power dynamics and encouraging civic engagement [2].
  • Affirms an ethical ideal that participants deserve the opportunity to speak and be heard [2].

Hot Tips:

  1. Include an open-response field in your end-of-session participant surveys. Approximately one-third of the teens in our programs provide comments about their experience. Our team reads every single one, categorizes them into themes (see the sketch after this list), and shares them with our staff. Teen responses inform decisions we make across the organization – about recruitment, instructor professional development, programming, and more.
  2. Plan to regularly conduct focus groups or interviews with participants and compensate them for their time. We engaged 180 teens last year through focus groups and interviews to collect teen feedback about new initiatives or policies. We hold focus groups and interviews immediately before or after a program in the same location and we provide gift cards to participants. It can be expensive, but minimizes burden and reinforces the value of participants’ time and effort.
  3. Test your surveys using cognitive interviews [3]. Ever wonder what participants think of your survey? Stop wondering and do some cognitive interviewing to find out! Not only is this a great way to incorporate participant voices in your work, but it also results in a more valid and reliable instrument. Win-win!
  4. Employ a participant as an intern. Participants can help you develop more relevant indicators, identify themes in data you may be missing, and collect data that you may have been unable to collect.
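For Hot Tip #1, here is a minimal sketch of keyword-based theme tagging for open-response comments. The themes, keywords, file, and column names are hypothetical; a real codebook would be developed and validated by the team:

```python
import pandas as pd

# Hypothetical codebook mapping themes to keywords found in comments.
themes = {
    "instructors": ["instructor", "teacher", "mentor"],
    "recruitment": ["apply", "application", "sign up"],
    "programming": ["activity", "project", "schedule"],
}

# Load open-response comments from a (hypothetical) survey export.
comments = pd.read_csv("end_of_session_survey.csv")["open_response"].dropna()

def tag_themes(text):
    """Return every theme whose keywords appear in the comment."""
    text = text.lower()
    return [t for t, words in themes.items() if any(w in text for w in words)]

# Tag each comment, then count how often each theme appears overall.
theme_counts = comments.apply(tag_themes).explode().value_counts()
print(theme_counts)
```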

Rad Resources:

  1. “Keeping it real”: An evaluation audit of five years of youth-led program evaluation
  2. Sound, presence, and power: “Student voice” in educational research and reform.
  3. Cognitive interviewing: A “how to” guide


Hi everyone, Dylan Felt and Peter Lindeman here with the EDIT Program at Northwestern University’s Institute for Sexual and Gender Minority Health and Wellbeing. Our team focuses on fostering a learning community to improve the health and wellbeing of sexual and gender minority (SGM) populations in Chicago and beyond.

Our work has taught us that evaluators have a responsibility to consider sexual orientation and gender identity in our work. We want to talk a bit about why, and to provide evaluators with the tools to do so effectively and conscientiously.

The LGBT+ community faces a number of critical health disparities. While some are well known (e.g., HIV, mental health, and substance use), there’s still more we don’t know. Just this summer, our team published one of the first papers to highlight the link between sexual minority identity and Type 2 diabetes risk factors.

Many of the disparities that affect the SGM community are driven by complex structural factors and exacerbated by minority stress and stigma, which can impact the effectiveness of an intervention. If you aren’t specifically breaking down your results to consider SGM individuals, much as you would with sex and race, you are doing a disservice not only to the LGBT+ folks in your program but also to the accuracy and quality of your evaluation. If a program is showing results for some participants but SGM folks are faring worse – that’s something stakeholders need to know.

Hot Tips:

Check out our recommendations for how best to ask these questions below:

  1. What was your sex assigned at birth?
    1. Male
    2. Female
    3. Prefer not to respond
  2. What is your current gender identity?
    1. Male
    2. Female
    3. Agender
    4. Non-Binary
    5. Not Listed: ________
    6. Prefer not to respond
    7. Unsure
  3. Do you identify as transgender?
    1. Yes
    2. No
    3. Prefer not to respond
    4. Unsure
  4. What is your sexual orientation?
    1. Straight
    2. Gay/Lesbian
    3. Bisexual/Pansexual
    4. Not Listed: _______
    5. Prefer not to respond
    6. Unsure

Depending on the setting of the evaluation, you may want to be more detailed. While the suggestions above are a good starting point, you might want to add more options for sexual orientation (e.g., asexual, queer) and gender identity (e.g., gender fluid, gender queer) if you are evaluating a program at an LGBT health center. Remember – don’t be afraid to ask your program stakeholders for advice! They know their community best, and can be a great resource.
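Once items like these are in your dataset, disaggregating results is straightforward. Here is a minimal sketch; the file, column names, outcome scores, and the simplified SGM flag are all illustrative assumptions:

```python
import pandas as pd

# Load (hypothetical) program data that includes the survey items above.
df = pd.read_csv("program_outcomes.csv")

# Simplified SGM flag for illustration only: a real analysis should handle
# "Prefer not to respond" and "Unsure" separately rather than lumping them in.
df["sgm"] = (
    (df["sexual_orientation"] != "Straight")
    | (df["gender_identity"] != df["sex_assigned_at_birth"])
)

# Compare pre/post outcomes for SGM and non-SGM participants,
# just as you would disaggregate by sex or race.
print(df.groupby("sgm")[["pre_score", "post_score"]].mean())
```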

Rad Resources:

Speaking of resources: Sheila B. Robinson and Kimberly Firth Leonard’s book, Designing Quality Survey Questions, includes information about how to ask these and other demographic questions.

The Williams Institute has two great guides on this topic – the SMART Report and the GenIUSS Report. They are some of the most extensive resources available that provide best practices for asking sexual orientation and gender identity questions while conducting research and evaluation.

Want to know more?

Reach out! We’re happy to talk. You can reach Peter, Dylan, and the rest of the EDIT team at EDIT@northwestern.edu.


Welcome to the Chicagoland Evaluation Association’s (CEA) week of aea365. We are Asma Ali, President, and Leah C. Neubauer, Past President, of CEA. CEA began many years ago as a forum for evaluation professionals and students in the Chicagoland area to network, exchange ideas and knowledge, and participate in professional development activities that promote excellence in evaluation.

CEA as an AEA Local Affiliate

As the lead post for this week, we offer insights and resources about AEA Local Affiliates. Local Affiliates are our linkage to a network of other local evaluators and evaluations. CEA has benefitted tremendously in the last year from its participation in Local Affiliate activities. Our interactions with the Local Affiliate Groups have supported our learning about successful affiliate strategies, facilitated new programming and professional connections, and introduced us to inspiring evaluators throughout AEA. As a result, CEA has expanded programming, updated its communications strategy, and revitalized its member rosters.

Affiliates as Evaluator Learning Spaces

As evaluators, how do we create opportunities to learn, grow and enrich our practice?  What happens in our post-formal education and training lives to facilitate new learning and growth?   Through the lens of adult and continuing education, affiliates function as local communities of practice (CoP) and homegrown entities that promote various types of learning and growth within and among evaluators.

CEA member blogs this week

One benefit of our participation has been expanded national and local evaluation communities of practice (CoP), which are featured throughout this week’s CEA posts. CEA has organized a stimulating week of posts that address the evaluation-related work of our members, including: local affiliate experiences, after-school programming and youth voices in evaluation, a new network for Latina researchers, evaluating community engagement, and sexual and gender minority (SGM) communities. This is our third week of featured posts. Check out our previous work here.

Rad Resource #1: CEA Affiliate Website. Are you in Chicagoland or the surrounding area, looking to collaborate with someone based in Chicago, or interested in our affiliate work and professional development opportunities? Check out the CEA website or email Asma (CEA President) at asma.ali1@gmail.com for more information.

Rad Resource #2:  AEA Local Affiliates.   Are you involved in your local AEA affiliate?  A list of affiliates, contact information, websites, conference archives, helpful links and a variety of evaluation websites can be found here: https://www.eval.org/affiliate.

Rad Resource #3: Local Affiliate Collaborative.

As CEA Leadership, we join folks on the Local Affiliate Collaborative (LAC) Steering Committee monthly calls to share support, resources and expertise.  The members and their respective AEA affiliates represent decades of evaluation and AEA leadership experience.  Check out the website and resources. If you’d like to join the LAC, email Leah (LAC co-chair) at leah.neubauer@northwestern.edu.
