AEA365 | A Tip-a-Day by and for Evaluators

Greetings, colleagues! My name is Steven Bingham. As a professor of education at High Point University in High Point, North Carolina, and as a former researcher in a federally funded regional research and development laboratory, I approach the dissertation process with a bias for learning “what works.” Of the 30 dissertations on which I have served as chair or committee member in the last six years, the majority have used program evaluation, in part or in whole, as a methodological approach.

Hot Tips:

Beyond ensuring a successful dissertation defense, I have become increasingly interested in the impacts of what I call “program-evaluation competency.” My hope is that graduated candidates, and the districts that employ them, value and apply what I have taught. Notably, our course of study requires that all candidates complete at least one program evaluation prior to conducting their dissertation research.

I did, then, what any curious professor would do: I surveyed a sample of matriculated practitioners. Here’s what I learned: On the upside, 100 percent of responding practitioners reported systematically using the tools of program evaluation in assessing program merits. Eighty percent reported appropriate application of program evaluation, resulting in program scale-up or modification.

On the downside, one respondent stated that a multimillion-dollar reading program was abandoned without being evaluated at all. A mixed blessing came from the respondent who reported that an evaluation of a one-to-one instructional technology program had failed to consider human variables. The result was erosion of teacher confidence and reduced program effectiveness. Despite the evidence, district leaders deemed the finding unacceptable.

Not surprisingly, my findings suggest that program evaluation may be a credible approach to school and district improvement. Readers whose work involves determining the merits of educational programs may also recognize a familiar fly in the ointment: for program evaluation to be of value, invested leaders must be willing to embrace the results even when politically inexpedient.

Lessons Learned:

As a professor teaching program evaluation to school and district practitioners, my greatest and most affirming lesson learned is that, if taught as part of a doctoral program, particularly in dissertation research, program evaluation seems to have a better-than-even chance of being used and useful for public school districts and their students. In the pursuit of education as evidence-based practice, that is good news.  

Rad Resources: Here is a source that describes the benefits of a dissertation with a program evaluation focus.

 

The American Evaluation Association is celebrating Consortium for Research on Educational Assessment and Teaching (CREATE) week. The contributions all this week to aea365 come from members of CREATE. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

Hi! My name is Daina Lieberman, English teacher and International Baccalaureate (IB) Middle Years Programme (MYP) Coordinator at South Lakes High School in Fairfax County Public Schools, Virginia. I am also a recent graduate of the Ed.D. in Educational Administration and Policy Studies program at The George Washington University. Today I’d like to provide some tips on Project-Based Learning.

Hot Tips:

As an IB MYP Coordinator, I work with teachers in my building to create, implement, and assess performance-based assessments in all subject areas, including PBLs. Project-Based Learning, or PBL, has become an important method of teaching and assessment in schools. Instead of being taught a unit and then creating a project, students are asked an open-ended, driving question that requires them to research and learn information to solve a problem. Their final work may vary in form and content, but students need to collaborate, think critically and creatively, conduct research, and demonstrate their understanding.

PBL sets up situations that allow students to solve real-world problems and create authentic solutions.  As adults, we solve our problems in the same way—if we want to buy our first house, we conduct research, ask professionals for help, take action, reflect, make adjustments, and hopefully purchase a home successfully.  Teachers need to guide students throughout their inquiry phase to ensure they are learning appropriate and factual content relevant to solving the problem and answering the driving question.

PBL is a great way to enable English language learners, special ed students, advanced students, and all other students to demonstrate their learning in ways teachers can assess and students can enjoy.  This type of assessment can be used with students at any level, including undergraduate and graduate.

Be sure when assessing PBL work that your rubric is assessing student learning, not behavior or completion.  Check in with other teachers who have conducted PBL units and look at various rubrics before creating one; ask a colleague to look it over to ensure you are assessing what you want to assess.  You can also work with your students and have them help you create a rubric to assess their work.

Have fun!

Rad Resources:

For a great definition of performance-based assessments, check out Patricia Hilliard’s article on Edutopia called Performance-Based Assessment: Reviewing the Basics or this booklet from Stanford School Redesign Network called What is Performance-Based Assessment?, which includes research and examples of PBAs.

Check out this page on Edutopia for articles and videos on Project-Based Learning and this Research Spotlight on Project-Based Learning by the NEA. Resources and Tools for PBL Start to Finish on Edutopia is another great page with even more resources and links to help you get started.

For more information on developing performance-based assessments and rubrics, read Doug Wren’s AEA blog post on the topic and have a look at Ross Cooper’s blog post on Project-Based Learning Professional Development (part 2): Student Created Rubrics on ASCD Edge.


 

This is John Fischetti, Dean of Education/Head of School, at the University of Newcastle in Australia. We are one of Australia’s largest providers of new teachers and postgraduate degrees for current educators. We are committed to equity and social justice as pillars of practice, particularly in evaluation and assessment.

Lessons Learned:

It is with that equity lens that I want to share an Australian story.

In early May 2018, the Australian government launched a new report on the failure of Australian schools. It challenges the current schooling system by calling out the vestiges of the assembly-line, industrial-age model of education and the current lack of investment in “individualized” learning and future-focused skills. It calls for new types of online formative assessment and new learning-progression schemes to build literacy and numeracy skills early and to reinvent years 11 and 12 of high school to be more creative and innovation based.

The premise of this new scheme is in line with the best thinkers in the world (from Guskey to Zhao) and the most progressive nations in the world (yes, sorry folks: Finland, Switzerland, Belgium and the Netherlands). However, the assessment recommendations are a reboot of more of the same. Assembly-line assessments in the early years are perhaps the opposite of how to boost literacy and numeracy early on. The report asks for massive changes to an assembly-line reality by advocating for more assessment assembly lines. And some of the recommendations in the report are already failing elsewhere, such as in New Zealand’s system, where young people can face a test a day.

Hot Tips:

I recommend that all of us who work in schools and with student performance data spend time this year advocating for reinventing these systems. Our charge is to prepare children to be successful in their futures. To do that, they need the knowledge, skills and dispositions to be passionate, vibrant, dynamic, curious, open-minded, engaged (and literate and numerate) participants in their own journeys. We can’t assembly-line assess that.

One urban legend definition of insanity is “doing the same things over and over again and expecting better results.” When assembly line schooling is transformed to individualized learning, but the assessment scheme is from the same original mindset, we have the cart in front of the horse. And that is insane. “Stop, drop and test” assessment schemes are obsolete. It is time we in the field called this out and moved forward to build learning centers instead of testing centers. 

Rad Resources:

Gonski Review Attacks Australian Schooling Quality and Urges Individualized Teaching Approach

Thomas Guskey: What We Know About Pre-Assessments

Yong Zhao: What Works Can Hurt: Side Effects in Education


 


Hello. I am Sean Owen, Associate Research Professor and Assessment Manager at the Research and Curriculum Unit (RCU) at Mississippi State University. Founded in 1965, the RCU benefits K-12 and higher education by developing curricula and assessments, providing training and learning opportunities for educators, researching and evaluating programs, supporting and promoting career and technical education (CTE), and leading education innovations. One of my roles at the RCU focuses on providing our stakeholders with practical strategies to better inform their practices and guide future processes.

Lessons Learned:

Technology can be your best partner. We can all agree that technology has advanced at a rapid pace in recent years. Muhammad Yunus said, “While technology is important, it’s what we do with it that truly matters.” In our group, we have started leveraging Google Forms with a combination of add-ons to increase the efficiency of our classroom observations. One of the keys to implementing an effective observation system is using observation rubrics that are concise, focused, easy to use, and digestible.

Hot Tip:

Explore add-ons for Google tools. There is a vast community of developers creating add-ons for Google’s suite of tools. As Alice Keeler has shown in recent months, anyone can create add-ons for the Google suite to best suit their own work processes. As we continue to forge ahead in a climate of tight budgets, cost-effective and efficient practices have become even more important for evaluators. To learn more about developing add-ons, navigate to the Google Developer site.
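To make the Forms-plus-add-on idea concrete, here is a minimal sketch of the kind of scoring logic such a tool might contain, written in plain JavaScript (the language of Google Apps Script). Everything here is hypothetical: the `RUBRIC_SCALE` labels and the `observationToRow` function are illustrative inventions, not the RCU’s actual instrument or any published add-on.

```javascript
// Hypothetical rubric scale, e.g., the choices in a Google Forms grid question.
const RUBRIC_SCALE = { "Not Observed": 0, "Emerging": 1, "Proficient": 2, "Exemplary": 3 };

// Convert one observation's rubric labels into a flat spreadsheet row:
// [timestamp, teacher, meanScore, ...itemScores]. Unknown labels score 0.
function observationToRow(timestamp, teacher, responses) {
  const scores = responses.map((label) => RUBRIC_SCALE[label] ?? 0);
  const mean = scores.reduce((a, b) => a + b, 0) / scores.length;
  return [timestamp, teacher, Number(mean.toFixed(2)), ...scores];
}

// In an actual Apps Script add-on, an installable onFormSubmit trigger would
// read the response and append the row to a tracking sheet, e.g.:
//   SpreadsheetApp.getActiveSheet().appendRow(observationToRow(...));
```

Because the scoring function is a pure function, it can be developed and tested outside Apps Script before being wired to a form trigger.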

Rad Resources:


 

Hi, I am Paula Egelson, research director at the Southern Regional Education Board in Atlanta. For this week of the aea365 blogs, board members from the Consortium for Research on Educational Assessment and Teaching Effectiveness (CREATE) will be sharing blogs associated with the tenets of the CREATE organization: assessment, teacher and principal effectiveness, program evaluation, and accountability.

CREATE has a long history with the Joint Committee on Standards for Educational Evaluation (JCSEE), which has been housed over the years at Western Michigan University’s Evaluation Center, the University of Iowa, and Appalachian State University. Since its founding in 1974, JCSEE members representing research and practitioner organizations, both nationally and internationally, have created and revised the Program Evaluation Standards, Personnel Evaluation Standards, and Classroom Assessment Standards.

Hot Tips:

Our focus today is on the Program Evaluation Standards and some of their uses. The Program Evaluation Standards apply to a wide variety of settings in which learning takes place, from schools and universities to nonprofits and the military. These 30 standards are organized around five key attributes: utility, feasibility, propriety, accuracy, and accountability. Each attribute includes the key concepts related to it, standards statements, implementation suggestions, hazards to avoid, case narratives, and references for further reading.

The program evaluation standards provide guidance and support reflective practice associated with:

  • Whether and when to evaluate,
  • How to select evaluators and other experts,
  • The impact of cultures, contexts and politics,
  • Communication and stakeholder engagement,
  • Technical issues in planning, designing and managing evaluations,
  • Uses and misuses of evaluations,
  • Issues related to evaluation quality, improvement and accountability.

Among other things, the program evaluation standards can help evaluators resolve common evaluation issues such as:

  • Stakeholder over-involvement in the evaluation,
  • Agency disagreement over the evaluation recommendations,
  • A contractor desiring an evaluation report with predetermined outcomes,
  • Stakeholders “sitting on” an evaluation report, and
  • A lack of data collection integrity (lack of timeliness related to data collection, supervisor review of employees’ survey responses, teachers reviewing an online test or survey before the administration begins, not following random sampling guidelines).

I encourage you to take an opportunity to access The Program Evaluation Standards and determine how these standards can be of best use to you and your colleagues. I look forward to hearing about your uses of the standards and to receiving your feedback on them.

Rad Resources:

Detailed information about the Program Evaluation Standards

Information about the work of the Joint Committee on Standards for Educational Evaluation

For more information about CREATE, please go to www.createconference.org. CREATE’s annual research and evaluation conference will take place at the College of William & Mary in Williamsburg, Virginia, on October 11 and 12, 2018. We hope to see you there!

 


 

Hello, my name is Jayne Corso and I am the community manager for the American Evaluation Association. 

Many turn to social media to promote their services or expertise. Although the sites are a free resource, you do have to dedicate time and energy to creating relevant content on a regular basis. I have listed just a few reasons why it is important to post frequently on social media.

Hot Tip: Keep Your Followers Informed

Social media is one of the best platforms for creating connections and increasing awareness of your services or expertise. However, many small companies or organizations misuse this tool by posting only sporadically: once or twice a month, six times a quarter, perhaps 15 posts a year in total. This is not enough for your content to stay relevant with your followers. To keep your followers engaged and build relationships, you should hold ongoing social conversations.

A posting schedule is unique to each organization. Some organizations will have the resources and content to post multiple times a day, while others will have the resources to post once or twice a week. Find the balance that matches your content and staff resources. Once you start posting on a routine schedule, your posts’ reach and impressions will increase.

Hot Tip: Stay Relevant in Important Conversations

Social media is not only a great tool for getting your message across; it can also help you participate in important conversations taking place online. Twitter, Facebook, and LinkedIn serve as great listening tools. Use these platforms to find relevant news and industry trends being shared by others. You can then share these resources with your followers.

Sharing content is also a great way to fill in the gaps of your content calendar, especially during your slow season when you don’t have as much news to share.

Hot Tip: Posting Improves Search Engine Optimization

Posting regularly on social media helps with your social page’s SEO. Google’s algorithms favor sites that are updated frequently, and the same goes for social media pages. Posting on a regular schedule will increase your social page’s search ranking, resulting in increased traffic to your social page.

Get busy posting!


· · · · ·

Hello, I’m Martha Brown, and I started my consulting business in 2016 thanks to some encouragement from a group of passionate independent consultants. I was recently inspired by a keynote speaker, Lea Bill, at the 2018 CES Conference. She displayed a logic model that was circular and artistic, and it included the values of the people she worked with. I immediately thought of one of my clients, a botanical garden, and how I could use their stated values to deepen their thinking about their programs.

Like Kylie Hutchinson’s data parties, I decided to have a “logic model party” full of big sheets of paper and lots of colored markers. We started by determining outcomes, which had never been done. My job was to ask the right questions to the garden’s Executive Director, the Program Manager, and a very hands-on and knowledgeable board member. This was more difficult than I imagined, as the Program Manager’s thinking constantly defaulted to activities. But after an hour, we had our first working draft of outcomes.

[Image: handwritten chart of Kids & Schools desired outcomes]

Using Bob Williams’ questions about assumptions, I guided them into digging deeper about why they believed these were reasonable outcomes. From there, we worked backwards, and after three hours we had a very colorful logic model taped to the wall. It is only the beginning of our work, but in the process of making the first draft, something beautiful sprouted. They decided to approach one school and invite it to participate in a whole-school partnership, which we can evaluate in several ways over three to five years. None of us saw that coming, and it is a great idea. My work with this client went from a very small two-year contract for assessing student learning in a single program to a potentially long-term contract evaluating a new model in science education. Because I don’t hesitate to share that I explore new approaches and bring them to our work together, my short-staffed client trusts they have a real partner, a consultant and evaluator who looks out for them. It’s a win-win.

Lessons Learned:

Ask the right questions. Then…

Step back. While the team brainstormed, I took notes on big pieces of paper that led to 7 pages of typed minutes.

It can’t all be done in one session. We have more work to do, but we planted many seeds on fertile soil.

Organic is good. If the seeds we planted grow, I will be conducting a longitudinal evaluation on a new model that may have an impact on how educators think about teaching science.

Hot Tips for Independent Evaluators:

Work with organizations and programs that inspire you. Have fun at your job.

It’s OK to start small. You never know what will come of it.

Get out of the way. Your clients are the experts.

Learn as much as you can about as much as you can – and bring it back to your clients.

 

The American Evaluation Association is celebrating IC TIG Week with our colleagues in the Independent Consulting Topical Interest Group. The contributions all this week to aea365 come from our IC TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Melanie Hwalek

My name is Melanie Hwalek. I have been practicing evaluation for almost 40 years. I’ve always taken for granted my knowledge of social science research design. Recently, I had a profound experience that made me realize how this knowledge can be used to empower voices from the field.

Hot Tip: Respect the power of our social science research knowledge and use it for the greater good.

I learned this lesson from BABESWORLD, a systems approach to healthy living and substance use disorder prevention. BABESWORLD began 40 years ago as BABES, a K-12 puppet-based storytelling curriculum designed to help young people develop positive living skills through accurate, non-judgmental information about the use and abuse of alcohol and other drugs. BABESWORLD’s design emanated from the personal recovery of its designer and hours of storytelling from recovering adults. Decades of stories from BABESWORLD participants illustrate how it has profoundly impacted their lives.

My organization was hired to help BABESWORLD become “evidence-based.” Certification as “evidence-based” would qualify BABESWORLD for U.S. federal funding. At the contract kickoff meeting, the BABESWORLD CEO asked me to read a chapter in Antoine de Saint-Exupéry’s The Little Prince, a fable written in the 1940s in which a little prince travels from his home on a star to visit other stars in the galaxy. On each star he visits there is a parable. The parable the CEO asked me to read was about a Turkish astronomer who had identified, for the first time, the little prince’s home. In his turban and robe, the Turkish scientist presented his findings at a galaxy-wide conference of astronomers. Nobody believed him. Then the king of the Turks mandated that all people dress in European clothes or face death. Ten years later, the Turkish astronomer made the exact same presentation to the exact same conference of astronomers. This time he was dressed in an elegant European suit. The conference hailed him for his discovery. After I read this story, the CEO said to me, “I need you to dress BABESWORLD in European clothes.” This was a profound realization for me of how my knowledge of quasi-experimental design could be used to translate what BABESWORLD knew from decades of anecdotal evidence (Turkish) into the language of the U.S. scientific community (European). It humbled me to think of our social science research knowledge as “power,” and to realize that we can use this power to give voice to those in the field.

Rad Resources: To learn about BABESWORLD, please visit: www.babesworld.org

Hazel Symonette describes this translational power as boundary spanning, and the evaluator as a boundary spanner. Read Symonette’s chapter on culturally responsive evaluation in Hood et al.’s Continuing the Journey to Reposition Culture and Cultural Context in Evaluation Theory and Practice.

 



Hello! My name is Norma Martínez-Rubin. As an independent evaluation consultant, I specialize in evaluating nonprofits’ and foundations’ disease prevention and health promotion programs and projects. I’m particularly drawn to initiatives designed to address health, economic, and educational inequities. Because of the complexity associated with such efforts, I’ve been part of multidisciplinary evaluation teams in which, after labored discussions of evaluation approaches and data-collection strategies for our evaluation proposal(s), one of us is tasked with developing a budget to accompany a proposal.

I recall an early attempt where my team lead strongly believed we would increase our chances of obtaining a contract award by positioning ourselves as the lowest bidder. At the time, I didn’t voice my reluctance, believing that team member to be well experienced in such matters. In retrospect, the low-cost approach was not the defining factor in contract selection. Nor would I want it to be now, regardless of which side of the evaluation proposal reviewers’ table I occupy.

What has changed for me since I erroneously succumbed to the notion that low-cost evaluation services equal winning contracts is a continual effort to align my service offerings with prospective clients’ needs. This requires being certain that my core services (not necessarily different from other evaluators’) are presented along with features that enhance a prospective client’s desire to seek my services. A greater perception of value is the “value added” to an ordinary services offer. Value-added services are not cheap, low-cost menu items, but extraordinary services that support clients’ business-related reasons for entering into a contract. Among those are activities that save time, provide peace of mind, and increase efficiencies. Developing an evaluation proposal budget with this in mind ties one’s technical knowledge to business savvy. It contributes to the mutual satisfaction of a proposal that’s fair to both client and evaluator.

Hot Tips:

  • Think of what you are willing to do if presented with certain work conditions, e.g., access to decision makers, ready access to evaluation project stakeholders or data, staff availability to negotiate evaluation plans, acceptable distance to evaluation sites, a reasonable number of drafts required before finalizing a data presentation. Note: These are things that, if provided, are likely to prevent or reduce scope creep.
  • Create an outline of your core services, i.e., the technical skills you can demonstrate and enjoy using while serving a prospective client’s evaluation needs.
  • Create a list of your “value-added” services, i.e., what, in addition to your core list of evaluation services, would delight your clients? (Think: what would make your prospective clients’ work lives easier?)
  • Quantify, in time and monetary terms, the cost of your core and value-added services.

Lesson Learned:

  • Think of ways to bundle your services so when opportunities exist, you are prepared to describe them to prospective clients.
  • Value-added services may be intangibles, too! Don’t undersell your interpersonal skills and relationship-building abilities.

Got value-added services? Sure you do! Think: What delights prospective clients? What delights me?


 


Hello, my name is Kate Clavijo. I earned my degree in Educational Program Evaluation from the University of Louisville in 2002, and I have been working as an evaluator ever since. Over the years, I have held full-time evaluator positions and worked for non-profits while doing evaluation consulting on the side. About eight years ago, I left a “traditional” evaluation job and became a full-time consultant. In the early days of full-time consulting, I budgeted most of my time for drumming up business and networking with potential clients. This was time well spent, but quite isolating and lonely. My breakthrough moment came after attending an AEA conference and realizing that what I missed the most was sharing with evaluators. I like to collaborate and talk about evaluation. The first place I turned when I returned from the conference was the AEA website. Through the AEA “Find an Evaluator” link, I found two people living in my area and sent them an email asking if they wanted to meet for coffee.

The first lived right around the corner. After several coffees and good conversations about evaluation, she asked for help with one of her projects.

I said yes…and when she moved away to greener pastures she passed on what, to this day, is my favorite client.

I reached out to the second person after admiring her website. We have met for coffee and lunch over the years. During our first meeting, she provided me with inspiration, friendship and some excellent resources related to evaluation consulting.

Recently, she asked “Can you help with conducting an interview?”

I said yes…and was exposed firsthand to an evaluation of a complex initiative where networks of people and organizations are changing systems in local communities, and I saw a very innovative use of graphic presentation.

Lessons Learned:

Reach out to other evaluators. There is no need for an agenda – friendship and talking about evaluation are the immediate rewards.

Say yes. When an evaluator asks for help, just say yes. You never know what you will learn, who you will meet and how it will shape your own evaluation practice.

Rad Resource:

Find an Evaluator is a free searchable online resource provided by the American Evaluation Association. It includes the contact information and area of expertise of hundreds of evaluators across the country and the world. Most people might want to start by searching for evaluators close to home, but you can also search for evaluators working in a specific niche.

 


 

