AEA365 | A Tip-a-Day by and for Evaluators


Hi, this is Pat and Tiffany. We are doctoral candidates in Evaluation, Statistics, and Measurement who work for the Graduate School of Medicine at the University of Tennessee. We completely redesigned a statistics and epidemiology curriculum where none of the previous instructors had formally outlined what they wanted their residents to learn. We not only created a brand new syllabus with learning objectives, but also taught the courses and assessed baseline knowledge and outcomes.

Ask yourself: as an assessment professional (or course instructor), how many times have you been faced with generating useful assessment data from a vague or altogether absent set of learning goals?

Starting from nothing, we had to find a way to gather useful assessment data through the creation of new instruments. Here are five tips that can be used in any assessment or evaluation where there are vague or unclear learning goals.

Hot Tips:

One: Know Your Situation

  • Learning environment
    • What is being taught? (For us, statistics and research methods—not everyone’s idea of exciting.)
    • What is the nature of the course? (e.g., required vs. optional)
  • Work environment
    • Do the students have external obligations that need to be considered? (In our case, hospital “on-call” obligations.)
  • Population-specific factors
    • What are the factors associated with your target population? (e.g., age, learning style, background with the topic)
  • Availability of resources
    • What are your time, personnel, and financial constraints?

Two: Clarify Your Purpose

  • Ask yourself two questions:
    • How will the instructor(s) benefit from the assessment results?
    • How will the students benefit from the assessment results?

Three: Use What You Have

  • Play detective: gather the necessary background data
    • Sources include existing content, instructor/staff interviews, direct observation, the literature, and/or your previous experience.
    • This background data offers three benefits: (1) it shows what instructors think the students are learning; (2) it shows what is actually being taught; and consequently (3) it reveals where gaps exist in the curriculum.

Four: Fit the Instrument to Your Purpose, Not the Other Way Around

  • Always consider situational factors (tip one), and align assessment strategies to the most efficient method for that situation.

Five: Get Consistent and Critical Feedback

  • Assessment development/administration must be viewed as a dynamic and iterative process.
  • An instrument is developed or modified; it is tested; the testing generates feedback; and the feedback leads to modifications to both the assessment and the teaching and learning activities.


We hope these tips will be helpful for your assessment work; good luck!

Rad Resources: For more information on assessment we strongly recommend the following…

  • For a copy of this presentation, along with other resources, check out my SlideShare page

The American Evaluation Association is celebrating Assessment in Higher Education (AHE) TIG Week. The contributions to aea365 all this week come from AHE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · ·

My name is Linda Delaney and I am a Learning Coordinator for the City of Memphis Office of Talent Development. I have been a consultant in evaluation for over 10 years. I work with Dr. David Fetterman as an evaluator for the Minority Sub-recipient Grant Office in Arkansas, a statewide tobacco prevention evaluation.

In Empowerment Evaluation, simplicity is a key factor to ensure that participants can easily apply concepts that help them to identify and clarify their mission, assess their performance, and strategically plan activities to accomplish their long-term goals.

In addition, as an administrator, trainer, and facilitator of Empowerment Evaluation, I provide evaluative (positive and constructive) and non-evaluative feedback along the way. Providing feedback to community and program staff members is critical and helps guide the learning process.

Rad Resource: Using Effective Communication Skills PowerPoint

Hot Tip – Provide evaluative feedback tools: Participants like measurements that simply let them know where they are: on track or not.

Hot Tip – Provide positive evaluative feedback: Participants are encouraged when they hear from the evaluator that they have done a good job. It reinforces the constructive behavior and activity required to move the project forward.

Hot Tip – Provide constructive evaluative feedback: When you need to instruct and guide participants on how to correct actions, constructive evaluative feedback is necessary. The term “constructive feedback” often generates a negative initial response, but community or staff member response is determined by the manner in which the message is delivered. Avoiding a tone of criticism that sounds like they are being “chewed out” goes a long way in producing the desired results. Constructive feedback shared in supportive tones can help to put things back on track.

Hot Tip – Provide non-evaluative feedback: Non-evaluative feedback does not assign a value to actions. It simply acknowledges the actions and/or feelings of people. Non-evaluative feedback can be as simple as saying, “thank you for your input” or “that’s an interesting way of looking at things”.

Lessons Learned: When would participants want to hear your feedback? When they are still thinking about the work and can still do something about it. Giving immediate and appropriate feedback helps participants hear it and use it while the performance or actions in question are still fresh in their minds.

The American Evaluation Association is celebrating CPE week with our colleagues in the Collaborative, Participatory, and Empowerment TIG. The contributions to aea365 all this week come from our CPE TIG colleagues.

· · ·

Hi, my name is Bikash Kumar Koirala. I work as a Monitoring and Evaluation Officer at the NGO Equal Access Nepal (EAN), based in Kathmandu, Nepal. I have been practicing monitoring and evaluation for over five years, with a focus on development communication programs. A research project that EAN collaborated on, Assessing Communication for Social Change (AC4SC), developed a participatory M&E toolkit based on our experiences. One of the modules in this toolkit is the Communication Module, which is summarized as follows.

As a result of AC4SC, the communication systems in our organization improved a lot and became more participatory. We began to understand that effective communication and continuous feedback is essential to the success of participatory M&E. Communication inside organizations and outside can be quite challenging sometimes because different people have different perspectives and experiences.

Lessons Learned

Community Involvement: After the AC4SC project, the level of engagement with communities by the M&E team increased considerably. Their involvement in ongoing participatory research activities and their critical feedback have proved very useful to our radio program development. This has increased community ownership of our programs. In addition to work undertaken by the M&E team, this research is conducted by a network of embedded community researchers (CRs). These activities have produced research data, which is analyzed and triangulated with other sources of data (such as listeners’ letters) to produce more rigorous results.

Internal Communication: Regular constructive feedback related to program impact and improvement is given to content teams by the M&E team.  This has increased dialogue and cooperation between the M&E and content team members.  Before the AC4SC project, content team members didn’t usually take M&E findings into account because they felt that they already knew the value of the program content through positive feedback from listener letters. The value of M&E has now been recognized by the content teams. They now ask for more in-depth data to generalize feedback they receive. The M&E team addresses this through research and analysis using many different forms of data from varied sources.

Use of New Communication Technology: The M&E team has been analyzing SMS polls, text messages, and letter responses, and triangulating these with the CRs’ research data and short questionnaire responses to present more rigorous results to program team members, donors, and other stakeholders.

Some Challenges: In participatory M&E it is important to understand the roles of everyone involved in the process. Effectively presenting results to support better communication and utilization of M&E findings among different stakeholders is an ongoing challenge. Finding the time to undertake participatory M&E effectively is also an ongoing challenge.


· · ·

My name is Kathleen Dowell and I am President of Partners in Evaluation & Planning. As a full-time evaluation consultant and firm lead, I continually strive to improve our practice and service to clients.

Hot Tip: Want to assess the quality of the services you provide to evaluation clients? Then the Client Feedback Form (CFF) may be for you! Developed by several members of the American Evaluation Association’s Independent Consulting TIG out of a desire to “walk the talk,” the CFF is a method of gathering systematic, specific feedback from your clients about your work. As many evaluators have discovered in using this tool, each CFF completed by a client can provide valuable information about how the client perceives your value and your work. Use of the CFF thus goes beyond simply finding out whether a client is satisfied; it provides client assessments of specific dimensions of performance. It doesn’t matter if you send out one CFF or many; each one can be important in assessing your work as an evaluator. The CFF is downloadable from http://bit.ly/ClientFeedbackForm


· · ·
