AEA365 | A Tip-a-Day by and for Evaluators

Category: Teaching of Evaluation

I am Dawn X. Henderson, a past fellow of AEA's Graduate Education Diversity Initiative (GEDI) and a member of the Annie E. Casey Foundation's Expanding the Bench Initiative. I recently developed an undergraduate seminar course in Community Psychology at a Minority Serving Institution. Program evaluation is a core competency in Community Psychology, and modeling evaluation was critical to passing my evaluation "wisdom" on to a group of "underrepresented" students through a partnership with a nonprofit. I aim to share some hot tips and lessons learned with those interested in teaching and working in evaluation.

Hot Tips:

  • Practice logic models. In preparation for the evaluation report, the class met with the Executive Director to learn about the nonprofit, focusing on its programming and key activities. Building logic models allowed students to become familiar with the services the nonprofit provides and to draw visual connections between inputs, activities, and so on.
  • Recognize the individual strengths and knowledge of your students/team. Students worked in pairs on the quantitative and qualitative analyses; each pair matched a student familiar with the methodology with one less experienced. The less experienced students gained new data analysis skills, and the pairs collaboratively compiled findings into text and graphs.
  • Divide the report into sections and assign main duties and responsibilities. Each section of the evaluation report had a student leader responsible for collecting information, doing the majority of the writing, and maintaining communication with students and faculty. Each student also reviewed and summarized an article related to the nonprofit's programs and services; the summaries were integrated into the discussion and recommendations sections of the report.

Lessons Learned:

  • Maintain lines of communication with the nonprofit on progress. Keeping in contact with the nonprofit about status, challenges, and its needs generates feedback and recommendations that improve the report's content. This process also helps undergraduate students understand why involving the nonprofit throughout is essential to ensuring the evaluation report accurately represents its program.
  • Develop timelines for important milestones/benchmarks. The majority of the evaluation report was completed at the end of the academic semester, making the process stressful for the students and for me. Building in benchmarks for each section of the evaluation report would have provided more opportunities for feedback and editing. I literally had to go through the entire report the night before the draft was due to the nonprofit.

The students approached the preparation of the evaluation report with limited knowledge of evaluation but some familiarity with traditional research in psychology. In the end, students discovered ways to translate research processes into evaluation, and the nonprofit received useful information to support its programming and funding efforts.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello, I am Maggie Miller, the principal of Maggie Miller Consulting. I conduct program evaluation for nonprofits in the Denver/Boulder area. Welcome to Colorado! We Coloradoans tend to be very friendly; when you meet us at Evaluation 2014, we will be very happy to share any information about Colorado that we can.

Coloradoans also like to learn about evaluation. When I'm not consulting, I teach various evaluation classes and workshops in the greater Metro Denver area. There are many opportunities for program staff in nonprofits (and the private sector) to learn about evaluation. These are a few organizations that I've taught for: the Colorado Nonprofit Association, the Nonprofit Cultivation Center, Mountain States Employers Council, the Nonprofit Management program at Regis University, and the Denver Evaluation Network (DEN), which serves Denver-area museums and cultural institutions. The staff at the Denver Public Library system were very receptive to a series of evaluation planning classes I gave, and once I even presented a logic modeling workshop for the HR department at New Belgium Brewery. Hey, everyone can benefit from thinking about outcomes!

(P.S.: While I’ve never taught for them, I should mention that there are some large evaluation firms in town that offer excellent training to our evaluation-oriented Coloradoans.)

Lessons Learned: Anyone can learn about evaluation and improve their skills. It’s important to keep these teaching tips in mind.

  • Assess where your students “are at” in terms of their experience and existing skills (which may include evaluation-related things like teaching, research, project management, and facilitation).
  • For any given teaching opportunity, figure out what’s most important to teach. Keep your lesson focused on a few important ideas which they will remember and use, rather than giving them an overwhelming smorgasbord.
  • Facilitate hands-on interactive activities to help people engage deeply with new ideas.
  • Use examples that are relevant to your students, and encourage them to apply what they learn to their professional (and even personal) lives.
  • Whenever possible, get them to review what they learned. This is easier in multi-session workshops or classes, but you can still do it before and after breaks in a one-time workshop.

Hot Tip: Some of the places I’ve taught are great resources for you when you are in town! Check out Denver’s wonderful DEN-participating museums, our fabulous public library, and taste some great New Belgium beer at many restaurants and bars in the Denver area.

We’re looking forward to October and the Evaluation 2014 annual conference all this week with our colleagues in the Local Arrangements Working Group (LAWG). Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.

 

Hello! I’m Sheila B Robinson, aea365’s Lead Curator and sometimes Saturday contributor with a great resource for evaluators and one of my perennial go-to guides for evaluation work.

Do you ever need to explain the central limit theorem, the counterfactual, or construct validity and need a little help? Do you confuse mediating and moderating variables? Do you know when to use a Mann-Whitney U vs. a Wilcoxon? Perhaps you came across the term Geisteswissenschaften and want to know what it means, just in case you need it for a research report, or it's ever the answer on Final Jeopardy.

More likely, perhaps, you're reading a journal article and are not familiar with the statistics or methodology used. You don't want to head back to grad school or read an entire chapter on the topic; you just need a little boost to make better sense of what you're reading.

Hot Tip: Sure, you could Google all of these. Sometimes, however, a well-written book you can just pull off your shelf will do the trick, and there's no chance of being distracted by ads or irrelevant search results like the latest Justin Bieber scandal. The Dictionary of Statistics & Methodology: A Nontechnical Guide for the Social Sciences by W. Paul Vogt and R. Burke Johnson features about 2,800 brief, plain-English, nontechnical definitions, cross-referenced so that you'll never find a definition you can't understand because it contains other technical terms. The book is up to date in its 4th edition, published in 2011.

Do you teach evaluation, research methods, or statistics, or need to teach some key concepts to evaluation stakeholders? Are you in grad school and just learning social science research methods? This handy guide is a great resource for instructors and students alike, as it offers both definitions and relevant, understandable examples. Are you a visual learner? The book also features dozens of easy-to-interpret charts, tables, graphs, and figures.

Cool Trick: Use this text when you need not only methodological and statistical terms defined and explained, but also some theoretical and philosophical terms (Ex. postmodernism, positivism, empiricism, realism, etc.). The authors’ rationale is that readers will likely come across these when reading research reports. If you’re anything like me, you’ve only a minimal number of brain cells available for definitions of words ending in -ism, so this nifty tool at your fingertips is a welcome respite from wracking your brain to remember them.

Hot Tip: The section Suggestions for Further Reading features an up-to-date, organized list of recommended books on Elementary Methodology and Statistics; More Advanced Works on Methodology and Statistics; and Dictionaries and Reference Works, plus lists of Some Useful Websites on Statistics and Methodology organized by subtopic.

BONUS: AEA members receive discounts on many books through our publishing partners. Click here for details.


Hello, my name is Jeanne Hubelbank. I am an independent evaluation consultant. Most of my work is in higher education, where, most recently, I help faculty evaluate their classes, develop proposals, and evaluate professional development programs offered to public school teachers. Sometimes I am asked to make presentations or conduct workshops on evaluation. When doing this, I find it helpful to know something about the audience's background. Clickers, hand raising, holding up colored cards, standing up, and clapping are all ways to approach this. A recent AEA365 post, Innovative Reporting Part I: The Data Diva's Chocolate Box, which showed how to present results on candy wrappers, served as the impetus for another way to introduce evaluation and to assess people's understanding of it.

Instead of results, write evaluation terms such as use, user, and methods on stickers and place them on the bottom of Hershey's Kisses®, one word to a kiss. Participants arrange their candy in any format they think represents how one approaches the process of conducting an evaluation. This gives you a quick view of how the participants see evaluation, and most people like to eat the candy afterwards.

Hot tips:

  • Use three-quarter-inch dots
  • Hand write or print terms you want your clients to display
  • Besides Hershey's Kisses®, provide Starbursts® for those who are allergic or averse to chocolate
  • Use different colored kisses for key terms, such as use and uses in silver and assessment in red, for a quick view on where people place them in the process
  • Wrap each collection of candy terms into a piece of plastic wrap and tie with a curled ribbon
  • Ask people to arrange candy in any format that they think represents how one approaches the process of doing an evaluation
  • You can do this before and after a presentation, but if you do it again, remind people to wait to eat.

Rad Resources:

Susan Kistler’s chocolate results

Stephanie Evergreen's cookie results and her book Presenting Data Effectively: Communicating Your Findings for Maximum Impact.

Hallie Preskill and Darlene Russ-Eft’s book Building Evaluation Capacity: 72 Activities for Teaching and Training.

Michael Q. Patton’s book Creative Evaluation.


I'm Sue Griffey; I lead the Evaluation Center at Social & Scientific Systems, Inc. in Silver Spring, MD. Outside of work, I mentor professionals in evaluation and public health, both in formal programs (the Cherie Blair Foundation (CBF), the APHA-SA National Mentoring Program, the Aspire Foundation, and the Rollins School of Public Health Annual Mentoring Program) and through individual connections.

I have noticed over the past few years, as my mentoring work has increased, that my ability to assess mentoring readiness is critical to the success of the mentoring relationship.

Hot Tip: Mentoring is a voluntary act. Don't just assume that a Mentor-Mentee pairing means both are ready. It may appear as Mr+/Me+ (as in the table below), but the pairing may actually fall in a discordant cell.

aea365_Griffey1

aea365_griffey2

Mentee isn't ready: The mentee may not realize she isn't ready for mentoring; as the mentor, you may need to help her see that. A mentee may say she needs mentoring when it really isn't what she needs. As the mentor, develop and apply readiness metrics as you would in an evaluation.

Hot Tip: I have a 3-email rule. If I have to track down the mentee more than 2 times because he has missed a scheduled session or not confirmed a session time, my third email lays out my perspective that there may be a mismatch in what the mentee is able to do (as shown below).

aea365_griffey3

Hot Tip: Don’t rule out a mentoring program because you don’t think you offer the program’s content or focus. I became a CBF mentor in its initial program even though I didn’t necessarily have the business focus I thought they wanted. My match was a mentee who two years later still benefits from my experience in public health and leadership.

aea365_griffey4

Mentor isn't ready: If you have agreed to mentor, respect the commitment or acknowledge that you can't.

As the mentee, make sure you are getting what you need from the mentor. If you aren't, don't be afraid to let the mentor or the mentoring program manager know. It may be that the mentor really isn't ready for the mentoring relationship.

Hot Tip: It may help you as a mentee to think of the relationship overall, and of each session, as answering three questions:

  1. What do you need right now?
  2. What do you want to do and why?
  3. How can your mentor help you?

Hot Tip: Being a mentee is as important as your work or schooling. Be proactive in communication: check your email daily, and let your mentor know your schedule, your time zone, and how and when to reach you.


Greetings!  I am Carl Brun, a social worker turned professor turned evaluator.  I have taught in the social work department at Wright State University in Dayton, OH for 21 years.

I teach evaluation in every research methods course I teach, whether in a human services course or an undergraduate or graduate social work course. Students can relate to evaluation as an activity that actually occurs in social services, compared to research, which they see as happening only in universities.

Lessons Learned:  Have students do evaluation.  Partner with community agencies to have students apply their evaluation skills to help develop and implement evaluations.  In one graduate level course, my students conducted a door-to-door needs assessment.  Their efforts helped the agency receive a $733,000 grant to begin a federally qualified health clinic.  See http://www.talberthouse.org/media/documents/Newsletter_2013%20Fall.pdf

Hot Tip:  Demystify evaluation in the very first class.  I have students discuss ways they use research in their everyday lives to make decisions, such as “how did you research coming to this university or choosing this major?”  I ask them to think of ways they have evaluated others (ex. student evaluations of professors) and been evaluated (ex. by a supervisor).

Hot Tip:  SCREAM.  This is the acronym I use to emphasize values I support for every evaluation:  Measure Strengths. Be Culturally competent.  Evaluate within the Resources you have.  Ethics, ethics, ethics.  Get Agreement from all stakeholders on all aspects of the evaluation.  Measure Multiple systems levels.

Hot Tip: I simplify research methods by discussing three types of evaluation questions and three types of data collection.  Exploratory questions = qualitative methods. Explanatory questions = quantitative methods. Descriptive questions = both.  All questions can be answered by asking questions, observation, or secondary data analysis.  I have a chart that puts these pieces together to help the students develop their research design.

Resources:  There are many electronic discussion forums in which teachers share their syllabi and teaching tips.  Among the ones I use are AEA’s Evaltalk, and one established primarily for social work educators  (http://www.bpdonline.org/bpd_prod/BPDWCMWEB/Resources/BPD-L_List/BPDWCMWEB/Resources/BPD-L_Email_List.aspx?hkey=bf39d2a2-7005-4db1-a6d0-e3587cc98956#Join).  I love talking about teaching evaluation.  Feel free to contact me at carl.brun@wright.edu.

The American Evaluation Association is celebrating SW TIG Week with our colleagues in the Social Work Topical Interest Group. The contributions all this week to aea365 come from our SW TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Theresa Murphrey, and I am on the faculty at Texas A&M University. I teach both traditional classroom-based students and distance-based learners from around the world. I have been working with a new tool that I have found to be extremely useful in facilitating teaching and learning.

2014 Update: I have now used this tool for the past several years and still find it to be extremely useful in facilitating teaching and learning.

Rad Resource: Jing is a technology that lets you capture what is seen on the computer screen, along with audio from a microphone, to create a video that can be easily shared online. Jing can enhance, extend, and support the delivery of course content by having students apply what is presented in the course. Technologies such as Jing give students ways to express themselves and increase sensory input, raising the chance that we can engage the student and enhance learning. Using such technologies in an experiential process allows students to take ownership of their ideas and communicate them more clearly.

Approaches to Using Jing for Assignments:

  • Ask students to find an answer to a question using the Internet and share that answer in a Jing recording of the website, along with their explanation.
  • Ask students to review specific material and, in a Jing recording, relate a particular finding to themselves personally.
  • Ask students to create a presentation and record it using Jing.
  • Verbally annotate electronically submitted student work, providing detailed and contextually clear feedback and guidance.

Jing may be found online at http://jingproject.com

2014 Update:  I demonstrated Jing as part of the AEA Coffee Break Demonstration Series on February 18, 2010. To learn more, click here. The webinar recording is free for AEA members. Click here for a handout on Jing, free for everyone!

The American Evaluation Association is celebrating Best of aea365 week. The contributions all this week are reposts of great aea365 blogs from our earlier years. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Greetings colleagues. My moniker is Michael Quinn Patton and I do independent evaluation consulting under the name Utilization-Focused Evaluation, which just happens also to be the title of my main evaluation book, now in its 4th edition. I am a former AEA president. One of the challenges I’ve faced over the years, as many of us do, is making evaluation user-friendly, especially for non-research clients, stakeholders, and audiences. One approach that has worked well for me is using children’s stories. When people come to a meeting to work with or hear from an external evaluator, they may expect to be bored or spoken down to or frightened, but they don’t expect to be read a children’s story. It can be a great ice-breaker to set the tone for interaction.

Hot Tip: I first opened an evaluation meeting with a children’s story when facilitating a stakeholder involvement session with parents and staff for an early childhood/family education program evaluation. The trick is finding the right story for the group you’re working with and the issues that will need to be dealt with in the evaluation.

Rad Resource: Dr. Seuss stories are especially effective. The four short stories in The Sneetches and Other Stories are brief and loaded with evaluation metaphors. "What Was I Scared Of?" is about facing something alien and strange — like evaluation, or an EVALUATOR. "Too Many Daves" is about what happens when you don't make distinctions and explains why we need to distinguish different types of evaluation. "The Zax" is about what happens when people get stuck in their own perspective and can't see other points of view or negotiate differences. "The Sneetches" is about hierarchies and status, and can be used to open up discussions of cultural, gender, ethnic, and other stakeholder differences. I use it to tell the story, metaphorically, of the history of the qualitative-quantitative debate.

Hot Tip: Children’s stories are also great training and classroom materials to open up issues, ground those issues in a larger societal and cultural context, and stimulate creativity. Any children’s fairy tale has evaluation messages and implications.

Rad Resource: In the AEA eLibrary I've posted a poetic parody entitled "The Snow White Evaluation," which opens a book I did years ago (1982) entitled Practical Evaluation (Sage, pp. 11-13). Download it here: http://ow.ly/1BgHk

Hot Tip: What we do as evaluators can be hard to explain. International evaluator Roger Miranda has written a children's book in which a father and his daughter talk about what an evaluator does. Eva is distressed because she has trouble on career day at school describing what her dad, an evaluator, does. It's beautifully illustrated and creatively written. I now give a copy to all my clients, and it opens up wonderful and fun dialogue about what evaluation is and what evaluators do.

Rad Resource: Eva the Evaluator by Roger Miranda. http://evatheevaluator.com/


Hi, we are Osman Ozturgut, assistant professor, University of the Incarnate Word, Tamara Bertrand Jones, assistant professor, Florida State University, and Cindy Crusto, associate professor, Yale School of Medicine. We are members of the AEA Public Statement on Cultural Competence in Evaluation Dissemination Working Group.  We’d like to update you on our conference session.

The goals of the AEA Public Statement on Cultural Competence in Evaluation Dissemination Working Group include increasing awareness about the Statement and the resources available to increase use and application, regardless of the type of evaluation. In light of these goals, the working group formed a sub-group to prepare modules to be used in teaching evaluation. As its first task, this group has designed a curriculum module to introduce the Statement and its relevance to the field of education.

In this conference session, we sought feedback from participants about the use of video in our module, in terms of its use, relevance, and practicality. The significance of multimedia resources in evaluation is unquestionable. Whether we are designing an evaluation or presenting results to stakeholders, effective use of multimedia can shape the appropriate next steps. Participants commented on the design's effectiveness and offered suggestions to increase its use by academics and evaluation trainers.

First, we wanted to limit the first module’s video to 8-10 minutes so that it would serve as an introductory module and provide insights on the significance of the Statement and the definition and practice of cultural competence in evaluation. This video would include testimonials from experts on the significance of the AEA Public Statement on Cultural Competence in Evaluation. These statements would include the meaning of culture and cultural competency in evaluation and how evaluations reflect culture. The second part of the module would include accounts on the significance of acknowledging the complexity of cultural identity, recognizing the dynamics of power, identifying and eliminating bias in language, and employing culturally appropriate methods.

Next, we sought feedback on how we could effectively design such a video that uses time efficiently.

Lesson Learned: Participants' feedback confirmed that a more structured approach in the initial design phase, such as creating a storyboard before producing the video, would be important. Yes, this step may be time-consuming, but it is worth investing the time up front to help disseminate the significance of cultural competency in evaluation. We are more than willing to take on the challenge of learning new-to-us technologies!

Rad Resources: A storyboard is the next step once you have the concept and the script. It tells the story frame by frame and is a great way to begin the adventure! See http://www.storyboardthat.com/

This week, we’re diving into issues of Cultural Competence in Evaluation with AEA’s Statement on Cultural Competence in Evaluation Dissemination Working Group. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hey! I’m Stephanie Evergreen and among other things I run AEA’s Potent Presentations Initiative. Lately we’ve been working with eStudy presenters to reboot just five of their slides. Great speakers and solid content need to be reflected in polished slides. Here is the reboot for Scott Chaplowe.

Scott's workshop offers a great comparison between monitoring and evaluation: where they overlap and where they differ. Scott also interacts with his attendees quite often when he presents. I wanted his slide deck to reflect those same dynamic elements. Scott already knew how to use animation to guide attention to certain parts of a slide, so I continued to build on that where possible.

Before: Results Hierarchy

BEFORE

 

After: Results Hierarchy

AFTER

I applied a gradient to the red color on the Results Hierarchy slide, so that the color change and the slow build of the slide via animation make the idea of a hierarchy clearer. With some unnecessary text removed from that slide, the explanatory material can be set in a larger font, too. Each row in the table is animated to appear one at a time. Each arrow is also animated, so Scott can talk about the way Inputs feeds back to Activities for as long as needed without other distractions.

 

M&E and the Project Cycle

BEFORE

 

After: M&E Project Cycle

AFTER

 

When I told Scott I wanted to remake his slide on M&E and the Project Cycle, he let me know that a somewhat better diagram existed, but that he strongly preferred using animation to build each component of the diagram one at a time. Understandable. How can one animate a single image file? It took a lot of legwork: I cropped out each element of the better-looking diagram and reassembled the individual pieces into a coherent whole. Then I added the animation. Let me be clear: it was an enormous amount of work to crop each piece, and I still see some things I don't like, where I could have done a better job. It will not always be worth the effort to animate a diagram this way. It is probably only justified in cases like this, where the slide is essential to the talk, a real centerpiece (and you don't have access to the original file used to make the image).

Read about all 5 revised slides on the Potent Presentations Initiative site and look for Scott at this year’s conference.

