AEA365 | A Tip-a-Day by and for Evaluators

CAT | Teaching of Evaluation

Hi, we are Osman Ozturgut, assistant professor, University of the Incarnate Word, Tamara Bertrand Jones, assistant professor, Florida State University, and Cindy Crusto, associate professor, Yale School of Medicine. We are members of the AEA Public Statement on Cultural Competence in Evaluation Dissemination Working Group.  We’d like to update you on our conference session.

The goals of the AEA Public Statement on Cultural Competence in Evaluation Dissemination Working Group include increasing awareness of the Statement and of the resources available to increase its use and application, regardless of the type of evaluation. In light of these goals, the working group formed a sub-group to prepare modules to be used in teaching evaluation. As its first task, this group designed a curriculum module to introduce the Statement and its relevance to the field of evaluation.

In this conference session, we sought feedback from participants about the use of video in our module: its use, relevance, and practicality. The significance of multimedia resources in evaluation is unquestionable. Whether we are designing an evaluation or presenting results to stakeholders, effective use of multimedia can determine the appropriate next steps. Participants expressed their thoughts on the design’s effectiveness and provided suggestions that would increase its use by academics and evaluation trainers.

First, we wanted to limit the first module’s video to 8-10 minutes so that it would serve as an introductory module and provide insights on the significance of the Statement and the definition and practice of cultural competence in evaluation. This video would include testimonials from experts on the significance of the AEA Public Statement on Cultural Competence in Evaluation. These statements would include the meaning of culture and cultural competency in evaluation and how evaluations reflect culture. The second part of the module would include accounts on the significance of acknowledging the complexity of cultural identity, recognizing the dynamics of power, identifying and eliminating bias in language, and employing culturally appropriate methods.

Next, we sought feedback on how we could effectively design such a video that uses time efficiently.

Lesson Learned: Participants’ feedback confirmed the importance of a more structured approach in the initial design phase, such as creating a storyboard before producing the video. Yes, this step may be time-consuming, but the time spent in advance helps disseminate the significance of cultural competence in evaluation. We are more than willing to take on the challenge of learning new-to-us technologies!

Rad Resources: A storyboard is the next step once you have the concept and the script. It tells the story frame by frame, and is a great resource to begin the adventure!

Clipped from http://www.storyboardthat.com/

This week, we’re diving into issues of Cultural Competence in Evaluation with AEA’s Statement on Cultural Competence in Evaluation Dissemination Working Group. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

·

Hey! I’m Stephanie Evergreen and among other things I run AEA’s Potent Presentations Initiative. Lately we’ve been working with eStudy presenters to reboot just five of their slides. Great speakers and solid content need to be reflected in polished slides. Here is the reboot for Scott Chaplowe.

Scott’s workshop is a great comparison of monitoring and evaluation – where they overlap and where they differ. Scott also interacts with his attendees quite often when he presents. I wanted his slide deck to reflect those same dynamic elements. Scott already knew how to use animation to guide attention to certain parts of a slide, so I continued to build on that where possible.

Before: Results Hierarchy

BEFORE

 

After: Results Hierarchy

AFTER

I gradated the red color on the Results Hierarchy slide so that the color change and the slow build of the slide via animation make the idea of a hierarchy clearer. Removing some unnecessary text from that slide also allowed the explanatory material to be set in a larger font. Each row in the table is animated to appear one at a time. Each arrow is also animated, so Scott can talk about the way Inputs feed back into Activities for as long as needed without other distractions.

 

Before: M&E and the Project Cycle

BEFORE

 

After: M&E Project Cycle

AFTER

 

When I told Scott I wanted to remake his slide on M&E and the Project Cycle, he let me know that a somewhat better diagram existed, but that he strongly preferred using animation to build each component of the diagram one at a time. Understandable. How can one get a single image file to become animated? It took a lot of legwork: I cropped out each element of the better-looking diagram, reassembled the individual pieces into a coherent whole, and then added the animation. Now, let me be clear that cropping each piece was an enormous amount of work, and I still see some things about it that I don’t like, where I could have done a better job. It will not always be worth the effort it took to make the diagram animated. It is probably only justified in cases like this, where it is an essential slide for the talk, a real centerpiece (and you don’t have access to the original file used to make the image).

Read about all 5 revised slides on the Potent Presentations Initiative site and look for Scott at this year’s conference.


 

·

Hello! I’m Sheila B. Robinson, aea365’s Lead Volunteer Curator. I teach Program Evaluation Methods at the University of Rochester’s Warner School of Education, and am a grant coordinator for the Greece Central School District in Rochester, NY. In my spare time, I read, learn, and blog about evaluation, and it’s no exaggeration to say I never come away from a computer, book, or article on evaluation without learning something. “Ancora imparo” (“I am still learning”) is attributed to Michelangelo in his late 80s!

As I’m once again preparing my syllabus, I’m reflecting on a wealth of free and low-cost evaluation resources. Since I always want to improve the course for my students, I’m looking for new readings and activities to ensure my course is up-to-date, and that my students learn about the “big names” and big issues in the evaluation community today.

Lesson Learned: I’m convinced evaluators are the most collegial, collaborative, and generous people ever, and I’m always impressed with how many of them are willing to share their knowledge and resources with everyone.

Hot Tips:

1.) Fill your toolbox! Susan Kistler, AEA’s Executive Director Emeritus, has contributed numerous aea365 posts on free or low-cost technology tools. Search her name, or glance through the aea365 archive for links and descriptions.

2.) Join the conversations! Mentioned before, but definitely worth another look: AEA’s LinkedIn discussion, and EvalTalk – two places I’ve learned about the multitude of websites, textbooks, and articles on evaluation, many of which have made their way into my course. Here’s a link to a discussion on “Comprehensive set of websites on evaluation and research methods.” I recently asked EvalTalk for some “must-read journal articles for program evaluation students” and got great responses; some people even sent me their syllabi!  Cool trick: I’ve copied rich EvalTalk and LinkedIn discussions on a topic of interest (e.g. pre- and post-testing) to share with students as examples of the types of discussions evaluators have in “the real world” of evaluation work.

3.) Cull from collections! Who doesn’t love one-stop shopping? My favorite place for great collections is AEA’s site. Check out everything under the Reading, Learning, and Community tabs and all the links on the main page. Check out Evaluator and Evaluation blogs and evaluators on Twitter. Chris Lysy maintains a large collection of evaluation-related blogs at EvalCentral. Gene Shackman has amassed probably the largest collection of Free Resources for Program Evaluation and Social Research Methods.

4.) “Extend” your learning! Google “evaluation” + “extension” and find a universe of free tools and resources from university extension programs. Here are just a few: University of Wisconsin-Extension, Penn State Extension, NC Cooperative Extension, K-State Research and Extension. I stumbled upon this collection at the University of Kentucky’s Program Development and Evaluation Resources.

Apprendimento felice! (Happy learning!)


 

· · ·

I am Kristin Woods, a co-chair for the Graduate Student and New Evaluators Topical Interest Group (GSNE TIG); I am a PhD student at Oklahoma State University in the Research, Evaluation, Measurement, and Statistics program and a faculty member at Southwestern Oklahoma State University.

As a graduate student, I took an applied program evaluation course in which a fellow graduate student from another department at OSU (Brad Williams) and I completed a program evaluation for a state agency. As new evaluators just getting our feet wet, we learned a lot along the way, and many times after the fact. Working with a state agency presents many challenges for an evaluator. Add in the evaluators’ limited hands-on experience and the multitude of job duties and busy schedules of state agency employees, and it becomes a daunting task for students/new evaluators to complete successfully in a semester. However, the efforts of the faculty member (Dr. Katye Perry), our fellow classmates, and the state agency employees provided an experience unmatched by most, one that launched my passion for evaluation.

Hot Tip 1: Present a few low-stakes programs or program components for the students/new evaluators to choose from.

With shorter timelines, it is important that the evaluators already have a working knowledge of the evaluation model. Therefore, it is important for the students/new evaluators to have some input into the type of evaluation they will conduct. A low-stakes evaluation creates the optimal learning environment for novice evaluators to learn from their successes and mistakes without an agency employee losing a job or a program losing funding.

Hot Tip 2: Develop a written agreement of the project.

Students/new evaluators, with the help of the faculty advisor/senior evaluator, should develop a written agreement that includes the evaluation plan, the responsibilities of all parties, a timeline with soft and hard deadlines, and an agreement on the final deliverables. This puts all parties on the same page before the project begins, and the agreement serves as a reference to keep everyone on task and on time. Additionally, many of the obstacles that might arise can be discussed and planned for before they actually occur, ultimately making for a better learning experience. It is important that the agency provide as much information about the program as possible before the written agreement is developed, to aid the students/new evaluators in planning a low-stakes, meaningful, and feasible evaluation that all parties are invested in and that can be completed in the desired time frame.

Rad Resource: Students/new evaluators should join the GSNE TIG and the GSNE TIG Facebook Community Group. This is a place where one can ask questions and get advice from fellow novice evaluators who are currently in, or have recently been in, similar shoes.

The American Evaluation Association is celebrating GSNE Week with our colleagues in the Graduate Student and New Evaluators AEA Topical Interest Group. The contributions all this week to aea365 come from our GSNE TIG members.


I am Susan Kistler, the American Evaluation Association’s Executive Director and aea365’s regular Saturday contributor.

Guess what just moved into the #1 spot as the most-read aea365 article of all time? The April 6, 2013 post announcing the collaboration with BetterEvaluation to produce a series of publicly available Coffee Break Webinars! We’re well under way with the webinar series, which focuses on the Rainbow Framework for planning, managing, and implementing an evaluation, and I want to share an update:

Hot Tip – Sign Up to Attend the Remaining Offerings Live: There are still six more webinars to go in the series, offered every Tuesday and Thursday through the end of May. If you attend live, you have an opportunity to raise questions at the end and to hear directly from the presenters. Register for the remaining free webinars here – note that you must register for each one separately.

Hot Tip – Subscribe to AEA’s YouTube Channel and Check Out the First Webinar in the Series: The recording of the first webinar in the series is already available online. Go to http://www.youtube.com/user/AmEvalAssn and click the Subscribe button just below the header to receive notices as each subsequent recording is posted.

Hot Tip – We’re having each of the videos in the series professionally transcribed: The transcription is useful in at least four ways: (1) it enables those with hearing limitations to read the video content, (2) it allows viewers to read along in case the video’s audio is not perfect or the viewer is more comfortable with written rather than spoken English, (3) it allows for easily extracting references and written quotes from the video, and (4) it serves as the basis for improved translation into other languages.

Cool Trick – Automated Translation Is Available and Human Translations Are Coming: You can select “Translate Captions” (see the screenshot below) and Google will create a machine translation of the captions and transcript. Working from a professional transcription improves the quality of the machine translation, but the quality is still quite variable. I am excited to announce that we are working with a team of volunteers to translate this video series into multiple languages.

Lessons Learned (Coming Soon): I’ll share a note once the human translations are available. And, in a future post I will also cover (a) how to contract for transcriptions, (b) how to work with colleagues to create video translations (it’s easier than you think), and (c) how to upload/incorporate transcriptions and translations into your videos since I know that increasingly we’re seeing evaluators sharing video on YouTube.

Hot Tip – Make the Most of YouTube:

YouTubeHowTo


I am Susan Kistler, the American Evaluation Association’s Executive Director and aea365’s regular Saturday contributor.

Hot Tip – Collaboration with BetterEvaluation: Today, we’re excited to announce a partnership with BetterEvaluation. BetterEvaluation is an international collaboration to improve evaluation practice and theory by sharing information about options (methods or tools) and approaches.

Rad Resource – BetterEvaluation Rainbow Framework: BetterEvaluation has produced a framework that organizes over 200 evaluation options into 7 clusters of evaluation tasks that can help you to plan and manage an evaluation. On their website at http://betterevaluation.org/ you’ll find the framework itself as well as extensive explanation and examples in support of the framework tasks. This resource brings together contributions from evaluators working on the ground in various contexts around the world.

 

Hot Tip – Series of eight free short-form webinar trainings: BetterEvaluation and the American Evaluation Association are teaming up to bring to you a series of eight Coffee Break Webinars in May. This series is open to the public (please help us to spread the word), registration is free, and the speakers represent deep expertise applicable in both domestic and international contexts. The series of eight webinars walks you through the components of the Rainbow Framework and will include takeaways immediately applicable to your practice:

  1. Overview of Rainbow Framework for Evaluation – Irene Guijt
  2. Define What Is To Be Evaluated – Simon Hearn
  3. Frame the Boundaries of the Evaluation – Patricia Rogers
  4. Describe Activities, Results and Context – Irene Guijt
  5. Understand Causes of Outcomes and Impacts – Jane Davidson
  6. Synthesise Data from One or More Evaluations – Patricia Rogers
  7. Report and Support Use of Findings – Simon Hearn
  8. Manage an Evaluation – Kerry Bruce

Pre-registration is required and you can register for as many as you would like here: http://comm.eval.org/coffee_break_webinars/CoffeeBreak/BetterEvalSeries

Rad Resource: In 2012, AEA co-hosted a coffee break webinar series with Catholic Relief Services, USAID, and the American Red Cross. This collaboration resulted in a series of four public coffee break offerings, and anyone – AEA member or not – may view the recordings from all four online here. Included in this set are:

  • Monitoring and Evaluation Planning for Projects/Programs
  • Evaluation Jitters Part I and Part II
  • Simple Measurement of Indicators and Learning for Evidence-Based Reporting

We hope to work further with this wonderful team in the future, and currently Scott Chaplowe, who helped spearhead this series, is offering an AEA eStudy.

Rad Resource: AEA hosts a weekly Coffee Break Webinar series for members only. AEA members are welcome to attend any of the Thursday afternoon offerings live throughout the year (see list of upcoming offerings here), or to access recordings of over 100 coffee break webinars via the webinars archive (see the public list of what’s in the archive here). If you aren’t currently a member, we encourage you to join!



Hello, we’re Dominica McBride and Leah Neubauer, members of the AEA Public Statement on Cultural Competence Dissemination Working Group. At AEA 2012, we participated in the brownbag session Critically Reflecting and Thinking about Cultural Competence in Your Evaluation Practice. We’d like to share some of the great insights generated from this session.

In this session, we learned about a nursing student who, influenced by Western medicine’s cultural ideologies of time- and task-orientation and protocol, misdiagnosed a patient due to a lack of cultural sensitivity and critical self-awareness. We also learned about a comprehensive model of self-reflection from Hazel Symonette that encompasses the internal, the external/other, space, and time. Below are several lessons learned:

Lessons Learned

Our sense of time can affect how we do our evaluations and the use and impact of our evaluations. In the case study, the nursing student quickly ran through her nursing duties, completing an assessment and diagnosis in under 5 minutes. This rush caused a missed opportunity to gain more insight, and the patient was misdiagnosed. We, as evaluators, can do the same thing. If we operate in a mental frame of “getting stuff done” and “getting it done fast,” we miss opportunities to collect valuable data and gain additional insight. We also miss the opportunity to be more self-reflective. Without this introspection, we may, like the nursing student, “misdiagnose” or misinterpret the situation and the data.

We too often assume that we know. Even in the midst of data collection and trying to be objective, we can assume that we know something or have an accurate interpretation when we actually do not. In Love’s Executioner, Irvin Yalom, a reputable psychiatrist, discusses how communication is filtered through our experiences, biases, feelings, etc. We never fully and accurately know what the other is communicating. However, with self-reflection and dialogue, we can check whether we are interpreting correctly. We create spaces to gain other perspectives, thus enriching our interpretations and leading to accurate and comprehensive work.

We strive for perfection – is perfection possible? In Becoming a Critically Reflective Teacher, Dr. Stephen D. Brookfield discusses a teaching culture that strives for the ‘perfect 10.’ A critically reflective state of practice instead acknowledges a constant state of knowing and learning more.

One lesson harkens back to a lesson from Spiderman: with great power comes great responsibility. As evaluators, our judgments are seen as mattering more than others’. We have a special responsibility to ensure that findings and conclusions are inclusive, public, and shared.

Rad Resources

Hazel Symonette developed a comprehensive model for self-reflection we can easily incorporate into evaluation and daily life.

Stephen D. Brookfield has written extensively on the role of critical self-reflection for adult educators.


·

Jean King and Laura Pejsa, Minnesota Evaluation Studies Institute (MESI), here, with broad smiles on our faces. We are the proud coaches wrapping up this week of posts written by our creative student consultants about ways to evaluate a conference (using exit surveys of Ignite sessions, network visualization, Twitter, and video clips). Progressive educators long ago documented the value of experiential learning – “learning by doing” – and our experiences during this year’s AEA conference again provide support for the idea as a means of teaching evaluation. Thoughts about how to use a conference setting to engage evaluation students follow.

Hot Tips:

  • Create an evaluation team. Our experience at MESI confirms the value of having students collaborate on projects. Not only do they learn how to do evaluation tasks, but they also learn how to collaborate, an important skill set for evaluators, regardless of their eventual practice.
  • Encourage innovation. Our charge was to think broadly about conference evaluation. At our first meeting, students brainstormed many possible ways to collect data at the conference, no holds barred, the more creative, the better.  As we sought to be “cutting edge,” technology played a role in each of the four methods selected.
  • Make assignments and hold people accountable. Social psychology explains the merit of interdependence when working on a task. We divided into four work groups, each of which operated independently, touching base with us as needed. Work groups knew they were responsible for putting their process together and being ready at the conference. As coaches, we did not micromanage.
  • Make the process fun. University of Minnesota students take evaluation seriously, but their conference evaluation work generated a great deal of laughter. In one sense it was high-stakes evaluation work (we knew people would use the results), but without the pressure of a full-scale program evaluation.

Lessons Learned:

  • Students can learn the evaluation process by collecting data at a conference or other event. Unlike programs, short-term events offer an evaluation venue with multiple data-collection opportunities and fewer complexities than a full-scale educational or social program.
  • A week-long conference offers numerous opportunities to engage in creative data collection. It is a comparatively low-stakes operation since most conference organizers opt for the traditional post-conference “happiness” survey, and any data gathered systematically will likely be of value.
  • Innovative data collection can generate conversation at an evaluation conference.  Many people interacted with the students as they collected data. Most were willing to engage in the process.
  • Minnesota evaluation students really are above average. Garrison Keillor made this observation about Minnesota’s children in general, but this work provided additional positive evidence.

We’re learning all this week from the University of Minnesota Innovative Evaluation Team from Evaluation 2012.


Hi, we’re Anne Schwalbe, Keith Miller, and Michelle Gensinger, graduate students in the Evaluation Studies Program at the University of Minnesota.

We chose Twitter as a platform for exploring creative evaluation methods at the AEA Annual Conference. These are the questions we hope to answer with our analysis:

  • How was Twitter used during the conference?
  • Which sessions at the conference generated Tweets?
  • What did Twitter use look like during the conference (descriptive analytics)?

We had a lot to learn about Twitter and conference evaluation. We hope that our experience helps you decide whether Twitter can contribute to your evaluation practice.

Hot Tip – Be Prepared to Promote Twitter Use: Not all AEA attendees use Twitter. We wanted to increase Twitter use for this project.

  • Fun incentives can promote use without making anyone feel pressured.
  • Have instructions about signing up for Twitter readily available.
  • Demonstrate how Twitter can be used. A projector at our information table displayed AEA’s @aeaweb Twitter feed.

Hot Tip – Know Your Hashtags (#): Hashtags are essential to analyzing data. They “tag” all the Tweets related to a topic.

  • Hashtags may already exist for an event, BUT you may need to create and promote them. AEA established #eval12 for the conference. We introduced #evalAHA to track learning moments inspired by the conference.
  • A trusted twitter voice, @aeaweb, spread the word about our hashtag.

Hot Tip – Know the Limitations of Twitter Data

  • The sample of individuals Tweeting was small and self-selected.
  • Tweets are public.
  • The format (140 characters) is restrictive.
  • Hashtags don’t always work perfectly. Tweets labeled #eval2012 or #aea12 didn’t show up in our initial searches.
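One way to soften the stray-hashtag problem is to search on a set of known variants rather than a single tag. Here is a minimal sketch in Python; the variant list and tweet texts are hypothetical examples, not the actual data from this project:

```python
import re

# The official tag plus variants people might type (hypothetical list).
CONFERENCE_TAGS = re.compile(r"#(eval12|eval2012|aea12)\b", re.IGNORECASE)

def mentions_conference(tweet_text):
    """True if the tweet carries any known variant of the conference hashtag."""
    return bool(CONFERENCE_TAGS.search(tweet_text))

print(mentions_conference("Great plenary today! #Eval2012"))  # True
print(mentions_conference("Thinking about #evaluation"))      # False
```

The word boundary (`\b`) keeps the short variants from matching longer tags like #evaluation.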

Hot Tip – Have an Analysis Plan

  • Know your time frame and pull data as soon as possible. It gets tricky to grab Twitter data more than a week old.
  • Manual coding of Tweets is time intensive; we’ve spent over 25 hours coding 1,500 Tweets to explore Twitter use at the conference.
  • Free, quick, and easy online tools provide simple descriptive analytics about a hashtag or Twitter account.
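For simple descriptive analytics, you don’t necessarily need an online tool: once you have pulled the tweet texts, a few lines of Python can tally hashtag frequencies. A sketch with made-up sample tweets (not our actual conference data):

```python
from collections import Counter
import re

def hashtag_counts(tweets):
    """Tally hashtag frequencies (case-insensitive) across a list of tweet texts."""
    counts = Counter()
    for text in tweets:
        counts.update(re.findall(r"#\w+", text.lower()))
    return counts

# Hypothetical sample of conference tweets:
tweets = [
    "Utilization-focused eval session was great #eval12 #evalAHA",
    "Cultural competence brownbag #eval12",
    "Met my TIG co-chairs at the fair #eval12 #evalAHA",
]
print(hashtag_counts(tweets).most_common())
# [('#eval12', 3), ('#evalaha', 2)]
```

Lowercasing before counting merges variants like #evalAHA and #EvalAHA into one tally, which is usually what you want for frequency tables.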

Rad Resources

  • Topsy.com: an online tool that instantly displays a summary of activity, top Tweets, and other fun stuff from any hashtag.
  • NCapture: an NVivo 10 add-on that captures data from social media websites and does some auto-coding.
  • Visual.ly: easy infographics templates for hashtags or twitter users. Here’s one for #evalAHA


Greetings from Jean King and Laura Pejsa from the Minnesota Evaluation Studies Institute (MESI) at the University of Minnesota. This week we will be introducing you to a crop of graduate student evaluators who (we think) made quite a splash at the AEA conference last month. If you attended, you may have seen one or more of them filming video interviews, conducting on-the-spot iPad surveys, tweeting “Aha” moments, or helping participants identify favorite sessions on the giant TIG visualization. If you were not with us at the conference this year, today’s post will give you some background on this project.

It all started with the local arrangements committee for the AEA conference; the committee wanted to add some sparks of evaluation throughout the week and document experiences not captured on the standard after-conference survey. We created a one-credit, special course at the University of Minnesota titled Creative methods for training and event evaluation, and invited students to join us for a grand experiment. The course and the conference activities would be developed based on the interests and ideas of the students in it.

At our first class meeting, we introduced the students to the goals and history of the conference, provided a place (and food) to come together, and gave them the following loose guidelines:

  • to both pilot and model creative ways of documenting conference experiences;
  • to provide some real-time feedback;
  • to make the evaluation process fun/engaging for conference participants;
  • to explore the potential of emerging technologies;
  • to provide meaningful, usable data to AEA;
  • and to make sure they still had time to attend and enjoy the conference themselves.

Hot Tips

  • You don’t have to look much further than your own back yard for meaningful evaluation experiences for students. Instead of simulating or creating projects, check out the events that may already be happening where a little extra evaluation will go a long way.
  • When it comes to creative methods and technology, students can expand our thinking. Give them an opportunity with relatively low stakes, and see the connections they make between the ways they have learned to use things like social media and the evaluation problem.

This week we will be presenting you with more hot tips, cool tricks, rad resources, and lessons learned from this intrepid group of conference evaluators. Days 2-5 of this week will be written by our four student teams: Survey, Video, Network Visualization, and Twitter. We will wrap up the week with a post summarizing what we learned as instructors that may help others in designing meaningful, real-world evaluation experiences for novice evaluators.

