AEA365 | A Tip-a-Day by and for Evaluators

Hello! I’m Sheila B Robinson, aea365’s Lead Curator and sometimes Saturday contributor. Evaluation is the newer of my careers. I’m an educator first, having taught in K-12 schools and at a university. I’m also a professional development provider, having offered PD courses, workshops, coaching, and mentoring to educators and evaluators for more than 15 years, so I’m no stranger to presentation design.

Lessons Learned: Check out p2i tools before designing any presentation! I’ve learned so much from AEA’s Potent Presentations Initiative (p2i), AEA’s effort to help members improve their presentation skills, particularly around delivering conference presentations. p2i offers specific advice on how to make your presentations more potent by focusing on three things: message, design, and delivery. I’ve incorporated these principles and strategies into my work.

Rad Resource: Coming soon! The new p2i Audience Engagement Workbook. I’m honored to be able to share my experience in designing and facilitating presentations and professional learning opportunities as we add to the family of p2i tools with the Audience Engagement Workbook, featuring the WHY, WHAT and HOW of audience engagement, along with 20 specific strategies any presenter can use with limited investment of time or money.

Each strategy is described and rated on a number of dimensions such as ease of application, materials needed, cost, and the degree of movement for participants. There’s even a special section on engaging audiences in a webinar environment!

Hot Tip: One strategy to try now!

Four Corners: Choose just about any topic or question that has three or four positions or answers (e.g., In your family, are you a first born, an only child, the oldest child, or in the middle? In your evaluation work, do you mainly use qualitative, quantitative, or mixed methods? Do you consider yourself a novice, experienced, or expert evaluator?) and ask participants to walk to the corner of the room assigned to their answer. Once there, give them an opportunity (3-5 minutes) to discuss this commonality, then have them return to their seats. If time permits, call on volunteers to share insights from their brief discussion.

Variation: Ask participants a question that requires them to take sides (usually two sides, but could be three or more). Ask them to walk to the side of the room assigned to that position, and discuss with others who share their views. You can ask them to form two lines facing each other and have a debate with participants from each side presenting support for their position.

Stephanie Evergreen, information designer, dataviz diva, and p2i lead is putting the finishing touches on the layout and design of the workbook and we’ll have it up and ready for you well ahead of Evaluation 2014! In the meantime, look for Stephanie to preview additional strategies in the next AEA Newsletter!

Do you want your audience doing this? (Image credit: zenobia_joy via Flickr)

Or this? (Image credit: Chris Hacking via Flickr)


Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Felicia Sullivan, and I research youth civic engagement at the Center for Information and Research on Civic Learning and Engagement (CIRCLE), a non-partisan research center at Tufts University’s Tisch College of Citizenship & Public Service. Building youth-focused evaluation strategies and working with practitioners are important ways CIRCLE links academic research to what is happening on the ground. Recently, we have been exploring what game analytics and data captured by interactive learning systems can tell us about hard-to-measure civic engagement processes like deliberation, perspective taking, and collaboration.

Lessons Learned

Two recent projects involve games called Civic Seed and Discussion Maker, which we developed in collaboration with the Engagement Game Lab at Emerson College and Filament Games, an interactive learning game studio in Madison, WI.

Measuring concrete knowledge in learning environments is essential, but capturing processes and interactions is also important. Civic literacy is more than knowing about government and history; it is about having the skills to act and behave within a civic culture. For schools and national youth programs, capturing growth and development in civic literacy is hard to do. Increasingly, we have looked to learning games and interactive technologies for insights into these complex developmental processes.

These forays into gaming and technology-enabled learning have us thinking about new approaches to evaluation that are dynamic, formative, and adaptive. We are by no means experts in this arena, but here are some of the things we are currently looking at in game-based evaluation:

Hot Tip: Finishing the Game is the Assessment

If designed well, a game can embed the assessment of an outcome within the game play itself in a “stealthy” way. Achieving game missions or completing tasks can be thought of as “tests” or “benchmarks” in the learning process. Most of the projects we have been involved with focus on learning related to civic literacy, but we believe that other domains that grapple with hard-to-grasp complex systems or dynamics could also benefit from games.

Cool Trick: User Created Content

When game users type in a chat box, share a resource, or select text to support an argument, a content analysis can later provide insights about what users are thinking and experiencing.
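As a hypothetical sketch of that idea (neither game’s actual log format is described here, so the messages and stopword list below are invented), a first pass at content analysis might simply count the terms players type:

```python
from collections import Counter
import re

# Hypothetical chat-box excerpts; real data would come from the game's logs.
messages = [
    "I think we should compromise on the budget",
    "What if we vote on it?",
    "Let's hear the other side first",
    "I agree, a vote seems fair",
]

STOPWORDS = {"i", "we", "the", "a", "on", "it", "if", "what", "should"}

def term_frequencies(texts):
    """Lowercase, tokenize, and count terms, skipping stopwords."""
    counts = Counter()
    for text in texts:
        counts.update(w for w in re.findall(r"[a-z']+", text.lower())
                      if w not in STOPWORDS)
    return counts

freqs = term_frequencies(messages)
print(freqs.most_common(3))  # "vote" appears twice in this toy sample
```

From counts like these, an evaluator could move on to hand-coding themes such as deliberation or perspective taking.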

Cool Trick: User Analytics

How players engage with a game — the choices they make, the path they take or where they get stuck – is a digital “observation” that can be analyzed.
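To make this concrete, here is a minimal, hypothetical sketch of treating play data as observations; the session records, field names, and level numbers below are invented for illustration:

```python
from collections import Counter

# Hypothetical session logs: (player_id, last_level_reached, finished_game).
# In practice these would be parsed from a game's analytics export.
sessions = [
    ("p1", 3, False),
    ("p2", 5, True),
    ("p3", 3, False),
    ("p4", 2, False),
    ("p5", 5, True),
]

# Completion rate, plus a tally of where non-finishers stopped, act as
# digital "observations" of where players struggle.
completion_rate = sum(finished for _, _, finished in sessions) / len(sessions)
stuck_points = Counter(level for _, level, finished in sessions if not finished)

print(f"completion rate: {completion_rate:.0%}")        # 40%
print("most common stopping point:", stuck_points.most_common(1))  # level 3
```

These are crude measures, but they illustrate how game telemetry can stand in for direct observation of the learning process.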

Rad Resource: Games, Learning, and Assessment

This chapter from a much larger edited volume on assessment in game-based learning captures some of the issues related to assessment with some concrete examples.

The American Evaluation Association is celebrating Youth Focused Evaluation (YFE) TIG Week with our colleagues in the YFE AEA Topical Interest Group. The contributions all this week to aea365 come from our YFE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


We are Julie Poncelet, Catherine Borgman-Arboleda, and Jorge Arboleda of Action Evaluation Collaborative, independent consultants who use evaluation to strengthen social change. We want to share our experiences using participatory video (PV) in evaluations with youth.

PV is a dynamic, powerful approach whereby youth use video to capture everything from their stories of change to issues that affect their everyday lives to ideas they have to effect change in their communities. We recently used PV with a group of teens from a community-based NGO in Yucatan, Mexico, where youth produced videos about their dreams and sense of identity. PV is a compelling approach for exploring these themes, which emerged from a Theory of Action process with the NGO; specifically, that process identified the need for youth to analyze their communities critically and find their voices.

PV positions youth as researchers and evaluators of their own communities and supports them to contribute creatively and critically to issues. With PV, youth design, direct, film, and edit videos. They experience empowerment, ownership, and self-esteem rarely garnered from other evaluation approaches. Adults provide technical assistance, build capacity, and facilitate a process for PV to unfold (not to take over the process!). For evaluations, PV creates a space for community members and stakeholders to see the interests and needs of youth in the community and provides a unique platform to reflect collaboratively on meaning and implications.


Lessons Learned: Focus should be on learning the technology and the video storytelling process, as well as providing an appropriate approach for young people to collectively reflect on themselves and their realities. Give youth time to feel comfortable with the equipment and with engaging others in conversation. And remain aware of group dynamics! We often find that boys are more comfortable and will take leadership with technology, so consider breaking groups up by gender.

Consider using a short set of questions that can be asked by youth to stakeholders included in their videos. The insight can help to contextualize the analysis and overall sense-making. The PV process is as much an outcome as the product; engaging in PV is transformative, so don’t worry about getting ‘perfect’ videos.

Hot Tips: Although building a participatory video kit is not cheap, all you really need is a small camera (preferably with projection capabilities) and a good-quality handheld microphone. We have found that when young people hold the microphone, they feel more empowered to speak, which helps them find their voice.

Rad Resources: The PV approach aligns nicely with other qualitative methods (Most Significant Change) and different types of evaluations (Monitoring & Evaluation).



Hello, this is Jessica Jerney from the Extension Center for Youth Development at the University of Minnesota. I recently served as Project Coordinator for the Innovations on Youth Roles in Evaluation and Assessment project. This initiative included a learning cohort, symposia speakers’ series (featuring Kim Sabo Flores and Katie Richards-Schuster), and applied research.

As a result of this project, we gained new insights into the benefits and challenges of using a cohort model for professional development in youth-focused evaluation. Over nine months, 25 youth program practitioners met and engaged in dialogue, activities, and reflection to explore, test, and create new aspirations for engaging youth in evaluation.

Hot Tips for Engaging Adult Practitioners in Evaluation with Youth

Tackling the task of building youth worker capacity in youth-focused evaluation was a bigger challenge than originally imagined.  If you are considering implementing a learning cohort, consider the following Hot Tips for engaging adults in evaluation with youth:

  • Create a safe space for participants to grow and ideas to flourish. I often say that evaluation can be like therapy for participants. Involvement in qualitative evaluations has created new realizations for many, as did membership in the Innovators Learning Cohort. We did not expect that the experience would challenge adult practitioners’ ideas about the roles of young people in evaluation. As Mariah Kornbluh pointed out, we need to be prepared to “address adultism and be an ally.” When engaging a cohort in a potentially controversial issue, allow space for learning, change, and surprises. Be ready to help participants peel back the unconscious ideas our society holds about youth roles in activities like evaluation, and to practice authentic youth-adult partnerships.
  • Establish an on-going group dedicated to learning and growing their practice. We created an application process to identify candidates that had the time, interest, and skills to participate at a high level.
  • Develop skills and share experiences. Adult learning theory tells us that grown-ups want to share their experiences and learn from others. Introduce activities that are challenging and hands-on. We found it was important for participants to be pushed outside their comfort zone in both theoretical and practical ways. In some instances, cohort members tried out new skills and ideas in their programs and shared the results with the group. In others, local youth leaders participated in activities with the cohort and reflected on the experience the next day. These opportunities to develop working theories, test them out, and reflect were critical for growth and change in the settings where cohort members operate.


Hi, we are Anne Gleason and Miranda Yates from the Program Evaluation and Planning Department at Good Shepherd Services in New York City, and we would like to share some tools we put together for a youth research camp. Two summers ago, we partnered with youth in one of our afterschool programs to conduct research on what youth think it means to be successful, a topic the students selected; the project ultimately culminated in a student-produced documentary. Drawing on techniques we learned at the Critical Participatory Action Research (CPAR) Summer Institute offered by CUNY’s Public Science Project, we facilitated a series of research camp days with a group of twenty 10- to 14-year-olds. The days were organized as follows: What Is Research, Survey Design Parts I and II, Data Entry, and Data Analysis. Check out the camp schedule for more details.

The project provided an enriching learning experience for everyone involved. Youth gained unique first-hand experience conducting research by playing a lead role in the design and implementation of the study and in the data analysis. In turn, their insider perspective helped us form more meaningful questions and interpret results. For example, one survey question presented a list of resources and asked respondents to rate their importance for achieving life goals. For the goal of attending college, older students rated having a supportive family and supportive teachers as less important than younger students did. We were initially perplexed as to why older students would place less value on supportive adults. The youth participants posited that older youth may feel more independent and, thus, be more confident in their own ability to achieve success. This insight underscored the benefit of partnering with youth in research.

Lessons Learned:

  • If you plan to conduct a full youth participatory project, allow yourself plenty of time. Ideally, we would have liked a few extra days to delve deeper into research techniques and data analysis.
  • If you’re limited with time or resources, you don’t have to give up the idea of drawing on participatory techniques. We have also found ways to incorporate youth voice into our evaluation activities that are less time intensive, but inspired by a participatory approach. For example, we routinely conduct focus groups throughout our programs to gather feedback on surveys and other evaluation tools and develop action plans.

Rad Resources: Our camp curriculum included activities, role playing and group discussions. Here are two handouts that might be useful to those considering a camp of their own: Survey Development 101 and Survey Administration 101.



Our names are Jessica Manta-Meyer, Jocelyn Atkins and Saili Willis and we are evaluators at Public Profit, an evaluation firm with a special focus on out-of-school time programs for youth.

We usually evaluate networks of after school programs as a whole (some of which serve more than 20,000 youth, where a survey is indeed one of the best approaches). However, we particularly enjoy opportunities to build the capacity of youth programs to solicit feedback through creative ways that align with best youth development practices.

Here are some of the methods that have been most popular with these programs:

Cool Trick – Journals: At the start of a program, provide journals for all youth in the program and ask them to write something related to the program goals. Is one of the program’s goals to develop leadership skills? Staff can ask the youth to respond to this question: “In what ways are you a leader?” Is one of the goals to increase enjoyment of reading? “What do you like about reading?” Then, at the end of the program, youth can read what they wrote the first day and respond to “How would you answer that question differently now?” or some other question that gets them to reflect on how they’ve changed in the program.

Cool Trick – Candy surveys: Ask students to answer survey questions by putting certain colors of candy in a cup, then tally the candy colors to get your responses. Have the youth tally the results themselves. They can even make a bar chart on chart paper by taping the actual candy to the paper. The youth can then eat the candy after they’ve tallied the results.
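For evaluators who later want to total up many cups in a script or spreadsheet, the tallying step is just counting by color. Here is a minimal sketch; the colors and the answer choices they stand for are invented for illustration:

```python
from collections import Counter

# Hypothetical cup contents: each candy color maps to one answer choice
# (say, red = "a lot", yellow = "a little", orange = "not at all").
cup = ["red", "red", "yellow", "orange", "red", "yellow"]

tally = Counter(cup)
for color, count in tally.most_common():
    print(color, count)  # most frequent answer first
```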

Hot Tip – Use wrapped candy! Starburst works well and is what one summer program used.

Cool Trick – 4 Corners Activity: Youth leadership programs do this all the time. They ask youth to “take a stand” next to signs that are marked Strongly Agree, Agree, Disagree or Strongly Disagree in response to a statement like “youth should be able to vote at age 16.” Once the youth stand next to one of the signs, the group can talk out their different perspectives. Programs can also use this to collect both quantitative (how many stand where) and qualitative (what they say about why they are standing where they are) data.

Hot Tip: For more Creative Ways, come to our Skill-Building Workshop Saturday at 8am. Yes, it’s early, but we promise to have you moving, interacting and creating. Plus, there will be candy.



Hello, we are Kim Sabo Flores and David White, and we are honored to serve as co-chairs of the Youth Focused Evaluation Topical Interest Group (YFE TIG). As we prepare for the 2014 conference and our annual business meeting, we have to acknowledge how truly remarkable it is that a loosely knit group of like-minded individuals has grown into a burgeoning group of over 300 members who have begun to define and unify the field and practice of youth focused evaluation within the Association. As a group, we explore evaluations focused on youth and positive youth development in a variety of settings. At our core, however, many of us are interested in the practice and outcomes of youth participation in evaluation. This focus has been a key part of our history as a TIG because we understand youth participation to be a pillar of positive youth development.

Hot Tip: Youth-Adult Partnerships

When youth are equal partners with adults in the evaluation process, they share decision-making power and responsibility equally. What does this look like? Here are a few key considerations:

  • Evaluation questions are jointly developed.
  • Evaluation activities are performed by youth and adults.
  • Data are analyzed by youth and adults.
  • Youth and adults receive significant benefit from involvement and from the evaluation findings.

Rad Resource: Youth-Adult Partnerships in the Evaluation Process

This 2005 chapter from the Innovation Center for Community and Youth Development outlines best practices in youth-adult partnerships in evaluation.

Our TIG is taking the first of several steps necessary to provide a genuine, inclusive, and participatory space for all evaluators, regardless of age. The TIG will host two sessions at AEA this year designed to ignite conversation about how to best include youth in our yearly conference.

See you in Denver!



Hi! I’m Sheila B Robinson, aea365’s Lead Curator and sometimes Saturday contributor. While I cringe at describing myself with the heavily clichéd “lifelong learner,” I’m afraid it’s all too accurate. In fact, when I dream of winning the lottery, I don’t think about whiling away my days on a beach reading cheesy novels, but rather about sitting in a classroom, taking all the courses I missed out on in college. The problem is, my other half Larry has the beach dream, and I’d really miss him!

Lessons Learned:

Try MOOCs – Massive Open Online Courses. MOOCs have been mentioned a few times on aea365 (see here), and many evaluators (including me) have taken a course on data visualization and infographics from the Knight Center for Journalism in the Americas, taught by Alberto Cairo, author of The Functional Art: An Introduction to Information Graphics and Visualization. But did you know there are additional MOOCs that might appeal to evaluators?

Image credit: AJC1 via Flickr

MOOCs offer flexibility. They’re free, they’re distance learning (read: you can take courses from the beach!), and you can do your coursework at any time of day. Most MOOCs have a video lecture component, some exercises or homework to be completed between sessions, and a final project. There are usually discussion boards where students can communicate with each other, pose questions to the professor, give feedback on each other’s work, or just have conversations. Some courses allow you to work collaboratively on the final project.

With a MOOC, there’s no pressure. While I don’t encourage registering for a MOOC with no intention of finishing, I must admit I am one of the over 90% of MOOC starters who have not finished one. I’m not proud of this, but realities of life called for me to drop something each time, and my MOOC always got the axe. The caveat is, however, that I feel as if I learned a great deal from the weeks I did participate in each course!

MOOCs can offer high quality, rigorous coursework. I took the first few weeks of a course on data analysis and statistical inference (offered by Duke University through Coursera), just to brush up on my skills and see if I could pick up anything new, and before I knew it, we were deep into conditional probabilities, Bayesian inference, and using R! Many offer certificates of completion.

Rad Resources: While Coursera is one of the most popular MOOC sites, look at MOOC aggregators like Class Central or Course Talk to search multiple sites. You’ll find courses on statistics, data analysis, R programming, research methods, writing, problem solving, and much, much more.

So if you see me at the beach under an umbrella with a laptop…



Hello aea365ers! I’m Susan Kistler, Executive Director Emeritus of the American Evaluation Association, professional trainer and editor, and all around gregarious gal. Email me at susan@thesmarterone.com if you wish to get in touch.

Rad Resource – Padlet: The last time I wrote about Padlet for aea365, exactly two years ago on September 12, 2012, it was still called Wallwisher. One name change, two years, and a number of upgrades later, this web-based virtual bulletin board application is worth a fresh look.

Padlet is extremely easy to set up – it takes under 10 seconds and can be done with or without an account. However, I highly recommend signing up for a free account so you can manage multiple bulletin boards and manipulate contributions.

Padlet is even easier to use: just click on a bulletin board and add a note. You can add to your own boards, or to other boards for which you have a link. I’ve set up two boards for you to try.

Hot Tip – Brainstorming: Use Padlet to brainstorm ideas and get input from multiple sources, all anonymously. Anonymity is the keyword here – the extreme ease of use (no sign-in!) is balanced by the fact that contributions only carry names if the contributors choose to add them.

Hot Tip – Backchannel: Increasingly, facilitators are leveraging backchannels during courses and workshops as avenues for attendees to discuss and raise questions. Because Padlet is a platform/device-independent application accessed through the browser, and does not require a login to contribute, it can make an excellent backchannel tool.

The uses are almost endless – any time you might try sticky notes, Padlet may be a virtual alternative.


This board illustrates the linen background (there are 15+ backgrounds from which to choose), with contributions added wherever the contributor placed them (the owner may then move them). Just click to give it a try. Please.


This board illustrates the wood background with contributions organized as tiles (a new option).


The board appears small when embedded on aea365; go here to see the same board in full-page view.

Hot Tip – Multimedia: Padlet can accommodate pictures, links, text, files, and video (when hosted elsewhere).

Hot Tip – Export: A major improvement to Padlet’s functionality is the added capacity to export contributions to Excel for analysis, sharing, and more.

Rad Resource – Training: I’ll be offering an estudy online workshop in October on collaborative and participatory instrument development. We’ll leverage Padlet as an avenue for stakeholder input if you’d like to see it in action. Learn more here.


Hi, I’m Samantha Grant, and I work for the University of Minnesota Extension as a Program Evaluator for the Center for Youth Development. As an internal evaluator, I see building evaluation capacity as crucial for my organization (and my mental health!).

Building capacity doesn’t happen overnight, but with a few tactical strategies, you will be on your way.

Hot Tips:

Start where the learner is. Before embarking on capacity building, gain a good understanding of the organization’s and staff’s competency in evaluation. Tailor training to the readiness of the group: some learners may be ready for more advanced training while others are just getting a handle on the basics. Try breaking people into mini-cohorts to customize the learning experience for your audience.

Build Confidence and Affirm Expertise. I work with an incredibly skilled group of youth workers who naturally build evaluation into their practice without even realizing it. We talk about all the ways they already evaluate and reflect in their programs, how they present data to stakeholders, and how they improve their programs with participant feedback. Knowing that they already act like evaluators helps build their confidence in gaining more skills.

Get Creative. Use creative, hands-on strategies to get people engaged in the materials. I’ve found resources from people conducting youth-focused evaluations to be especially hands-on. Materials created for use with youth often work with learners of all ages.

Structure capacity building as an entry to greater growth. As your audience becomes savvier with evaluation concepts, they will naturally make connections about how they could grow in the future. (This is without you having to tell them what’s next!) Capacity building has helped me to build trust and relationships with my colleagues, so we can ask hard questions in our evaluation. People begin to respect your skills and see you as a resource and not a threat.

Good luck with your future capacity building!


