AEA365 | A Tip-a-Day by and for Evaluators

We are Sara Plachta Elliott, executive director of the Youth Development Resource Center in Detroit, and Alicia McCormick, youth development director at Urban Neighborhood Initiatives, a community development organization in Detroit that provides a variety of youth programs. We collaborate to improve youth program quality, making the improvement process more youth-led along the way.

Many youth programs use the Youth Program Quality Assessment (YPQA) or other program quality assessments. YDRC runs the Youth Program Quality Intervention for youth programs in Detroit. We believe the quality improvement process is even more powerful when organizations like UNI engage youth and empower them to lead that process for the programs they are involved in.

This summer, UNI deepened youth engagement in the evaluation process by having youth design their own program quality observation tool to complement the organization’s use of the YPQA. UNI hired a team of three high school-aged youth, guided by a young adult intern, through the city’s summer youth employment program. YDRC partnered with UNI to provide the team a day of training on youth-led evaluation, and the youth team then spent the summer designing their own program observation tool and testing it with UNI’s youth programs. The youth used the results to make recommendations to the organization for future program improvements. The youth are now serving on the board’s Youth Development Committee, further deepening their leadership and influence within the organization.

Hot Tips:

  • Prepare the adults. An organization’s leaders need to create structures that allow for deeper youth participation. Inviting youth to join UNI’s Youth Development Committee was a structural change that allowed for more impactful youth participation.
  • Give youth agency. While the adults asked the youth to focus on program quality for one aspect of the project, the youth team also selected their own research focus: “Why Youth?” They chose to produce a video, which increased their engagement in the evaluation work.
  • Pay youth for their work. In the case of UNI, paying youth through summer youth employment was critical for meaningful work and engagement. Adults get paid for their work on evaluation, and youth should too.

Rad Resources:

  • The Youth Engaged in Leadership and Learning curriculum from the John W. Gardner Center provides a lot of activities and meeting agendas to support youth-led research.
  • The Weikart Center for Youth Program Quality’s Youth Program Quality Assessment is a widely used quality improvement assessment tool. The Center’s aligned Youth Work Methods trainings, such as Youth Voice, help build a foundation of readiness.
  • The Neutral Zone’s Youth Driven Spaces initiative provides a variety of resources for increasing youth-driven work, including a TAC Guidebook, a Youth-Adult Partnership Rubric, and an Agency Self-Readiness and Capacity Assessment.

The American Evaluation Association is celebrating Youth Focused Evaluation (YFE) TIG Week with our colleagues in the YFE Topical Interest Group. All contributions this week to aea365 come from our YFE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org.

Hi, I am Cassandra Jessee, and I serve as director of the USAID-funded YouthPower Learning project, a youth-focused project for creating, curating, and convening knowledge. Through it, I represent Making Cents International and the International Center for Research on Women in Washington, DC.

We defined our core approach, positive youth development (PYD), to ensure relevance for low- and middle-income countries (LMICs): PYD builds and holistically supports the competencies, skills, and abilities of youth so that they are empowered to reach their full potential. PYD differs from many approaches in that it views youth as active partners in development efforts, rather than as problems or obstacles to be overcome.

To support youth-focused evaluation and program design in LMICs, we developed the PYD Measurement Toolkit, which outlines a simple PYD measurement framework with four overlapping domains: assets, agency, contribution, and enabling environment. This framework provides a simple entry point into measurement and aims to make it easier to tell the story of how PYD programs work. We included illustrative PYD measurement indicators and possible tools, organized by these domains, that can be considered and adapted for varying LMIC contexts.
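
For illustration, here is a minimal sketch, in Python, of how a program might organize illustrative indicators by the four domains. The indicator names below are hypothetical placeholders, not the Toolkit’s own list:

    # A minimal sketch of organizing illustrative indicators by the four
    # PYD domains. The indicator names are hypothetical placeholders,
    # not the Toolkit's own list.
    PYD_DOMAINS = {
        "assets": ["interpersonal skills", "vocational skills"],
        "agency": ["self-efficacy", "positive identity"],
        "contribution": ["civic engagement", "youth leadership roles"],
        "enabling environment": ["supportive adult relationships", "safe spaces"],
    }

    def indicators_for(domains):
        """Collect illustrative indicators for the domains a program addresses."""
        return [ind for d in domains for ind in PYD_DOMAINS[d]]

    # Example: a program spanning two of the four domains.
    print(indicators_for(["agency", "enabling environment"]))

Adapting such a mapping to a particular LMIC context then amounts to swapping in locally relevant indicators and tools.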

Hot Tips to advance PYD:

  • Engage youth meaningfully throughout all phases of programming and evaluation. Meaningful youth engagement is a key component of effective PYD programs. Training, supporting, and mentoring youth to participate meaningfully has direct skill-building benefits and helps ensure program effectiveness and evaluation relevance.

Rad Resources: We commissioned a series of videos that provide guidance and best practices on engaging youth. Also check out a recent webinar series on engaging youth, produced in collaboration with the AEA YFE TIG:

  • Youth Voice in Action: Tips, Strategies, and Advice from Youth Evaluators
  • Engaging Hard-to-Reach Youth in Research and Evaluation
  • Engaging Youth in Research

More Hot Tips:

  • Use the term “Positive Youth Development” when evaluating programs that incorporate two or more domains of PYD. In our recently released systematic review of PYD programs in LMICs, only one program explicitly identified itself as practicing PYD. The more programs and evaluations recognize that they are employing PYD principles and use PYD terminology, the more the global community will understand the importance of PYD approaches, as well as their effectiveness.
  • Ensure consistent measurement of PYD outcomes. Many PYD programs measure sector-specific outcomes, such as knowledge of HIV, contraceptive use, job placement rates, or reduction in conflict. Very few assess PYD outcomes, such as self-efficacy, positive identity, or interpersonal skills. The PYD Measurement Toolkit can help integrate PYD principles into your M&E plans.

Get Involved

Join one of our communities of practice to share and learn with others, and check out www.youthpower.org and its repository of more than 2,000 resources and events dedicated to PYD.



My name is Somongkol Teng, Extension Educator for Evaluation at the University of Minnesota Extension Center for Youth Development. Recently, I conducted an evaluation using focus groups with our 4-H Online Adventure Program, a collaborative, project-based learning program for Minnesota 4-Hers, ages 10 to 12. While many evaluators are familiar with focus groups with adults, conducting one with youth requires careful consideration and preparation.

Below are lessons learned and hot tips:

  • Pick the right facilitator. A good facilitator with adults might not be as good with youth. In our case, we had a colleague who had starred in our training videos facilitate the sessions. He was selected because he was not too involved in the program, but was recognizable to the youth.
  • Be attentive to the age range. Keep the age range to no more than two years. Different age groups behave differently and require different strategies.
  • Keep the group small. Unlike focus groups with adults, we found conversation was easier and richer with a smaller group of youth, usually around 5-6.
  • Group youth participants thoughtfully. Find out in advance about the youth’s group dynamics and try to separate close friends. This strategy helped ensure a wider range of comments. (A drafting sketch follows this list.)
  • Start with fun icebreaker activities. Invest 10-15 minutes in fun icebreaker topics about celebrities, video games, etc. to get the conversation started.
  • Ask age-appropriate questions. Remember that youth have fewer life experiences to draw from than adults. When developing questions, keep sentence structures simple, avoid yes/no questions, and be aware of questions that potentially threaten the freedom and independence of young people (e.g., if you are interested in how decisions were made about their 4-H project selection, try not to stress “who” made the decision, since few youth like to admit in front of their peers that their parents decided for them).
  • Use interactive and participatory activities. Including technology or drawing kept the session lively and fun. We embedded live online polling using UMU, a free online platform for engaging learning experiences, into one of our activities.
  • Keep the session short. We found it effective to keep the focus groups to one-hour sessions using a short set of 6 to 8 questions.
  • Provide food. Food is the key to the heart. Find out what the youth like. Do not underestimate the power of food to keep them engaged.
  • Get consent. This is critical! Determine what the appropriate protocol might be for obtaining parental or guardian consent. That said, it is equally important to get the youth’s assent to participate in the focus group. Communicate why their participation matters.
  • Be flexible. Things are bound to not go as planned. Have fun, and go with the flow.
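
Here is a minimal sketch, in Python, of how the age-range, group-size, and friend-separation tips might be combined to draft group assignments for human review. The two-year band and the cap of six per group come from the tips above; the participant data, the banding rule, and the flag-don’t-fix approach are hypothetical assumptions for illustration:

    # A minimal sketch: draft focus groups from (name, age) pairs, keeping
    # each group within a two-year age band and capped at six members.
    # All data and the banding rule are hypothetical illustrations.
    from itertools import groupby

    MAX_GROUP_SIZE = 6

    def draft_groups(participants, friend_pairs=()):
        """participants: list of (name, age); friend_pairs: iterable of frozensets."""
        groups = []
        # Band ages into two-year spans (10-11, 12-13, ...) per the age-range tip.
        banded = sorted(participants, key=lambda p: p[1] // 2)
        for _, band in groupby(banded, key=lambda p: p[1] // 2):
            members = list(band)
            # Chunk each band into groups of at most MAX_GROUP_SIZE; a human
            # should merge or rebalance any tiny remainder group.
            for i in range(0, len(members), MAX_GROUP_SIZE):
                groups.append([name for name, _ in members[i:i + MAX_GROUP_SIZE]])
        # Flag (rather than silently fix) groups that contain close friends,
        # so a facilitator can swap members between same-band groups.
        for g in groups:
            for pair in friend_pairs:
                if pair <= set(g):
                    print(f"Review: close friends {sorted(pair)} share a group.")
        return groups

    print(draft_groups([("Ana", 10), ("Ben", 11), ("Cy", 11), ("Dee", 12),
                        ("Eli", 12), ("Fay", 13)],
                       friend_pairs=[frozenset({"Ana", "Ben"})]))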



Hi, we are Abhijay Kumar and Jordan Scrimger, members of the Metropolitan Youth Policy Fellows (MYPF), a diverse group of youth working together for a better metropolitan Detroit. The MYPF, which includes members from multiple communities across cities, neighborhoods, and suburbs, formed to participate in issues that impact our lives. We think it is important that youth have a voice in policy decisions. Youth know what is going on in their communities and have ideas for solutions.

The first project we undertook was a survey (N=1,191) asking youth to identify issues in their communities. Afterwards, we engaged youth in focus groups (N=53) to hear more about ideas they had for change. We gathered valuable information and created a report, making these findings and recommendations accessible to everyone.

Along our journey, the MYPF members have reflected on our experiences and compiled some tips that are valuable for anyone wanting to support the efforts of youth:

Hot Tips:

  • Reflect on your power and privilege. Acknowledge the resources where you hold privilege, such as networks, funding, and knowledge, and recognize how they can be used to support youth efforts.
  • Provide space, be easy to access, and invite conversation. The belief that adults and youth are equals and that the exchange of ideas is bidirectional creates a more enriching and productive atmosphere for all.
  • Be a “coffee filter”. Rather than change the flow of thoughts, facilitate the refinement process towards a goal.
  • Get youth voice and input. Youth should not be an afterthought. They should be provided a platform to directly influence policy-making.
  • Treat youth as allies, not as a burden. We want a voice in the way our lives are decided. Our viewpoints provide valuable insight into the world that can only be conveyed through collaboration with youth.
  • Don’t tokenize. It is essential to offer youth an authentic way for their ideas to be taken seriously, with genuine intent to solve problems.
  • Ask us! Don’t make assumptions. Decisions must be representative and inclusive of candid youth perspectives. Youth should be actively involved in the decision-making process, as they can help ensure that good intentions reap positive impact.

Group photo of authors: members of our team after presenting at the Fourth International Center for Culturally Responsive Evaluation and Assessment Conference.


I’m Nicole Clark, social worker and owner of Nicole Clark Consulting (http://www.nicoleclarkconsulting.com), where I partner with community-based groups, local/national organizations, schools and more to design, implement, and evaluate programs and services geared toward women and girls of color.

In 2015, I conducted an impact evaluation of a six-week intensive summer leadership program for high school young women of color in the New York City (NYC) area. The program provided social justice classes, workshops, field trips, and leadership seminars with accomplished women of color leaders, and is the flagship program of an NYC-based organization that provides social, political, and economic leadership programming for young women. The organization received funding to implement the program in another borough of NYC, and we sought to determine whether it could be implemented there successfully.

We used a mixed methods approach consisting of classroom observation, focus groups with the participants, in-depth interviews with on-site leadership, parents, and staff, and a post-intervention survey.

Lesson Learned #1: Conduct a community asset map to highlight the linkages, relationships, and resources located in a community. Prior to the start of the program, organizational staff conducted a community asset map to determine what resources, services, community organizations, and members were in the area. We concluded that this program presented a unique opportunity for the organization to bring a social justice leadership curriculum to the community.

Lesson Learned #2: Consider the accessibility of a program’s activities. A key determinant of whether participants applied to the program was the commute time to the classroom site. While the program was open to self-identified young women in high school from each of NYC’s five boroughs, the flagship location is in Manhattan. Offering the program in their home borough made it accessible to participants whose parents and guardians were concerned about them traveling to Manhattan. Participants also shared feeling more connected to the program and to the other participants because they were all from the same borough.

Lesson Learned #3: Staff capacity plays a major role in how a program is implemented. While participants recommended the program be implemented in every borough, program staff identified a lack of capacity to do so. Because the organization is small, several on-site leaders and organization staff frequently traveled between the Manhattan site and the new borough. Staff also felt they had more community relationships in other boroughs than in the one where the program and evaluation took place. This was a challenge for the staff, but also an opportunity to build community partnerships in a new borough.


Happy Holidays, aea365-ers! I’m Sheila B Robinson, Lead Curator and sometimes Saturday contributor, with a wish for a small present from you!

You see, in addition to being your loyal blog curator, editor and writer, I’m also working on AEA’s Potent Presentations Initiative (P2i), a project with the goal of helping evaluators improve their presentation skills. P2i’s impact is growing as we continue to spread the word and add more tools to the site. Presentations are continuing to improve at AEA’s annual conference (the poster session this year was amazing!). And, evaluators are clamoring for even more resources to up their game. Nearly 200 conference attendees showed up to my Evaluation 2017 session on slide design!

Lesson Learned: It’s clear that evaluators are interested in continuing to improve their practice around presentations. I also know that offering the right content for this audience is key to P2i’s success. Given that, will you help me?

Get Involved: I write a monthly AEA Newsletter article for P2i featuring our current P2i tools, blogs, online articles, etc. and I’m working on developing even more tools and guidance documents for the P2i website. But, I need to hear from you! I need to know a little more about the types of presentations you do, the areas of your practice you would like to improve, and what content or resources you would like to have available.

To that end, I’m asking for a little ‘present’ – a few minutes of your time and a bit of your best thinking. I’m sharing a link to a brief survey asking about your presentation work and what might help you improve your practice. To reciprocate, I’ll share results of the survey in this space in early 2018.


Cool Trick: Would you please share this post (or this link: https://goo.gl/forms/8HrIk7HLmzpW3bVy1) on social media, wherever other evaluators “hang out”? Thank you!



We’re Chris Lovato, Professor in the School of Population & Public Health at the University of British Columbia, and Kylie Hutchinson, independent consultant and author of A Short Primer on Innovative Evaluation Reporting.

To date, most evaluation capacity-building has focused on program-level managers and staff, ignoring the key role that senior leaders also play in using evaluation to make better decisions. However, informed decision-making and evidence-informed practice depend on senior decision-makers having a full understanding and appreciation of evaluation. Managers are constantly looking for ways to work smarter and more effectively in their leadership role, but management has never been more complex and challenging. We believe that managers who are savvy users of evaluation are likely to be more effective leaders and decision-makers. But it’s a significant challenge to capture the attention of these extremely busy individuals.

Rad Resource: Evaluation for Leaders is an innovative and interactive mobile learning course that managers can access on their laptop, tablet, or phone, anywhere and anytime, free of charge. The course is designed to quickly increase leaders’ understanding of evaluation through seven stand-alone units, each taking five to ten minutes to complete. Each unit contains practical information and just-in-time tips for how to use evaluation to do things differently. The intent of the course is not to teach managers how to do evaluation, but rather how to better use evaluation in their day-to-day decision-making and across their organization. By the end of the course, managers have a greater appreciation for how they can better support evaluation in their organization so it can better support them.



Greetings! We are Laura Sefton from the University of Massachusetts Medical School’s Center for Health Policy and Research and Linda Cabral from Care Transformation Collaborative-Rhode Island. When choosing qualitative interviews as the data collection method for your evaluation project, developing an appropriate interview guide is key to gathering the information you need. The interviews should collect data that inform your evaluation aims while avoiding superfluous information. From our experience developing interview guides over the last 10 years, we offer the following insights:

Hot Tips:

Wording is key.  Questions should be straightforward and gather insights from your respondents. Your goal should be to develop questions that are non-judgmental and facilitate conversation. Word your questions in ways that elicit more than yes/no responses. Avoid questions that ask “why,” as they may put your respondent on the defensive. Adjust your wording according to the intended respondent; what works for a program CEO may not work for a client of the same program.

Begin with a warm-up and end with closure.  The first question should be one that your respondent can answer easily (e.g., “Tell me about your job responsibilities.”). This initial rapport-building can put you and the respondent at ease with one another and make the rest of the interview flow more smoothly. To provide closure to the interview, we often ask respondents for any final thoughts they want to share with us. This provides them with an opportunity to give us information we may not have asked about but that they felt was important to share.

Probe for more detail.  Probes, or prompts, are handy when you are not getting the information you had hoped for or you want to be sure to get as complete information as possible on certain questions. A list of probes for key questions can help you elicit more detailed and elaborate responses (e.g., “Can you tell me more about that?” “What makes you feel that way?”).

Consider how much time you have.  Once you have your set of key questions, revisit them to see if you can pare them down to fewer questions. We found that we can generally get through approximately ten in-depth questions, plus any necessary probes, in a one-hour interview. Be prepared to ask only your key questions; your actual interview time may be less than planned, or some questions may take longer to get through. (A sketch pulling these tips together follows.)
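
To pull these tips together, here is a minimal sketch, in Python, of an interview guide as a simple data structure with a rough timing check against the one-hour rule of thumb. The warm-up, probe, and closure wording come from the examples above; the two key questions and the six-minutes-per-question budget are hypothetical assumptions:

    # A minimal sketch of an interview guide as a data structure, with a
    # rough timing check against the ~10-questions-per-hour rule of thumb.
    # The two key questions and the 6-minute budget are hypothetical.
    MINUTES_PER_QUESTION = 6  # assumed: one in-depth question plus probes

    guide = {
        "warm_up": "Tell me about your job responsibilities.",
        "key_questions": [
            {"question": "How did you first get involved with the program?",
             "probes": ["Can you tell me more about that?"]},
            {"question": "What has changed for you since joining?",
             "probes": ["What makes you feel that way?"]},
        ],
        "closure": "Any final thoughts you want to share with us?",
    }

    def estimated_minutes(g):
        # Budget the warm-up and closure as one question each.
        return (len(g["key_questions"]) + 2) * MINUTES_PER_QUESTION

    print(f"Estimated length: {estimated_minutes(guide)} minutes")
    if estimated_minutes(guide) > 60:
        print("Likely too long for one hour; pare down the key questions.")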

Lessons Learned:

It’s ok to revise the interview guide after starting data collection.  After completing your first few interviews, you may find that certain questions didn’t give you the information you wanted, were difficult for your respondents to understand or answer, or didn’t flow well. Build in time to debrief with your data collection team (and your client, if appropriate) on your early interviews and make adjustments to the guide as necessary.

Rad Resource: As with many topics related to qualitative research, Michael Quinn Patton’s Qualitative Research & Evaluation Methods serves as a useful resource for developing interview guides.



Hi there! We’re Anne Vo, Ph.D., Director of the Keck Evaluation, Institutional Reporting, and Assessment (KEIRA) Office at the Keck School of Medicine of USC, and Jacob Schreiber, Evaluation Assistant at KEIRA. Today, we offer reflections on what we’ve learned about conducting evaluation within an academic medical center—an environment that offers rich opportunities to observe, conduct, and understand evaluation practice and policy.

Hot Tip #1: Standards Rule Healthcare, Medicine, and Medical Education

Medicine is a highly regulated field. Broad swaths of stakeholders—clinicians, clinical educators, school and hospital administrators—rely on standards to inform decision-making and drive practice. As such, systematic evaluation often manifests as quick-turnaround monitoring of easily quantifiable outcomes (e.g., student academic performance, residency program match rates). Successfully “chasing numbers” enables organizations such as academic medical centers to communicate that standards of care and teaching are being met. Because standards offer a common language that stakeholders can use to think through pressing issues of the day, they also become the go-to frame of reference for decision-makers throughout the organization.

Hot Tip #2: Everything is “Evaluated,” Everyone is an “Evaluator”

Because standards drive practice in medicine, evaluation can become a decentralized activity. Aspects of evaluative practice—from question formulation to data collection, monitoring, analysis, and synthesis—are often divided among various stakeholder groups across an organization. This cascaded evaluation model emphasizes “local expertise” and echoes the “team values” to which healthcare teams aspire. It is reminiscent of the development evaluations that organizations such as UNICEF and the UNDP strongly support. And, in the medical context, it is a model that tends to immerse clinical experts in monitoring processes while largely distancing them from actual evaluation.

Hot Tip #3: Democratic Decision Making is a Core Value

Decisions about how medical education is done are often made through committees and guided by accreditation standards; specifically, LCME Standard 1 on Mission, Planning, Organization, and Integrity and Standard 2 on Leadership and Administration. Academic and administrative committees oversee and monitor the quality of Undergraduate Medical Education (what we know as the first four years of medical school). Many of the same stakeholders serve across committees, as well as on the sub-committees and work groups within each. For evaluation to be meaningful, expect to have many of the same conversations with the same people at different levels. Most importantly, know each committee’s charge, its membership, and members’ roles and stances on the issues up for discussion.

Rad Resource:

Alkin, M. C., & Vo, A. T. (2017). What is the organizational, community, and political context of the program? In Evaluation Essentials: From A to Z (2nd ed., pp. 77-87). New York, NY: Guilford Press.


Hello! We are Carolyn Camman, Christopher Cook, Andrew Leyland, Suzie O’Shea, and Angela Towle of the UBC Learning Exchange, which is a little bit of the University of British Columbia in the heart of Vancouver’s Downtown Eastside (DTES). It’s a bridge for mutual understanding and learning between the University and a neighbourhood rich in community, art, and history, but whose residents face challenges, including homelessness, poverty, gentrification, and stigma. The UBC Learning Exchange acts as a member of the community, giving back to residents through community-based programming alongside experiential learning opportunities for students and support for community-based research.

The Learning Lab supports members of the DTES community in engaging in activities and scaling up their involvement by offering creative and educational activities in a flexible, low-barrier format. In keeping with the arts-based principles of the Learning Lab and the community engagement mission of the Learning Exchange, when it came time to make the results of a recent evaluation accessible to a community audience, the answer was obvious: put on a show!

Voices UP! is a theatrical performance co-written and co-performed with the community members who contributed to the original evaluation. It not only communicated evaluation results, but deepened the evaluation itself. Through writing and performing the play, the cast learned more about evaluation and shared new stories and insights. Over its four-performance run from Spring 2016 to Fall 2017, the show evolved and grew.

Hot Tip: There’s growing interest in using arts-based methods in evaluation. Live theatre is a dynamic and engaging approach that encourages people to connect with findings viscerally and immediately, as part of a dialogue between performers and audience. In a post-performance talk-back, one person said, “It was neat to hear the participants reflecting on what they had just done as well as what it meant to them to be a part of it.” Another commented that “seeing” the impact of the program was more persuasive than reading about it in a report or grant application.

Lessons Learned: A performance doesn’t have to be polished or “professional” to be effective. Community members speaking in their own words is powerful, and there are many creative techniques (like puppets!) that can bring evaluation findings to life. Talking with the cast, and introducing audiences before each performance to the different ways theatre can “look,” helped set appropriate expectations.

Rad Resources: To keep Voices UP! going even after the curtains come down for the last time, the Learning Exchange staff and cast of program patrons came together to tell their story one last time, this time in a comic book. You can download this resource online for free: http://learningexchange.ubc.ca/voicesupcomic. It was created using the same participatory process as the original performance and tells the story of how Voices UP! came to be, with tips and insights for anyone interested in using theatre methods to tell their evaluation story. Look up “reader’s theatre” and “ethnodrama” for more ideas about turning evaluation and research into plays.

The cast and creators of Voices UP! Photo credit: The UBC Learning Exchange


