AEA365 | A Tip-a-Day by and for Evaluators

Category: Youth Focused Evaluation

Hello! My name is Nick Petten, owner of Petten Consulting in Toronto, Canada. I am a practicing program evaluator focused on child and youth programs and a children’s rights advocate. My work with children and young people over the last decade and a half has impressed upon me that they are fully capable of expressing and communicating their lived realities and that those realities are a valued focus of study.

Lessons Learned

When researching childhood, a critical methodological concern is the power dynamic between adults and children. Unequal power can compromise your data and severely distort the truth. More importantly, it can cause harm to your participants.

The new sociology of childhood is a scientific discipline that promotes the process of obtaining assent from children to participate in research. So, how do you seek assent from children?

Hot tip #1: Develop an assent and accountability framework that will help you explain the research process and its findings in a child-friendly manner and systematically ‘check in’ with children about their participation in the research.

Hot tip #2: Develop protocols to use when seeking children’s assent that consider as many factors as possible about why they might answer in a particular way.

Hot tip #3: Dedicate time to building relationships and having conversations with gatekeepers, and remember that such conversations need to strike a balance between providing critical information and offering so much detail that it leads to confusion.

Hot tip #4: Engage in explicit, systematic self-reflection throughout the research process, and think about how the insights you gain may influence how you converse with children and their gatekeepers.

Rad resources:

If you are starting to think about how to genuinely involve children in research, be ready to read. By reading some of the great material on research with children, you can begin to understand children’s position in research – even in our adult-centric world.

Here is some material to get you started on your reading adventure:

For more information, please go to my website: www.pettenconsulting.com

The American Evaluation Association is celebrating Youth Focused Evaluation TIG Week with our colleagues in the YFE AEA Topical Interest Group. The contributions all this week to aea365 come from our YFE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


My name is Susan Igras, a Senior Advisor at the Institute for Reproductive Health, Georgetown University; my work focuses mainly on Africa and Asia. One of our projects operates in urban-poor areas of Kinshasa, Democratic Republic of Congo. The project is designed to evaluate scalable interventions that address social and normative factors limiting adolescent and youth choices and sexual/reproductive health outcomes, and we are challenged with fitting youth engagement into the research and evaluation process. Everyone likes the idea, but how can you operationalize it without jeopardizing the externally implemented research component?

Staff from participating organizations sat together to brainstorm a way forward; we decided we could focus youth evaluation on questions relating to improving program design, youth engagement, and implementation.  We ended up creating tables that allowed us to get practical:

Evaluation questions | data sources | youth role in designing data collection tools | youth role in collecting data to answer the questions.

This seemingly-simple planning exercise was critical to move from a nice idea to an actionable evaluation activity.  We are still working on making all steps as youth-led as possible – stay tuned for a blog from one of our youth evaluators!
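
For readers who want to picture what such a planning table might look like, here is a minimal sketch in Python. The rows, question wording, and roles below are hypothetical illustrations of the structure only, not data from our project:

```python
# Hypothetical sketch of a youth-engagement planning matrix: each row links an
# evaluation question to its data sources and to the roles youth could play in
# designing tools and collecting data. Content is illustrative only.
planning_matrix = [
    {
        "evaluation_question": "Do participants find the sessions relevant to their lives?",
        "data_sources": ["exit interviews", "session observation notes"],
        "youth_role_tool_design": "review and simplify the interview questions",
        "youth_role_data_collection": "conduct peer interviews with adult support",
    },
    {
        "evaluation_question": "How well does the program reach out-of-school youth?",
        "data_sources": ["program registers", "community mapping"],
        "youth_role_tool_design": "co-draft the mapping checklist",
        "youth_role_data_collection": "lead neighborhood mapping walks",
    },
]

# Print the matrix in a readable form for a planning meeting.
for row in planning_matrix:
    print(f"- {row['evaluation_question']}")
    print(f"    data sources: {', '.join(row['data_sources'])}")
    print(f"    youth role (tool design): {row['youth_role_tool_design']}")
    print(f"    youth role (data collection): {row['youth_role_data_collection']}")
```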

Rad Resource – for those of us working in French-language contexts:

http://www.troussemj.ca/content/page-de-renvoi – See the chapter on youth-led evaluation in this practical toolkit for engaging youth in mental health issues.


 

We are Aspen, Journey, Sira, Sati, and Alexus – members of the Youth Evaluation Team at the Goodman Community Center (GCC) in Madison, Wisconsin. We are high school youth who work with GCC staff and with graduate students and researchers at the University of Wisconsin-Madison to evaluate different programs in our community centers and schools.

Lessons Learned:

Within our evaluation process, many people ask, “Why should there be a youth evaluation team in our community center?” We have learned that the most important reason to have a youth evaluation team is the opportunity for youth voice.

  • The program gives us a chance to offer our input and ideas on problems within our community centers on the north and east sides of Madison.
  • It also gives us a chance to take responsibility for improving our communities and to practice leadership skills around decision making.
  • Another thing that we like about being on an evaluation team is getting to collaborate with staff to organize surveys and collect data. We also get to work as a team with the staff and be treated like one of them, instead of just students giving feedback.
  • Being on an evaluation team is also a good reference to have for college applications, and it is great for making connections throughout the community that can lead to more jobs and evaluation opportunities.
  • This program gives us, and the staff we work with, a chance to improve our skills in working with others and in communicating with people we may not know in order to make connections for our data collection. All of these activities have given us the opportunity to grow as people and experience what college-level work is like.

Our purpose for having this evaluation team and doing all this work is to better our communities and become role models for younger kids who might want to be a part of an evaluation group to have their voices heard. We also plan on leaving our community center and schools better than they were when we first started.


 


Hello! We’re Elizabeth DiLuzio and Miranda Yates from the Strategy, Evaluation, and Learning Division at Good Shepherd Services in New York City. As an organization dedicated to youth and family development, we strive to develop and offer programs that integrate the values, insights, and ideas of our participants. And, as an organization dedicated to evidence-based practice, we are continually seeking ways to incorporate innovative and effective methodologies into our work.

Photovoice, a research methodology implemented with youth and other frequently marginalized populations, utilizes the power of photography as a catalyst for self-expression. It invites individuals to capture on film information about their lives and perspectives that might otherwise be difficult to express. A Community Portraits grant from the Human Services Council and Measure of America, with funding provided by The Leona M. and Harry B. Helmsley Charitable Trust, recently enabled us to utilize Photovoice as a tool for including youth perspectives in our strategic planning process.   Our project focused on the East New York neighborhood of Brooklyn where Good Shepherd Services seeks to deepen its work.

Interested in implementing this approach?

Hot Tips:

  1. Prepare Your Prompts Carefully. Craft 3-5 simply worded prompts that capture the questions you seek to address. Write prompts as first person statements.  Our prompts were:
  • These are the places where I feel like I belong.
  • This is my community at its best.
  • This is something that I would like to change.
  2. Utilize Your Resources. Ask program staff to assist with recruitment. Solicit feedback on project materials, from the project flyer to the informational packets. We also found success in partnering with a former participant and professional photographer who shared tips with participants and helped host the meetings.
  3. Point-and-Shoot, Disposable, or Cell Phone? There are pros and cons to the type of camera you select. Factors include budget, pixels, product availability, and photograph collection method.
  4. Harvest the Feedback. Design a participatory meeting that offers space and time for participants to reflect on their photos and those of others. Encourage participants to discuss and interpret the photos – identifying trends, themes, and what can be learned.
  5. Share the Results in Multiple Ways. In addition to informing strategic planning, use the results to impact conversations in multiple forums. Ensure participants have copies of all their work to take with them. Display the photos at the program site. Create a photo gallery that is open to the public. Bring photos to spark conversation at a community convening. Write an advocacy report.

Rad Resource:

“I Bloomed Here”, a guide created by the National Indian Child Welfare Association, has more helpful tips and ideas for designing your own Photovoice project.


 


I’m Amy Campbell, an evaluator at Centerstone Research Institute in Nashville, TN. While I work on several evaluation projects, one of the most rewarding is the Tennessee Healthy Transitions Initiative, where I’m able to work closely with youth and young adults (Y/YA) who have or are at risk of developing mental health or co-occurring disorders.

This year, we had an opportunity to conduct a Youth Participatory Action Research (YPAR) project with a Healthy Transitions Young Adult Leadership Council for the purposes of generating information to inform the services offered to Y/YA in Chattanooga, TN. We were able to engage Y/YA, train them in research methods, and collaboratively develop and implement a research project. Members of our YPAR team presented at Evaluation 2016 and shared how their findings will impact the design of the Tennessee Healthy Transitions Initiative.

Moving forward, this YPAR team will be meeting with Healthy Transitions leaders at the local and state levels to share their findings and collaboratively develop program solutions based on the data. Their findings will also influence the Leadership Council’s actions in the future; they are discussing social media campaigns focusing on issues they identified and other data-informed projects.

Lessons Learned:

  • Stakeholder buy-in is crucial. If you think you might not have it, start with discussions aimed at building it.
  • Y/YA need adequate support and training to be able to effectively engage. Share the expectations you have for your research team early (e.g., commitment, deliverables, etc.), and provide a solid foundation of the basic tenets of research in your first trainings. Use every interaction with your team as an opportunity to teach them about good research design and processes.

Hot Tips:

  • Free food is one of the best recruitment and retention tools you have at your disposal. We fed our team at every meeting, and we offered food for our data collection “event.”
  • Utilize social media and technology! I am located about 150 miles away from our research team, so we had to find creative ways to stay in contact. We used Facebook Messenger to stay connected between meetings, Evernote to keep track of meeting notes and action items, and Google Hangouts to meet remotely when we couldn’t meet in person.

Rad Resources:

The University of California, Berkeley’s YPAR Hub is a great resource for training and preparation exercises for your YPAR teams.



Greetings from the Get Outdoors Leadville Youth Research Team! We are writing from 10,200’ in Leadville, Colorado.  We live in the heart of the Rocky Mountains, but many of our friends and classmates do not venture out into nature.

This year we completed a project to learn “How do we connect youth in our community to nature?” Our research team used interactive and visual methods to answer our research question:

  • Story maps to map and interview residents in different neighborhoods
  • Site visits to area programs using an evaluation rubric
  • A mural, “Window to the Outdoors,” to learn why connecting to nature matters
  • Interviews with leaders, parents, students, and residents

Lessons Learned:

Connecting youth to the outdoors is important because being in nature can help us feel less stressed, more inspired and healthier.  Lots of people like to be outdoors, but they don’t always know where to go or feel safe.

We learned that being bilingual was really important. By talking to a lot of different people, we could help our community with some big ideas:

  • Non-metal playgrounds in all mobile home parks for winter use
  • Paid internships for young adults to make career exploration possible
  • Environmental and outdoor programming built into school programming to reach all youth
  • Better coordination of programs so older and younger siblings can participate
  • A hub facility that meets needs of all ages in all seasons

Get Involved

After four months of research, we joined leadership teams and worked alongside adults. Our research really helped because we had data to support our ideas. This was important when we presented to our county commissioners. We prepared ahead of time for meetings and then could share powerful ideas.

We learned that our youth leadership really mattered because:

  • People see it differently when a young person is willing to change their community. They know that it must be really important to them if they are willing to use their free time to do extra work.
  • Sometimes adults forget that they were once young too.
  • Youth are the ones with passion and know what other youth want.
  • We needed our youth research project to find those ideas and adults to help make it happen.

Rad Resources:

We used some really great resources to help with our research and youth-adult partnership:

The Youth-Led Evaluation Toolkit by Kim Sabo Flores gave us some great ideas, especially the ultimate chocolate chip cookie activity.

Participatory Visual and Digital Methods (Left Coast Press, 2013) by Aline Gubrium and Krista Harper gave us the idea of story mapping.

Colorado 9-25 helped us work with adults using their youth-adult partnership resources at co9to25.org. Their mission is to ensure that all Colorado youth are safe, healthy, connected, contributing and educated.



Hi! We are Krista Collins, Director of Strategy & Innovation at Boys & Girls Clubs of America (BGCA), and Mike Armstrong, Vice President of Club Operations and Evaluation at Boys & Girls Clubs of Metro Atlanta (BGCMA). Together we seek to understand how our professional development courses and youth programs work in tandem to support the 58,000 staff members, in local communities and across the nation, who create opportunities for approximately 4 million youth each year to achieve great futures. Our priority focus areas are Academic Success, Good Character and Citizenship, and Healthy Lifestyles.

Since 2011, BGCA has conducted the annual National Youth Outcomes Initiative (NYOI) to measure how effectively the Club experience is being implemented and its impact on our members. Built on research-informed indicators of youth achievement that align with our priority outcomes, and benchmarked against other leading national youth surveys, NYOI data is used to drive continuous quality improvement efforts and communicate our impact to key stakeholders across the youth development field.

Rad Resource: Looking for comparison data to understand the impact of youth development programs? Download our 2015 National Outcomes Report: Measuring the Impact of Boys & Girls Clubs. A few highlights from our report:

  • 74% of members aged 12-17 who attend the Club regularly say they earn mostly A’s and B’s, compared to 67% of youth nationally.
  • By 12th Grade, Club members’ rate of monthly volunteering is more than double that of the national average for same-grade peers.
  • Teens who stay connected to the Club as they get older seem better able to resist high-risk behaviors than teens nationally at the same ages.

Hot Tip: Sharing Club-level results and training on data utilization promotes survey participation.

In four years the number of NYOI participants has grown from 2,800 Club members to 165,000 – that is an increase of almost 6000%! Much of this growth can be attributed to BGCA’s efforts to demonstrate the value of data to local Clubs. BGCA prepares reports for each participating Club organization, and provides local trainings and consultations to ensure that the results are interpreted correctly and used to drive improvement.
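
For readers checking the math, a quick back-of-the-envelope calculation using the numbers above confirms the claim:

$$\frac{165{,}000 - 2{,}800}{2{,}800} \times 100\% \approx 5{,}793\%$$

which rounds to roughly a 5,800% increase – close to the "almost 6000%" cited.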

Hot Tip: Data utilization requires learning that is strategic and intentional.

To fully realize the value that formal measurement and evaluation bring, local clubs have employed continuous quality improvement systems that integrate knowledge generation and decision-making at all levels of their organization. Decision-making that affects everything from resource allocation at the corporate level to programmatic foci and staff assignments at the club-site level occurs only if a formal, iterative process of reflection and dialogue is practiced.

We’re looking forward to October and the Evaluation 2016 annual conference all this week with our colleagues in the Local Arrangements Working Group (LAWG). Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.


We are Valerie Hutcherson and Rebekah Hudgins, Research and Evaluation Consultants with the Georgia Family Connection Partnership (GaFCP) (gafcp.org). Started with 15 communities in 1991, Family Connection is the only statewide network of its kind in the nation with collaboratives in all 159 counties dedicated to the health and well-being of families and communities. Through local collaboratives, partners are brought together to identify critical issues facing the community and to develop and implement strategies to improve outcomes for children and families. The GaFCP strongly believes that collaboration and collective effort yield collective impact. Evaluation has always been a significant part of Family Connection, though capacity within each local collaborative greatly differs.

In 2013, GaFCP invited six counties to participate in a cohort focused on early childhood health and education (EC-HEED) using the Developmental Evaluation (DE) framework developed by Michael Quinn Patton (Patton, 2011, Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use). Each county was identified by GaFCP based on need and interest in developing an EC-HEED strategy and had the autonomy to identify collaborative partners, programs, and activities to create a strategy tailored to meet the needs and resources of the county.

As evaluators, we recognized the collaborative and their strategy formation as existing in a complex system with multiple partners and no single model to follow. The DE approach was the best fit for capturing data on the complexity of the collaborative process in developing and implementing their strategies. DE allows for and encourages innovation, which is a cornerstone of the Family Connection Collaborative model. Further, this cohort work gave us, as evaluation consultants, the unique opportunity to implement an evaluation system that recognized that understanding this complexity and innovation was as important as collecting child and family outcome data. With DE, the evaluator’s primary functions are to elucidate the innovation and adaptation processes, track their implications and results, and facilitate ongoing, real-time, data-based decision-making. Using this approach, we were able to engage in and document the decision-making process, the complexity of the relationships among partners, and how those interactions impact the work.

Lessons Learned: Just a few of the lessons we’ve learned are:

  1. Participants using a DE approach may not recognize real-time feedback and evaluation support as “evaluation”. Efforts must be taken throughout the project to clarify the role of evaluation as an integral part of the work.
  2. Successful DE evaluation in a collaborative setting requires attention to the needs of individual partners and organizations.
  3. The DE evaluator is part anthropologist and thus must be comfortable in the emic-etic (insider-outsider) role, acting as a member of the team as well as someone involved in elucidating the practice and work of the team.


I am Holly Kipp, a Researcher at The Oregon Community Foundation (OCF). Today’s post shares some of what we’re learning through our efforts to measure social-emotional learning (SEL) in youth in the context of our K-12 Student Success Initiative.

The Initiative, funded in partnership with The Ford Family Foundation, aims to help close the achievement gap among students in Oregon by supporting expansion and improvement of out-of-school time programs for middle school students.

Through our evaluation of the Initiative, we are collecting information about program design and improvement, students and their participation, and student and parent perspectives. One of our key data sources is a survey of students about their social-emotional learning (SEL).

Rad Resources: There are a number of places where you can learn more about SEL and its measurement. Some key resources include:

  • The Collaborative for Academic Social and Emotional Learning, or CASEL
  • The University of Chicago Consortium on School Research, in particular their Students & Learning page

In selecting a survey tool, we wanted to ensure the information collected would be useful both for our evaluation and for our grantees. By engaging grantee staff in the tool selection process, we gave them a direct stake in the process and, we hoped, buy-in to using the tool we chose – not only for our evaluation efforts but for their ongoing program improvement processes.

Hot Tip: Engage grantee staff directly in vetting and adapting a tool.

We first mined grantee logic models for their outcomes of interest, reviewed survey tools already in use by grantees, and talked with grantees about what they wanted and needed to learn. We then talked with grantees about the frameworks and tools we were exploring in order to get their feedback.

We ultimately selected and adapted The Youth Skills and Beliefs Survey developed by the Youth Development Executives of King County (YDEKC) with support from American Institutes for Research.

Rad Resource: YDEKC has made available lots of information about their survey, the constructs it measures, and how they developed the tool.

Rad Resource: There are several other well-established tools worth exploring, such as the DESSA (or DESSA-mini) and DAP and related surveys, especially if cost is not a critical factor.

Hot Tip: Student surveys aren’t the only way to measure SEL! Consider more qualitative and participatory approaches to understanding student social-emotional learning.

Student surveys are only one approach to measuring SEL. We are also working with our grantees to engage students in photo voice projects that explore concepts of identity and belonging – elements that are more challenging to measure well with a survey.

Rad Resource: AEA’s Youth Focused TIG is a great resource for youth focused and participatory methods.

The American Evaluation Association is celebrating Oregon Community Foundation (OCF) week. The contributions all this week to aea365 come from OCF team members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Hello again! My name is Jan Noga, and I’m here with more tips and lessons learned to help you in evaluating programs serving young children. Today’s post will focus on implementing and facilitating evaluative activities.

Hot Tip: Using pictures, colored discs, shapes, stickers, or other concrete ways to express a response can be a very useful way to work around variations in expressive language. If you use stickers, keep a close watch on them and set some rules for their use (“Stickers only go on the paper”).

Lesson Learned: I have found that smiley/frowny/neutral faces are less reliable than using an indicator that is not emotionally loaded. The context a child brings to the classroom matters – some children may associate frowns with bad circumstances at home (such as anger, dysfunction, or possibly abuse). As a result, they may shy away from frownies or refuse to use them.

Lesson Learned: We learned something very interesting when we were using primary color dots. One of the field data collectors decided to go out and pick up some neon colored dots instead. It totally threw the kids (and data) off. Some complained that the dots were too bright. Others loved the brightness, but spent more of their time playing with the dots than responding to questions. We had to completely redo all the classrooms (using a different data collector), but it provided insight into the power of color – as well as the importance of piloting everything and training and monitoring your team.

Cool Trick: Use a simple grid with big, bold capital letters or numbers in each box. Give children stickers or dots to respond to questions such as, “I like to come to school.” Children respond by putting a sticker in the appropriate box, with different colored stickers representing the answer choices. Of course, you may need to prepare students, and you always have to keep an eye out for patterning (a child who uses the same color no matter what). In general, setting the survey up as play helps with engagement and authenticity.
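
If you capture these grid responses electronically for analysis, a short script can flag possible patterning before you dig into the data. This is only a minimal sketch under assumed (hypothetical) answer codes and record names, not part of the original protocol:

```python
# Minimal sketch: flag records where a child used the identical answer choice
# for every item on the sticker grid ("patterning"), so they can be reviewed.
# Child IDs and color codes below are hypothetical.
from collections import Counter

responses = {
    "child_01": ["green", "green", "green", "green"],  # same color every time
    "child_02": ["green", "red", "yellow", "green"],
    "child_03": ["red", "red", "yellow", "green"],
}

def is_patterned(answers, min_items=3):
    """Return True if one answer choice was used for every item (given enough items)."""
    if len(answers) < min_items:
        return False  # too few items to judge patterning
    most_common_count = Counter(answers).most_common(1)[0][1]
    return most_common_count == len(answers)

for child, answers in responses.items():
    if is_patterned(answers):
        print(f"{child}: possible patterning - review this record before analysis")
```

Flagged records are not necessarily invalid, but they are worth a second look before analysis.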

[Images: Noga PK-4 survey grid and PK-4 demographics card]

Cool Trick: Make it a game. I print two-sided cards cut from card stock. One side has the answer grid with block capitals in each box. The other side is used to collect demographics. I use the sample shown here to collect each child’s gender, birthday, and initials (the “title” of the book in the dog’s mouth). Children cover the appropriate item with a sticker, and I help as needed. At the end, younger children get a coloring page that has the puppy on it to keep and color in later; children in grades 1-4 get a small toy.

[Image: Noga PK-4 coloring page]

Please feel free to contact me if you’d like more information on survey items, scripts, and the protocol.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
