AEA365 | A Tip-a-Day by and for Evaluators


I’m Sondra Stegenga, an occupational therapist, home visitor, educational administrator, and Ph.D. student at the University of Oregon.  Evidence has shown that meaningful family involvement is key to long-term outcomes for children. In early intervention and early childhood (EC) systems we are charged with basing services, supports, and goals on family needs and priorities. Given the varied learning needs and contextual and cultural values of families, and the lack of research on involving families in data practices, this process may be unintentionally overlooked or underutilized. In a recent study, Brawley and Stormont found that although 82% of EC teachers identified sharing data with families as important, only 42% reported regularly doing so. Data collection in EC programs can become a rote task, completed without much meaning or family involvement. Failing to include families in data processes not only violates foundational tenets of early intervention and early childhood but more importantly deprives families of valuable learning and reflection, greater involvement in their child’s plan, and improved chances of successful outcomes.

Lessons Learned:

  • In 20+ years of working with children and families, I have learned the impact of involving families in data practices. This aligns with what researchers and evaluators have noted: involving families in data processes leads to increased communication and better outcomes.

Hot Tips:

  • To engage parents in data practices we must first engage families in the whole educational process. Consider cultural, contextual, and family needs. Engagement may look different to each family, but should be conveyed through mission, goals, and formal practices explicitly outlining the importance of and practices supporting family involvement. Gathering input through a variety of methods (via smartphone, in person, and at times convenient for the family) is imperative to meaningful family engagement.
  • Involve families from the beginning as “partners” in data collection, reflection, and use. This will demystify the process and support full, meaningful family engagement. Explain reasoning for data, timelines, and gathering data. Take time to understand parents’ prior experience, fears, and questions related to data. Ask parents what is meaningful to them and discuss how they would like to measure their child’s progress.
  • Use various modes of data presentation. Graphs and visualizations are shown to be powerful communicators of data. In addition, telling the story of the data and linking it to the family’s needs, priorities, and context is key to understanding.

Rad Resources:

The American Evaluation Association is hosting the Disabilities and Underrepresented Populations TIG (DUP) Week. The contributions all week are focused on engaging DUP in your evaluation efforts. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· ·

Greetings, I am Brian Molina, a graduate student in Western Michigan University’s Industrial/Organizational Behavior Management doctoral program. I have conducted single subject research and implemented performance improvement projects across many different settings and organizations.

Lessons Learned:

  • Single subject research can be used to evaluate program effectiveness across large groups. The belief that single subject research can only be used with one person at a time is a common (yet understandable) misconception. The number of people in the analysis depends upon the frame of reference we are interested in evaluating. For example, we may choose to evaluate the performance of a single individual, a single team, a single department, a single organization, and so on.
  • The methodology is friendly toward organizations with limited resources or experience in evaluation. Evaluation can be intimidating! For organizations that are new to the process, interpreting and understanding large group statistical analyses may be difficult. Single subject research typically results in data that show behavior change over time, which can easily be interpreted by researchers and clients alike. Easier data collection and analyses make it more likely that organizations will begin and continue evaluation of their activities.
  • Single subject designs allow for maximum flexibility in implementing program changes. Conducting research is rarely an orderly process that goes precisely according to plan. Single subject methodology accommodates this unpredictability well. Changes in behavior are rapidly observable during the course of program implementation, not simply at the conclusion of sometimes-lengthy data collection. This allows leaders to make on-the-go changes to the intervention that best serve the client, without contaminating the results of an experimental evaluation.
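The phase comparison behind these points can be sketched in a few lines. This is a minimal illustration of a simple AB (baseline/intervention) design; the numbers and the `phase_mean` helper are made up for the example, not drawn from any actual study.

```python
# Minimal sketch of a single-subject AB design, with hypothetical data.
# Phase A = repeated baseline observations of one "subject" (a person,
# a team, or a whole organization); Phase B = observations of the same
# measure after the program change is introduced.
baseline = [4, 5, 3, 4, 5]        # e.g., weekly count of a target behavior
intervention = [7, 8, 9, 8, 10]   # same measure after the intervention

def phase_mean(observations):
    """Average level of the behavior within one phase."""
    return sum(observations) / len(observations)

# A simple level comparison: did the behavior change between phases?
level_change = phase_mean(intervention) - phase_mean(baseline)
print(f"Baseline mean: {phase_mean(baseline):.1f}")          # 4.2
print(f"Intervention mean: {phase_mean(intervention):.1f}")  # 8.4
print(f"Change in level: {level_change:.1f}")
```

Because each observation is inspected over time rather than pooled into one final statistic, a change in level or trend becomes visible as it happens, which is what makes the on-the-go program adjustments described above possible.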

Rad Resources:

Single Subjects example


·

Hello! My name is Nick Petten, owner of Petten Consulting in Toronto, Canada. I am a practicing program evaluator focused on child and youth programs and a children’s rights advocate. My work with children and young people over the last decade and a half has impressed upon me that they are fully capable of expressing and communicating their lived realities and that those realities are a valued focus of study.

Lessons Learned

When researching childhood, a critical methodological concern is the power dynamic between adults and children. Unequal power can compromise your data and severely distort the truth. But more importantly, it can cause harm to your participants.

The new sociology of childhood is a scientific discipline that promotes the process of obtaining assent from children to participate in research. So, how do you seek assent with children?

Hot tip #1: Develop an assent and accountability framework that will help you explain the process of research and its findings in a child-friendly manner and systematically ‘check in’ with children about their participation in the research.

Hot tip #2: Develop protocols to use when seeking children’s assent that consider as many factors as possible about why they would answer in a particular way.

Hot tip #3: Dedicate time to building relationships and having conversations with the gatekeepers, and remember that such conversations need to strike a balance between providing critical information and overwhelming them with detail, which can lead to confusion.

Hot tip #4: Engage in internal reflection through an explicit and systematic process throughout the research process and think about how the insights gained may influence how you converse with children and their gatekeepers.

Rad resources:

If you are starting to think about how to genuinely involve children in research, you had better be ready to read. By reading some of the great material on research with children, you can begin to understand children’s position in research – even in our adult-centric world.

Here is some material to get you started on your reading adventure:

For more information, please go to my website: www.pettenconsulting.com

The American Evaluation Association is celebrating Youth Focused Evaluation TIG Week with our colleagues in the YFE AEA Topical Interest Group. The contributions all this week to aea365 come from our YFE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

·

My name is Susan Igras, a Senior Advisor at the Institute for Reproductive Health, Georgetown University, whose work is mainly focused in Africa and Asia.  One of our projects operates in urban-poor areas of Kinshasa, Democratic Republic of Congo.  The project is designed to evaluate scalable interventions that address social and normative factors limiting adolescent and youth choices and sexual/reproductive health outcomes, and we are challenged with fitting youth engagement into the research and evaluation process.  Everyone likes the idea, but how can you operationalize it without jeopardizing the externally-implemented research component?

Staff from participating organizations sat together to brainstorm a way forward; we decided we could focus youth evaluation on questions relating to improving program design, youth engagement, and implementation.  We ended up creating tables that allowed us to get practical:

Evaluation questions | Data sources | Youth role in designing data collection tools | Youth role in collecting data to answer the questions

This seemingly-simple planning exercise was critical to move from a nice idea to an actionable evaluation activity.  We are still working on making all steps as youth-led as possible – stay tuned for a blog from one of our youth evaluators!

Rad Resource – for those of us working in French-language contexts:

http://www.troussemj.ca/content/page-de-renvoi – See the chapter on youth-led evaluation in this practical toolkit for engaging youth in mental health issues.


 

We are Aspen, Journey, Sira, Sati, and Alexus, members of the Youth Evaluation Team at the Goodman Community Center (GCC) in Madison, Wisconsin. We are high school youth who work with GCC staff, graduate students, and researchers at the University of Wisconsin-Madison to evaluate different programs in our community centers and schools.

Lessons Learned:

Within our evaluation process, many people ask, “Why should there be a youth evaluation team in our community center?” We have learned that the most important reason an evaluation team is beneficial is the opportunity for youth voice.

  • The program gives us a chance to give our input and ideas on problems within our community centers on the north and east sides of Madison.
  • It also gives us a chance to take responsibility for the improvement in the communities, and practice leadership skills around decision making.
  • Another thing that we like about being on an evaluation team is getting to collaborate with staff to organize surveys and collect data. We also get to work as a team with the staff and get to be treated like one of them, instead of just a student giving feedback.
  • When it comes to the evaluation teams and the process, they’re a good reference to have for college applications and it is great for finding different connections throughout the community for more jobs and evaluation opportunities.
  • This program gives us, and the staff that we work with, a chance to work on improving our skills of working with others, and communicating with people we may not know to get connections for our data collecting. All of these activities have given us the opportunity to grow as people and experience what college-level work is like.

Our purpose for having this evaluation team and doing all this work is to better our communities and become role models for younger kids who might want to be a part of an evaluation group to have their voices heard. We also plan on leaving our community center and schools better than they were when we first started.


 

·

Hello! We’re Elizabeth DiLuzio and Miranda Yates from the Strategy, Evaluation, and Learning Division at Good Shepherd Services in New York City. As an organization dedicated to youth and family development, we strive to develop and offer programs that integrate the values, insights, and ideas of our participants. And, as an organization dedicated to evidence-based practice, we are continually seeking ways to incorporate innovative and effective methodologies into our work.

Photovoice, a research methodology implemented with youth and other frequently marginalized populations, utilizes the power of photography as a catalyst for self-expression. It invites individuals to capture on film information about their lives and perspectives that might otherwise be difficult to express. A Community Portraits grant from the Human Services Council and Measure of America, with funding provided by The Leona M. and Harry B. Helmsley Charitable Trust, recently enabled us to utilize Photovoice as a tool for including youth perspectives in our strategic planning process.   Our project focused on the East New York neighborhood of Brooklyn where Good Shepherd Services seeks to deepen its work.

Interested in implementing this approach?

Hot Tips:

  1. Prepare Your Prompts Carefully. Craft 3-5 simply worded prompts that capture the questions you seek to address. Write prompts as first-person statements. Our prompts were:
  • These are the places where I feel like I belong.
  • This is my community at its best.
  • This is something that I would like to change.
  2. Utilize Your Resources. Ask program staff to assist with recruitment. Solicit feedback on project materials, from the project flyer to the informational packets. We also found success in partnering with a former participant and professional photographer who shared tips with participants and helped host the meetings.
  3. Point-and-Shoot, Disposable, or Cell Phone? There are pros and cons to the type of camera you select. Factors include budget, pixels, product availability, and photograph collection method.
  4. Harvest the Feedback. Design a participatory meeting that offers space and time for participants to reflect on their photos and those of others. Encourage participants to discuss and interpret the photos, identifying trends, themes, and what can be learned.
  5. Share the Results in Multiple Ways. In addition to informing strategic planning, use the results to impact conversations in multiple forums. Ensure participants have copies of all their work to take with them. Display the photos at the program site. Create a photo gallery that is open to the public. Bring photos to spark conversation at a community convening. Write an advocacy report.

Rad Resource:

“I Bloomed Here”, a guide created by the National Indian Child Welfare Association, has more helpful tips and ideas for designing your own Photovoice project.


 

·

I’m Amy Campbell, an evaluator at Centerstone Research Institute in Nashville, TN. While I work on several evaluation projects, one of the most rewarding is the Tennessee Healthy Transitions Initiative, where I’m able to work closely with youth and young adults (Y/YA) who have or are at risk of developing mental health or co-occurring disorders.

This year, we had an opportunity to conduct a Youth Participatory Action Research (YPAR) project with a Healthy Transitions Young Adult Leadership Council for the purposes of generating information to inform the services offered to Y/YA in Chattanooga, TN. We were able to engage Y/YA, train them in research methods, and collaboratively develop and implement a research project. Members of our YPAR team presented at Evaluation 2016 and shared how their findings will impact the design of the Tennessee Healthy Transitions Initiative.

Moving forward, this YPAR team will be meeting with Healthy Transitions leaders at the local and state levels to share their findings and collaboratively develop program solutions based on the data. Their findings will also influence the Leadership Council’s actions in the future; they are discussing social media campaigns focusing on issues they identified and other data-informed projects.

Lessons Learned:

  • Stakeholder buy-in is crucial. If you think you might not have this buy-in, you should start by having discussions that try to address this.
  • Y/YA need adequate support and training to be able to effectively engage. Share the expectations you have for your research team early (e.g., commitment, deliverables, etc.), and provide a solid foundation of the basic tenets of research in your first trainings. Use every interaction with your team as an opportunity to teach them about good research design and processes.

Hot Tips:

  • Free food is one of the best recruitment and retention tools you have at your disposal. We fed our team at every meeting, and we offered food for our data collection “event.”
  • Utilize social media and technology! I am located about 150 miles away from our research team, so we had to find creative ways to stay in contact. We used Facebook Messenger to stay connected between meetings, Evernote to keep track of meeting notes and action items, and Google Hangouts to meet remotely when we couldn’t meet in person.

Rad Resources:

The University of California, Berkeley’s YPAR Hub is a great resource for training and preparation exercises for your YPAR teams.


·

Greetings from the Get Outdoors Leadville Youth Research Team! We are writing from 10,200’ in Leadville, Colorado.  We live in the heart of the Rocky Mountains, but many of our friends and classmates do not venture out into nature.

This year we completed a project to learn “How do we connect youth in our community to nature?” Our research team used interactive and visual methods to answer our research question:

  • Story maps to map and interview residents in different neighborhoods
  • Site visits to area programs using an evaluation rubric
  • A mural, “Window to the Outdoors,” to learn why connecting to nature matters
  • Interviews with leaders, parents, students and residents.

 Lessons Learned:

Connecting youth to the outdoors is important because being in nature can help us feel less stressed, more inspired and healthier.  Lots of people like to be outdoors, but they don’t always know where to go or feel safe.

We learned that being bilingual was really important. By talking to a lot of different people, we could help our community with some big ideas:

  • Non-metal playgrounds in all mobile home parks for winter use
  • Paid internships for young adults to make career exploration possible
  • Environmental and outdoor programming built into school programming to reach all youth
  • Better coordination of programs so older and younger siblings can participate
  • A hub facility that meets needs of all ages in all seasons

Get Involved

After four months of research, we joined leadership teams and worked alongside adults. Our research really helped because we had data to support our ideas. This was important when we presented to our county commissioners. We prepared ahead of time for meetings and then could share powerful ideas.

We learned that our youth leadership really mattered because:

  • People see it differently when a young person is willing to change their community. They know it must be really important to them if they are willing to use their free time to do extra work.
  • Sometimes adults forget that they were once young too.
  • Youth are the ones with passion and know what other youth want.
  • We needed our youth research project to find those ideas and adults to help make it happen.

Rad Resources:

We used some really great resources to help with our research and youth-adult partnership:

The Youth-Led Evaluation Toolkit by Kim Sabo Flores gave us some great ideas, especially the ultimate chocolate chip cookie activity.

Participatory Visual and Digital Methods (Left Coast Press, 2013) by Aline Gubrium and Krista Harper gave us the idea of story mapping.

Colorado 9-25 helped us work with adults using their youth-adult partnership resources at co9to25.org. Their mission is to ensure that all Colorado youth are safe, healthy, connected, contributing and educated.


·

Hi! We are Krista Collins, Director of Strategy & Innovation at Boys & Girls Clubs of America (BGCA), and Mike Armstrong, Vice President of Club Operations and Evaluation at Boys & Girls Clubs of Metro Atlanta (BGCMA). Together we seek to understand how our professional development courses and youth programs work in tandem to support the 58,000 staff members in local communities across the nation who create opportunities for approximately 4 million youth each year to achieve great futures through our priority focus on Academic Success, Good Character and Citizenship, and Healthy Lifestyles.

Since 2011, BGCA has conducted the annual National Youth Outcomes Initiative (NYOI) to measure how effectively the Club experience is being implemented and its impact on our members. Built on research-informed indicators of youth achievement that align with our priority outcomes, and benchmarked against other leading national youth surveys, NYOI data is used to drive continuous quality improvement efforts and communicate our impact to key stakeholders across the youth development field.

Rad Resource: Looking for comparison data to understand the impact of youth development programs? Download our 2015 National Outcomes Report: Measuring the Impact of Boys & Girls Clubs. A few highlights from our report:

  • 74% of members aged 12-17 who attend the Club regularly say they earn mostly A’s and B’s, compared to 67% of youth nationally.
  • By 12th Grade, Club members’ rate of monthly volunteering is more than double that of the national average for same-grade peers.
  • Teens who stay connected to the Club as they get older seem better able to resist high-risk behaviors than teens nationally at the same ages.

Hot Tip: Sharing Club-level results and training on data utilization promote survey participation.

In four years the number of NYOI participants has grown from 2,800 Club members to 165,000 – that is an increase of almost 6000%! Much of this growth can be attributed to BGCA’s efforts to demonstrate the value of data to local Clubs. BGCA prepares reports for each participating Club organization, and provides local trainings and consultations to ensure that the results are interpreted correctly and used to drive improvement.
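The growth figure quoted above can be checked with a quick calculation:

```python
# Percentage increase in NYOI participants, from 2,800 to 165,000.
start, end = 2_800, 165_000
percent_increase = (end - start) / start * 100
print(f"{percent_increase:.0f}% increase")  # prints "5793% increase"
```

So the reported "almost 6000%" is accurate.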

Hot Tip: Data utilization requires learning that is strategic and intentional.

To fully realize the value that formal measurement and evaluation brings, local clubs have employed continuous quality improvement systems that integrate knowledge generation and decision-making at all levels of their organization. Decision making that affects everything from resource allocation at the corporate level to programmatic foci and staff assignments at the club-site level only occurs if a formal and iterative process of reflection and dialogue is practiced.

We’re looking forward to October and the Evaluation 2016 annual conference all this week with our colleagues in the Local Arrangements Working Group (LAWG). Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.


We are Valerie Hutcherson and Rebekah Hudgins, Research and Evaluation Consultants with the Georgia Family Connection Partnership (GaFCP) (gafcp.org). Started with 15 communities in 1991, Family Connection is the only statewide network of its kind in the nation with collaboratives in all 159 counties dedicated to the health and well-being of families and communities. Through local collaboratives, partners are brought together to identify critical issues facing the community and to develop and implement strategies to improve outcomes for children and families. The GaFCP strongly believes that collaboration and collective effort yield collective impact. Evaluation has always been a significant part of Family Connection, though capacity within each local collaborative greatly differs.

In 2013, GaFCP invited six counties to participate in a cohort focused on early childhood health and education (EC-HEED) using the Developmental Evaluation (DE) framework developed by Michael Quinn Patton (Patton, 2011. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use). Each county was identified by GaFCP based on need and interest in developing an EC-HEED strategy and had the autonomy to identify collaborative partners, programs, and activities to create a strategy tailored to meet the needs and resources of the county. As evaluators, we recognized the collaborative and their strategy formation as existing in a complex system with multiple partners and no single model to follow. The DE approach was the best fit for capturing data on the complexity of the collaborative process in developing and implementing their strategies. DE allows for and encourages innovation, which is a cornerstone of the Family Connection collaborative model. Further, this cohort work gave us, as evaluation consultants, the unique opportunity to implement an evaluation system that recognized that understanding this complexity and innovation was as important as collecting child and family outcome data. With DE, the evaluator’s primary functions are to elucidate the innovation and adaptation processes, track their implications and results, and facilitate ongoing, real-time, data-based decision-making. Using this approach, we were able to engage in and document the decision-making process, the complexity of the relationships among partners, and how those interactions impact the work.

Lessons Learned: Just a few of the lessons we’ve learned are:

  1. Participants using a DE approach may not recognize real-time feedback and evaluation support as “evaluation”. Efforts must be taken throughout the project to clarify the role of evaluation as an integral part of the work.
  2. Successful DE evaluation in a collaborative setting requires attention to the needs of individual partners and organizations.
  3. The DE evaluator is part anthropologist and thus must be comfortable in the emic-etic (insider-outsider) role: a member of the team as well as one involved in elucidating the practice and work of the team.

