AEA365 | A Tip-a-Day by and for Evaluators



YFE TIG Week: Kim Sabo Flores on Data-Driven Decision Making in Youth Focused Evaluation

Hi, my name is Kim Sabo Flores, and I am the Co-Founder of Algorhythm. Over the last 20+ years, I have worked as an evaluator in the field of youth development. Recently I’ve observed an unfortunate trend in the field: A LOT OF TALK about “evidence-based,” “research-based,” and “data-driven” decision-making, and very LITTLE ACTION. This is particularly true for youth practitioners working on the front lines of social change, where data could have the greatest impact. Why, in this rich era of information and technology, is this still a challenge?

Here are a few Hot TIPS:

Bring the power of data to the front lines of social change: Data is power! And for the most part that power is held by senior-level staff and has been used to leverage resources rather than to drive programmatic decision-making. It is rarely the case that evaluation findings are shared and analyzed with front-line staff, and their ability to understand and use data effectively is widely underestimated.

Hot Tip: Support ALL staff to learn from and make meaning of data; be sure they’re included when you share your findings.

Value rather than evaluate: Research reports, and even evaluation reports, are written for and consumed by academics and funders. They leave practitioners with limited practical information about how to improve outcomes for ALL youth, especially those who are most difficult to serve.

Hot Tip: Utilize predictive and prescriptive analytics that focus on what “works” for each and every youth, valuing all the various pathways taken toward success rather than just those taken by the “average” youth.

Measure what matters: Driven by funding demands, program staff spend precious time and resources capturing mandated data such as report cards, test scores, and attendance records, with the full knowledge that these metrics do not fully tell their story and are not fully attributable to their programs. Front-line workers are tired of gathering meaningless data that doesn’t answer their questions.

Hot Tip: Use research-based social/emotional measures to show proximal gains that contribute to academic achievement, reduction of risk and thriving. These types of outcomes speak directly to the work of youth development and allow front-line staff to see their contribution.

Provide timely insights at a low cost: Take advantage of new technologies that allow programs to gather data, analyze it immediately, and put it to use. Such technologies increase data utilization and ultimately increase the impact on youth. Best of all, they drastically decrease the cost, allowing more nonprofits to afford evaluation, and to afford it more often!

Rad Resources:

Foundation for Young Adult Success: UChicago CCSR Concept Paper for Research and Practice, June 2015.

FREE webinar: “21st Century Impact Measurement for Youth Serving Organizations,” where you can learn more about a game-changing approach to impact measurement.

The American Evaluation Association is celebrating Youth Focused Evaluation (YFE) TIG Week with our colleagues in the YFE AEA Topical Interest Group. The contributions all this week to aea365 come from our YFE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.



  • Kelly K Garcia · March 9, 2016 at 4:42 am

Dear Ms. Flores,

I am currently a student at Texas A&M University-Central Texas. I am taking a Program Evaluation course where we get to participate in certain evaluation projects. One of those projects, which some of our class members are working on, involves youth. Thank you for pointing out some of the issues with data when assessing youth. Data needs to be brought to the front lines, and we have to make sure that what we learn and apply helps ALL youth.

    Thank you,


  • Steve · March 9, 2016 at 2:13 am

    Hi Kim,

Thank you for your insightful article. I am a teacher who agrees with your data-driven focus for youth focused evaluation. It seems to be an occupational hazard for teachers to prioritize talk over action. This discourages real change from developing, because a deviation from the status quo may induce concern, feigned or otherwise, in those not comfortable with the pace of change. Rogers’ Diffusion of Innovations describes the various temperaments and stages any potential change to the status quo goes through. In my experience, every staff or reasonably large community personifies the roles in this model. If there are too many innovators, ideas may jump around without having time to germinate.

    Yet, if an organization is too stifling regarding innovation, it will fossilize and eventually atrophy. Most people won’t take action without seeing someone first lead the way. The challenge is to seek to balance out the often contrasting views into something cohesive while taking action. So-called Laggards may use any discord or discomfort as reason enough to avoid taking action when the real concern may be a fear of action.

I appreciate your point regarding the context given (by senior-level staff) for data. The volume of atomized data I see is rising exponentially, and using it cohesively and efficiently remains a challenge. A culture change in which management listens to front-line workers is needed, rather than top-down initiatives that may be divorced from the reality on the ground.

Your point about seeking to serve not just the average young person but all of them is well-taken. Using smart data to establish a better initial baseline would get away from solutions disconnected from those who would use your services (what would it look like at the low, medium, and high levels, and how would our service accommodate those people?).
Many teachers I have talked to want to give more meaningful and timely feedback, but the current reporting paradigm was designed for a vastly different time. People may rightly fear a steadily growing expectation of being “on” all of the time, even when the solutions may offer a benefit.

Collaboration tools like Slack or Google Forms can make sharing common knowledge an effective and data-driven process. It’s an evolving process, but I appreciate your contribution to using data to serve us rather than the reverse.
    Thank you,


  • Barbara Burrow · February 29, 2016 at 3:33 pm

    Hello Ms. Flores,

Your post likely reflects what many in program evaluation have experienced. Data is power, and it is useful for securing funding, but programming and outcomes are improved by sharing that data with those who directly affect outcomes. Part of professional development for staff should include guidance and education on how to read, interpret, and apply the findings. If management were to improve on sharing the information, they might be pleasantly surprised by the value and insights their staff bring.

There is a small group of undergraduate students initiating an evaluation of an afterschool program. This will be a first for this program, and I can only hope that each subsequent evaluation improves on the first, thereby helping them improve outcomes, which in turn improves funding.

    Thank you for your post,
    Barbara Burrow


  • Beki Saito · November 11, 2015 at 6:52 am

I am so right there with you, Kim, in terms of walking along the edge of the crossroads of youth development and developmental, participatory evaluation. Best, beki

