AEA365 | A Tip-a-Day by and for Evaluators

Category: Community Psychology

Hi, we are Olya Glantsman, a visiting professor of Psychology at DePaul University, and Judah Viola, dean of the College of Professional Studies and Advancement at National Louis University in Chicago. Last month we celebrated the release of our book, Diverse Careers in Community Psychology.

The idea for Diverse Careers in Community Psychology was born as a response to a question our students often pose: “What can I do with a degree in Community Psychology?” Each time we heard this question, we became more convinced that a text like this is long overdue for students (undergraduate or graduate), professionals looking to expand, shift, or change their careers, and anyone mentoring or advising community-minded students or employees.

Below are some hot tips culled from an extensive career survey of more than 400 participants and from 23 chapters written by over 30 community psychologists (CPs) with various backgrounds.

Hot Tips: When looking for a job

  • Keep in mind that those with community-oriented degrees do not have a problem finding a job
  • Many professionals successfully market their job skills and competencies rather than their degree, finding a “niche” by applying the unique interests and talents their organization or team needs
  • Many survey participants cited Practice Competencies as helping them secure a job and reported using the same competencies throughout their work
    • All respondents reported using between five and fourteen competencies
  • When searching for a practice job, start preparing while still in training, cast a broad net, and search multiple disciplines
  • Obtaining mentorship and networking are two of the most important activities one can participate in.
    • More than half (59%) of survey respondents reported that they found out about their current job through networking

Rad Resources: Job Search

  • Participants use multiple search techniques when looking for employment (e.g., job postings, networking, listservs, etc.)
  • Sites for finding practice-related jobs:
    • The AEA, APA, and American Public Health Association (APHA) websites, Indeed.com, Idealist.org, npo.net, simplyhired.com, and careerbuilder.com

Rad Resources: Job Training

Whether you are beginning your career or trying to expand or shift into a new arena, there are lots of options and opportunities. Whatever your journey, we hope you will find more helpful tips and hints in our book.

The American Evaluation Association is celebrating Community Psychology TIG Week with our colleagues in the CP AEA Topical Interest Group. The contributions all this week to aea365 come from our CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Our names are Susan M. Wolfe and Kyrah K. Brown, and we are consultants at CNM Connect, where we provide evaluation and capacity-building services to nonprofit organizations. Our work also includes evaluating community collaborations and coalitions. Effectively addressing most health, education, and other social issues at a systems level requires that communities confront inequity and injustice.

RAD RESOURCE: In January 2017, an article titled “Collaborating for Equity and Justice: Moving Beyond Collective Impact” was published in the Nonprofit Quarterly.

The authors presented the following six principles to promote equity and justice; each has implications for how coalitions and community collaboratives are evaluated.

Principle 1: Explicitly address issues of social and economic injustice and structural racism.

  • HOT TIP: Nearly all human problems, especially where there are disparities in outcomes, can be traced to social and economic injustice and/or structural racism. As an evaluator, examine whether these issues are being discussed and directly addressed.

Principle 2: Employ a community development approach in which residents have equal power in determining the coalition’s or collaborative’s agenda and resource allocation.

  • HOT TIP: Ask who has the actual power to make decisions and set agendas for the collaborative.

Principle 3: Employ community organizing as an intentional strategy and as part of the process. Work to build resident leadership and power.

  • HOT TIP: Closely examine the membership and leadership to determine the extent to which residents, or those who are directly affected by the issue at hand, are members and leaders.

Principle 4: Focus on policy, systems, and structural change.

  • HOT TIP: Review the agendas and activities to determine whether they are promoting more programs, or facilitating change in policies, systems, and structures.

Principle 5: Build on the extensive community-engaged scholarship and research over the last four decades that show what works, that acknowledge the complexities, and that evaluate appropriately.

Principle 6: Construct core functions for the collaborative based on equity and justice that provide basic facilitating structures and build member ownership and leadership.

RAD RESOURCE: The Collaborating for Equity and Justice Toolkit provided by The Community Tool Box can be accessed at: https://www.myctb.org/wst/CEJ/Pages/home.aspx

HOT TIP: Many nonprofits and health agencies are engaged in collaborative work and are often looking for effective frameworks to model their efforts on. Evaluators can use the Collaborating for Equity and Justice Toolkit to facilitate discussions, coalition development, and planning efforts. When you introduce nonprofits and collaboratives to the framework, it may help to give brief presentations or facilitate interactive planning sessions. Prepare guided questions that help nonprofits think through how the six principles apply to their work. Applying this framework can be useful both for refining or developing intentional coalition goals and for evaluating a coalition’s efficiency and effectiveness in meeting them.


 


Hi! I’m Tara Gregory, Director of the Center for Applied Research and Evaluation (CARE) at Wichita State University. Like most evaluators, CARE staff are frequently tasked with figuring out what difference programs make for those they serve. So we tend to focus intently on outcomes and see outputs as the relatively easy part of evaluating programs. However, a recent experience reminded me not to overlook the importance of outputs when designing and, especially, communicating about evaluations.

In this instance, my team and I had designed what we thought was a really great evaluation that covered all the bases in a particularly artful manner – and I’m only being partially facetious. We thought we’d done a great job. But the response from program staff was “I just don’t think you’re measuring anything.” It finally occurred to us that our focus on outcomes in describing the evaluation had left out a piece of the picture that was particularly relevant for this client – the outputs or accountability measures that indicated programs were actually doing something. It wasn’t that we didn’t identify or plan to collect outputs. We just didn’t highlight how they fit in the overall evaluation.

Lesson Learned: While the toughest part of an evaluation is often figuring out how to measure outcomes, clients still need to know that their efforts count in terms of the stuff that’s easy to count (e.g., number of people served, number of referrals, number of resources distributed). Although delivering a service doesn’t necessarily mean it was effective, it’s still important to document and communicate the products of those efforts. Funders typically require outputs for accountability, and programs place value on this tangible evidence of their work.
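
Because outputs are the easy-to-count stuff, even a few lines of code can produce them from a service log. Here is a minimal sketch; the record fields and values are invented for illustration, not drawn from the CARE evaluation:

```python
from collections import Counter

# Hypothetical service log: each record represents one delivered service.
# Field names and values here are illustrative only.
service_log = [
    {"service": "referral", "site": "north"},
    {"service": "referral", "site": "south"},
    {"service": "class_session", "site": "north"},
    {"service": "resource_packet", "site": "north"},
    {"service": "class_session", "site": "north"},
]

# Outputs are simple tallies of what the program actually did.
outputs = Counter(record["service"] for record in service_log)

print(f"Total services delivered: {sum(outputs.values())}")
for service, count in sorted(outputs.items()):
    print(f"  {service}: {count}")
```

Reporting these tallies alongside outcome measures gives clients the accountability evidence they expect without displacing the outcome story.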

Cool Trick: In returning to the drawing board for a better way to communicate our evaluation plan, we created a graphic that focuses on the path to achieving outcomes, with the outputs offset to show that they’re important but not the end result of the program. In an actual logic model or evaluation plan, we’d name the activities, outputs, and outcomes more specifically based on the program, but this graphic helps keep the elements in perspective.

Example graphic of outputs and outcomes


Hi folks! I’m Jill Scheibler, a community psychologist and Senior Research Analyst at Carson Research Consulting (CRC), a women-led firm whose mission is to help clients thrive by using data to measure impact, communicate, and fundraise. We’re passionate about storytelling with data to make a difference.

At CRC I’m the “word nerd”, implementing our qualitative projects. Like many evaluators, I’ve had to translate academically-honed skills to the often faster-paced world of evaluation. A recent project for a county health department’s substance abuse initiative provides an example of how I tailor qualitative methods to meet clients’ needs.

Hot Tips

Allot ample time for clarifying goals. As with all good research, methods choices flow from the question at hand. In this case, our client wanted to understand the impact of substance abuse on their county and to identify new resources to tap. Like many clients, they lacked research savvy and thought they required services exceeding their budget and available time. We gradually learned they had access to lots of quantitative data, plus support from the state to help interpret it. What they were missing was community stakeholder feedback. So we provided a qualitative needs-assessment component.

Build in more meetings than you think you’ll need, and bring checklists. Be prepared to leave meetings thinking you have all the answers you need, only to learn afterwards that you’ve been (well-meaningly) misinformed! (Quantitative sidebar: after building a data dashboard for another client in Excel 2013, on their word, we learned they had Excel 2007. A costly reminder to always ask more questions!)

Choose tool(s) carefully to maximize usefulness. I generally opt for interviews, where probes can offset “one-shot” data collection situations. Here, I instead designed a qualitative survey, using mostly open-ended questions, for efficient gathering of perspectives. The client collected surveys themselves, disseminating hard copies and a SurveyMonkey.com link, and reached a targeted sample from within a community coalition.

Familiar guidelines for interview and survey design apply to qualitative surveys, but I advise keeping questions very focused and surveys as short as possible to mitigate higher skip rates with qualitative surveys.
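
Skip rates are easy to monitor during collection. Here is a minimal sketch that flags open-ended items respondents routinely leave blank; the question IDs and responses are invented for illustration:

```python
# Hypothetical qualitative survey responses: one dict per respondent,
# with empty strings for skipped open-ended questions.
responses = [
    {"q1_impact": "More overdoses downtown", "q2_resources": ""},
    {"q1_impact": "Youth access to vaping", "q2_resources": "Peer recovery groups"},
    {"q1_impact": "", "q2_resources": ""},
    {"q1_impact": "Stigma keeps people from treatment", "q2_resources": ""},
]

def skip_rate(responses, question):
    """Fraction of respondents who left the question blank."""
    blanks = sum(1 for r in responses if not r.get(question, "").strip())
    return blanks / len(responses)

for q in ("q1_impact", "q2_resources"):
    print(f"{q}: {skip_rate(responses, q):.0%} skipped")
```

A question most respondents skip is a candidate for rewording, tightening, or cutting in the next round.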

Cool Trick

You may think your reporting options are limited compared to quantitative results. Not so! Instead of writing text-heavy reports that eat up valuable time and that folks are disinclined to read (#TLDR), consider telling “data stories” using bullet points and visualizations. This client received a two-pager for internal, local stakeholder, and state use. I’ll also provide an in-depth explanation of results and action steps in a webinar.

Rad resources

Jansen’s “The Logic of Qualitative Survey Research and its Position in the Field of Social Research Methods.”

Great tips on qualitative surveys from Nielsen Norman.

Awesome tips from CRC colleagues for larger community surveys.

Achievable qual visualization ideas from Ann Emery.

Some tools for qual analysis and visualization from Tech for Change.

I genuinely enjoy working creatively with clients because it makes evident how well suited qualitative methods are for linking research to action. I’d love to hear how others do this work, so please get in touch!




Aloha! I am Anna Smith Pruitt, a community and cultural psychology doctoral candidate at the University of Hawai‘i at Mānoa. I currently work with Dr. Jack Barile and his Ecological Determinants Lab evaluating community mental health and housing programs that serve people experiencing severe mental illness and homelessness on O‘ahu. These program participants are medically and socially vulnerable due to their compromised mental and physical health and marginalized social identities. Community psychology (CP) emphasizes that any interaction – including evaluation – is an intervention, and we strive to make our evaluations empowering experiences for program participants. I have found CP methods and values – particularly collaboration, empowerment, citizen participation, and social justice – instructive in evaluating programs designed to serve marginalized groups. Relying on CP values and using a community-based participatory approach, Dr. Barile and I engaged participants in a Housing First (HF) program as co-researchers and together created an ongoing evaluation partnership. Below are tips and resources we found useful:

HOT TIP: Collaborate!

One way community psychologists work to empower marginalized groups is by including members as partners in the research process. Such participatory approaches can be difficult when working with vulnerable individuals who may be actively experiencing physical and mental health complications (e.g., psychosis), and often evaluators (myself included!) assume these individuals are incapable of meaningfully participating in research. In my experience, even the most vulnerable member can engage in the evaluation process!

RAD RESOURCE: For info on participatory approaches, see the Community Toolbox’s chapters on CBPR research and participatory evaluation.

HOT TIP: Intentionally build advocacy into the evaluation design.

Once empowered, marginalized groups often want to use their power to change the social conditions that contribute to their marginalization. In other words, empowerment leads to increased citizen participation and advocacy! At the conclusion of data analysis, HF participants chose to advocate for themselves using program evaluation data.

RAD RESOURCE: Check out the Community Toolbox’s Advocating for Change Toolkit.

Photovoice exhibit

HF participants advocated for themselves through a Photovoice exhibit (above).

HOT TIP: Practice self-reflexivity/critical awareness.

When evaluating programs with marginalized groups, constantly examine your privilege and stay aware of the power dynamics and historical context that shape your relationship with participants and their experiences. Given Hawai‘i’s colonial history and the fact that a disproportionate number of people experiencing homelessness are Native Hawaiian, my status as a member of the colonizer group necessarily affected my relationships with program participants.

RAD RESOURCE: Assess your privilege by calculating your privilege capital. Identify and purge your biases with the Bias Cleanse.

HOT TIP: Be creative and flexible in choosing methods.

Participants will have varying skills and capabilities, and you will need to creatively strategize ways to involve program participants in the evaluation process. The Photovoice method was useful for the HF participants because it allowed for both visual and verbal contributions to knowledge building.

RAD RESOURCES: Need help brainstorming methods? See the Action Catalogue, an interactive tool for choosing methods, and ParticipatoryMethods.org.



Greetings!  Welcome to Community Psychology TIG Week!  I, Carissa Coleman, a Community Psychologist from James Bell Associates, along with the other members of the TIG Leadership Team, welcome you to a week of Community Psychology and its influence on evaluation work.

Our Community Psychology ideals span many disciplines, including psychology, social work, education, medicine, and all types of prevention work.

We invite you to visit our website at http://comm.eval.org/communitypsychology/home/ to learn more.

Moving our Field: Toward Theory, Systems, and Dynamic Methods

As a Community Psychologist, I, Leonard A. Jason from DePaul University, would like to offer three ideas that have the potential to energize and transform our field. They involve theoretical perspectives, appreciation of the complexities of the natural world, and dynamic methodological tools that can be used to capture these complex processes.

Many of us work in the field of evaluation to better understand the relationship between people and their contexts in ways that might alleviate human suffering.  Yet, as argued in a recent special issue on Theories in the Field of Community Psychology, the ideological nature of our work, which prioritizes efforts to improve people’s lives, can result in less willingness to consider the possible contribution of theory.  I am not arguing that our work will coalesce around only one theory, but I believe there has been an unfortunate reluctance to attempt to develop predictive theory, in part because it is seen as a distraction from taking action. However, there is no obvious reason why sound theory cannot be developed that increases the effectiveness of our social action efforts and helps us better understand the complexities of people and groups living within multifaceted ecosystems.

Theory must contend with a natural world that is endlessly beautiful and elegant, but also one that often feels mysterious, unpredictable, and filled with contradictions. Dynamic feedback loops are the norm within this organic stew, and as a consequence, our work would be more contextually rich if it transcended reductionistic, simplistically linear cause-and-effect methods. Theories can help us capture a systems point of view, in which the reality of the ever-changing world is made up of mutual interdependencies in how people adapt to and become effective in diverse social environments.

Rad Resource:  Are there methods that help us conceptualize and empirically describe these transactional dynamics? There are, such as those contained within the Handbook of Methodological Approaches to Community-Based Research, which profiles a new generation of quantitative and qualitative research methods that are holistic, culturally valid, and support contextually- and theoretically-grounded community interventions. Mixing qualitative and quantitative research methods can provide deeper exploration of causal mechanisms, interpretation of variables, and contextual factors that may mediate or moderate the topic of study. Theories and sophisticated statistical methods can help us address questions of importance for the communities in which and with whom we work by capturing the dynamics of complex systems and providing us the potential to transform our communities in fresh and innovative ways.



Growing up in the South with my name – Lovely Dhillon – assured me of a few things.  One, people would likely remember me, and two, that I would have to try to live up to an often used Southern expression, “lovely is as lovely does.”  After having moved back to the South last year after over 20 years away, I realize how much I missed the colloquialisms, the regularly warm hellos, the willingness to engage in insightful conversation, and the sense of community.

As my new hometown, Atlanta, welcomes the AEA membership, I know Atlanta will provide that warmth, knowledge, community and, truth be told, calories.  I wonder, however, what we, as a membership organization, will provide Atlanta, other host cities and the larger community around us.

Please join us as we dig into just this question through a think tank, “Designing AEA’s Collective Impact,” on Friday, October 28th in Room L505 (8:00 am – 9:30 am).  Drawing from AEA’s mission to “support the contribution of evaluation to the generation of theory and knowledge about effective human action,” Beverly Parsons, Denise Roosendaal, Matt Keene, Susan Wolfe, and I will engage with you about ways the AEA membership does, could, or should work collectively to impact the communities in which we hold our annual meetings, and/or engage collectively in broad social issues.  We will investigate what organizations in other sectors do toward collective action and consider what AEA members can design and set in motion in the next few months.

Cool Tricks:

One example of AEA action is the Community Psychology TIG sponsoring “Walk the Talk” sessions at annual conferences.  In Atlanta this year, the TIG will visit the Georgia Justice Project: http://www.gjp.org/. Participants will have a chance to interact with project staff and members and learn about their evaluation questions, challenges, successes, and needs.  Other AEA members are highlighting small, local nonprofits in workshop sessions as case examples so that AEA members can provide insights and suggestions that might otherwise be inaccessible.

Lessons to Be Learned:

We expect there are many other examples of how AEA members are working together to contribute to organizations and issues outside of our clients, colleagues or institutions.  This session will be a great way to hear all about those ideas and come up with others.

Hot Tip:

As you pack your bags for your trip to the city that was key to the civil rights movement, come with ideas of what we, as a membership, can do collectively to advance positive social change.

Oh – and remember to come with a warm smile and an empty stomach!



My name is Emily Spence-Almaguer and I am an Associate Professor of Behavioral and Community Health at the University of North Texas Health Science Center. I spend most of my professional time serving as an independent evaluator for community initiatives and conducting assessment studies. I am a social worker by training and have found that the conversational skills used in Solution-Focused Therapy have great application in the realm of evaluation and community assessment.

Hot Tips: My favorite ways to use solution-focused dialogues are in:

  • Focus group and individual interviews because they help generate rich qualitative data and great ideas for continuous program improvements.
  • Evaluation planning meetings because they help stakeholders articulate a wide range of potential outcomes and describe how those outcomes might be observed (i.e., measured).
  • Meetings where stakeholders are being debriefed around disappointing evaluation results. The nature of solution-focused dialogues avoids finger-pointing and helps drive forward momentum.

Hot Tips:

  • It’s all about the questions!! Solution-focused dialogues are driven by questions that promote deep reflection and critical thinking.
  • Context: Use questions that help situate people’s minds in a particular context, with details that encourage individuals to imagine themselves in that moment. Here’s an example that I use with consumers at a program trying to help lift individuals and families out of poverty:
    • I want you to take a moment and imagine that you just learned that the Bass [local philanthropist] family recently donated $100,000 to the United Way for this project. They want you to help them figure out how to best spend the money. What is the first thing you would advise them to do? What would you advise them to do next?
  • Expertise: I love the way that Gaiswinkler and Roessler referred to this as the “expertise of not-knowing.” In solution-focused dialogues, the wording of questions and tone of delivery are carefully crafted to amplify the assumption that stakeholders have exceptional knowledge, skills, and capacities.

Rad Resource: For an introduction to solution focused concepts, I like Coert Visser’s Doing What Works Blog.


Rad Resource: I presented on Solution-Focused dialogues in evaluation at AEA’s Evaluation 2012 conference. You can download my poster and resources list from the AEA public eLibrary here.

Lessons Learned: A direct question, such as “What would you recommend to improve this program?” often fails to generate detailed or meaningful responses. In focus groups with program consumers, I find that this question is interpreted as “what is wrong with the program?” and may lead to comments in defense of the program staff members (see my 2012 AEA poster for an example of this from my data).

The American Evaluation Association is celebrating Best of aea365, an occasional series. The contributions for Best of aea365 are reposts of great blog articles from our earlier years.

 


We are Natalie Wilkins and Shakiyla Smith from the National Center for Injury Prevention and Control, Centers for Disease Control and Prevention.

As public health scientists and evaluators, we are charged with achieving and measuring community and population level impact in injury and violence prevention. The public health model includes: (1) defining the problem, (2) identifying risk and protective factors, (3) developing and testing prevention strategies, and (4) ensuring widespread adoption. Steps 3 and 4 have proven to be particularly difficult to actualize in “real world” contexts. Interventions most likely to result in community level impact are often difficult to evaluate, replicate, and scale up in other communities and populations.[i]

A systems framework for injury and violence prevention supplements the public health model by framing injury within the community/societal context in which it occurs.[ii] Communities are complex systems: constantly changing, self-organizing, adaptive, and evolving.  Thus, public health approaches to injury and violence prevention must focus more on changing systems than on developing and testing isolated programs and interventions, and must build the capacity of communities to implement, evaluate, and sustain these changes.[iii] However, scientists and evaluators face challenges when trying to encourage, apply, and evaluate such approaches, particularly in collaboration with other stakeholders who may have conflicting perspectives. A systems framework requires new methods of discovery, collaboration, and facilitation that effectively support this type of work.

Lessons Learned:

  • Evaluators can use engagement and facilitation skills to help stakeholders identify their ultimate goals/outcomes and identify the systems within which these outcomes are nested (Goodman and Karash’s Six Steps to Thinking Systemically provides an overview for facilitating systems thinking processes).
  • Evaluators must also address and communicate around high-stakes, conflictual issues that often undergird intractable community problems. “Conversational capacity”[iv] is an example of a skillset that enables stakeholders to be both candid and receptive in their interactions around challenging systems issues.

Rad Resources:

  • Finding Leverage: This video by Chris Soderquist provides an introduction to systems thinking and how it can be applied to solve complex problems.
  • The Systems Thinker: Includes articles, case studies, guides, blogs, webinars and quick reference “pocket guides” on systems thinking.

i. Schorr, L., & Farrow, F. (2014, November). An evidence framework to improve results. Harold Richman Public Policy Symposium, Washington, DC: Center for the Study of Social Policy.

ii. McClure, R. J., Mack, K., Wilkins, N., & Davey, T. M. (2015). Injury prevention as social change. Injury Prevention.

iii. Schorr, L., & Farrow, F. (2014, November). An evidence framework to improve results. Harold Richman Public Policy Symposium, Washington, DC: Center for the Study of Social Policy.

iv. Weber, C. (2013). Conversational capacity: The secret to building successful teams that perform when the pressure is on. New York, NY: McGraw-Hill Education.


 


Hi! I am Cathy Lesesne and I work at ICF International doing public health related evaluation and research. My passion is doing work that affects the lives of adolescents, particularly those with the most need and the least voice in how to meet those needs. I do a lot of work in and with schools and school districts focused on optimal sexual health for teens and how to ensure youth have skills and ability to make healthy choices no matter when they decide to engage in sexual activity.

I often see well-intentioned school or school district staff creating solutions for youth and testing them, rather than involving youth in identifying solutions and evaluating their success. It is clearly easier to retain the power to determine the solutions and then use evaluation to see whether they worked in the end. However, in my own work I have seen the power of youth engagement and involvement both in developing programs and services and in helping to evaluate and improve those resources.

Rad Resources: As evaluators, we often have the ability to make recommendations to our clients and partners working with youth AND we have the power to approach our evaluation work with youth in empowering and engaging ways. But we don’t always know how. I highly recommend that you dig into the Youth-Adult Partnerships in Evaluation (Y-AP/E): A Resource Guide for Translating Research into Practice and find your own ways to apply the wide range of ideas, tip sheets, and examples for engaging youth as partners in evaluation. Many of these examples may also help your clients or partners think of ways to better engage youth in the development of programs and services that reflect them and their real interests and needs. If youth are empowered to be partners in developing and testing solutions, they become allies instead of subjects; sources of solutions instead of sources of data.


