AEA365 | A Tip-a-Day by and for Evaluators


¡Saludos! We are Lisa Aponte-Soto and Saúl I. Maldonado, co-chairs of the Latina/o Responsive Evaluation Discourse (LA RED) TIG and AEA GEDI alumni. Aponte-Soto is the National Program Deputy Director of RWJF New Connections at Equal Measure, and Maldonado is a lecturer at Santa Clara University’s School of Education.

Content for our TIG Week features updates from AEA 2015 and discussions about evaluation theory and practice. Our post highlights the Birds of a Feather LA RED session at Evaluation 2015, “How do we attend to evaluation with a Latina/o Cultural Lens?” Facilitators shared experiences and resources and engaged attendees in dialogue about culture, context, and Latina/o responsive evaluation (LRE) practices.

Lessons Learned:

  • Attend to Cultural Values – Respeto (respect) and familismo (collectivism) are among the cultural values vital for gaining confianza (trust). Showing respeto to Latina/o communities requires staying humble, asking thoughtful questions, and sharing decision-making. It may also entail providing additional space or activities to accommodate participants’ children and extended family members.
  • Be Inclusive of Language and Linguistic Differences – To maintain the integrity of evaluation results, it is important to know the community and to prepare protocols and instruments in both Spanish and English. Translation alone does not guarantee an instrument’s appropriateness for the Latina/o subgroup(s) being served, and attending to these differences is critical to practicing LRE. While this may be challenging, it is necessary to communicate these considerations to funders, colleagues, and partners.
  • Be Inclusive of Community – An LRE approach demands a multilayered process rooted in community participatory approaches that engage Latina/o staff, leaders, advocates, and community members. Meaningful collaboration with promotoras (lay community workers) and other community members is always appropriate, as they are the most attuned to culturally responsive community needs.
  • Beware of Power Differentials – As evaluators, it is important to remain mindful of professional privileges that influence power differentials when engaging with communities – even if you are a part of the community, are Latina/o, and/or live, socialize, and work with Latina/os. Being reflective of one’s value systems, expertise, and stakeholder expectations may prevent culturally inappropriate partnerships.

Hot Tip #1: Stakeholder Engagement – Navigating community, stakeholder, and client needs requires advocacy to ensure that marginalized groups are represented. Excluding voices leads to erroneous results, but so does over-adjusting the evaluation design.

Hot Tip #2: Efficiency Isn’t Always Effective – Organizational structures are important when conducting evaluations, but overemphasizing efficiency can compromise the most effective collaborations with stakeholders.

Rad Resource: The Building Evidence Toolkit is a free receta (recipe) for Latina/o community-based organizations to document their programs’ outcomes.

Rad Resource: LA RED recognizes evaluators have individual experiences that encompass multiple identities beyond race/ethnicity. LA RED is a space for evaluators working collaboratively with/for Latina/o communities regardless of their personal racial-ethnic background. To join the discourse, please email us at lared.tig@gmail.com.

The American Evaluation Association is celebrating Latina/o Responsive Evaluation Discourse TIG Week. The contributions all this week to aea365 come from LA RED Topical Interest Group members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Ann Price and I am the President of Community Evaluation Solutions, Inc. (CES), a consulting firm based just outside Atlanta, Georgia. I am a community psychologist and infuse environmental approaches into my work developing and evaluating community prevention programs. Much of that work involves community coalitions.

Hot Tip: Appreciate how long it takes for community coalitions to mature. Often, community members want to jump in and get right to work. However, the first thing community coalitions need to do is develop structures and processes that will help ensure their long-term success. It may be helpful for you to work with your coalition to develop a logic model that details what steps the coalition needs to take in order to be successful. Here is one example from our work with the Drug-Free Coalition of Hall County, based on Fran Butterfoss and Michelle Kegler’s Community Coalition Action Theory (2002). Having this logic model helped coalition members focus on establishing a good foundation and recognize the importance of planning and evaluation.

[Image: Drug-Free Coalition of Hall County logic model]

Rad Resource: Fran Butterfoss’s book, Coalitions and Partnerships in Community Health (2007), is a great reference book for coalition leaders, researchers and evaluators. It includes surveys that coalition leaders can use to assess the health of their coalition.

Rad Resource: Fran Butterfoss has a new book, Ignite! Getting Your Community Fired Up for Change, an excellent and accessible resource for coalition leaders and members filled with tips to inspire coalitions to action. 

Hot Tip: Community Anti-Drug Coalitions of America (CADCA) is another good resource for both coalitions and evaluators. They host the National Leadership Forum each December in Washington, D.C. and the Mid-Year Training Institute held at various locations around the country. Both meetings include one-to-one coaching for coalition leaders and a separate track for youth, the National Youth Leadership Initiative.

Lesson Learned: “Evaluation as intervention” is a concept I have been pondering lately. When you find your coalition is stuck in a “meet and talk” rut, think about redesigning the evaluation to focus on the environmental change strategies the coalition has implemented and the community reach of each strategy. Work on documenting the link between their chosen strategies and community outcomes. Then, use evaluation data to provide more timely feedback to the coalition. This would be a great opportunity to involve coalition members in discussions about where they are, where they would like to be and how, working together, they can get there.

The American Evaluation Association is celebrating CP TIG Week with our colleagues in the Community Psychology Topical Interest Group. The contributions all week come from CP TIG members.


Hi all! Liz Zadnik here, aea365 Outreach Coordinator and occasional Saturday contributor. I wanted to share some insights and reflections I had as a result of a recent EVALTALK discussion thread. Last month, someone posed the following request:

I’m searching for a “Why Evaluate” article for parents/community members/stakeholders. An article that explains in clear and plain language why organizations evaluate (particularly schools) and evaluation’s potential benefits. Any suggestions?

Rad Resources: Others were kind enough to share resources, including a slideshare deck that walks through language and reasoning for program evaluation and assessment, as well as book recommendations. There is also a very helpful list from PlainLanguage.gov offering possible replacements for commonly used words. (Even the headings – “Instead of…” and “Try…” – make the shift seem much more manageable.)

Lessons Learned: Making evaluation accessible and understandable requires tapping into an emotional and experiential core.

  • Think about never actually saying “evaluate” or “evaluation.”  It’s OK not to use phrases or terms if they are obstacles for engaging people in the evaluation process.  If “capturing impact,” “painting a picture,” “tracking progress” or any other combination of words works…use it!  It may be helpful to talk with interested or enthusiastic community members about what they think of evaluation and what it means to them.  This helps gain insight into relevant language and framing for future discussions.
  • Have the group brainstorm potential benefits, rather than listing them for them.  Just as you engage community members in discussing the “how,” ask them what they feel is the “why” of evaluation.  I have heard the most amazing and insightful responses when I have done this with organizations and community members.  Ask the group “What can we do with the information we get from this question/item/approach?” and see what happens!
  • Evaluation is about being responsible and accountable.  For me, program evaluation and assessment is about ethical practice and stewardship of resources.  I have found community members and colleagues receptive when I frame evaluation as a way to make sure we are doing what we say we’re doing – that we are being transparent, accountable, and clear on our expectations and use of funds.

We’d love to hear how others in the aea365 readership are engaging communities in accessible conversations about evaluation.  Share your tips and resources in the comments section!



Hello! My name is Angela Fitzgerald and I am a Senior Researcher with the National Council on Crime and Delinquency (NCCD: www.nccdglobal.org) – a nonprofit that works to promote just and humane social systems. I have been involved in and have witnessed evaluation work from a number of vantage points, and a challenge that seems to consistently plague organizations is engaging community members (one subset of stakeholders) in the evaluation process. Engaging community members helps ensure that evaluation content is understood by and relevant to that audience, identifies advocates who can champion the projects being evaluated, and lends credibility to evaluation findings. I have compiled a few lessons learned to help organizations overcome this challenge.

Lessons Learned: Develop relationships with community organizations and groups. People are more likely to invest time in something they are already familiar with or belong to. For an organization undertaking an evaluation project, this may require connecting with other organizations or groups to which your desired audience belongs. Developing relationships with other community-based organizations and groups creates ambassadors who are willing to recruit individuals on your behalf to participate in the evaluation.

Make the process accessible. Engaging community members may require a different process than engaging other types of stakeholders. For example, community members who want to be involved may not be available during the work day. Providing opportunities for engagement through different mediums (web-based, telephone, in-person) and during non-traditional work hours will help to maximize opportunities for individuals to become involved in the evaluation process. Also taking into consideration potential barriers to engagement for community members and working to overcome them (e.g., scheduling meetings in locations that are accessible via public transportation) signifies that you care about their involvement.

Follow up with your audience after the evaluation is complete. Completing a proper evaluation takes much time and effort, and once it is done it’s easy to move on to the next project before sharing findings with your community member stakeholders. Those who have been engaged will be interested in knowing the outcome of the evaluation, and possibly in contributing to future work. Make sure to share evaluation findings with community members, and allow them to provide input on the interpretation of findings. This exchange can be critical in defining next steps for the project, as well as in ensuring that these community members will want to invest time in your future projects.

Rad Resources: Check out the Centers for Disease Control and Prevention’s (CDC’s) website for more helpful information on engaging stakeholders in your evaluation.


Saludos! My name is Grisel M. Robles-Schrader, President of Robles-Schrader Consulting and lead organizer of the Consortium for Latino Access to Research Opportunities (CLARO), based in the Chicagoland area.

CLARO is a collaboration of diverse community sectors involved in Latino-focused healthcare, evaluation, and research. CLARO is interested in promoting research literacy, evaluation and engagement aimed at improving health outcomes among Chicago’s diverse Latino communities.

Lesson Learned #1:  Latino-Focused Approaches Are Needed. The “Latino/Hispanic” community represents a diversity of experiences, histories, and cultures. Latinos make up one-third of Chicago’s population but are disproportionately affected by poor health outcomes, including high rates of asthma, diabetes, high cholesterol, obesity, and HIV/AIDS. These health disparities are associated with culture, behavior, acculturation, socioeconomic status, and the stigma attached to healthcare prevention, diagnosis, and treatment options. As a result, there is an ongoing need for culturally responsive health interventions developed by and for Latino communities.

#2: Multi-Sectoral Partnerships are the Heart of Community-Driven Efforts. CLARO is comprised of community-based organizations, healthcare facilities, community members, universities, community coalitions, independent consultants, foundations, and government commissions that work with Latino communities.

During the past 12 months we have convened five community meetings, established a steering committee, identified a group name, drafted a work plan, and doubled our membership. All of these achievements are due to the volunteer investment of our diverse membership base.

#3: Mutually Beneficial Partnerships Fuel Ongoing Achievement. Members participate in this initiative in a variety of ways, including providing student support, completing action steps, sharing information, and contributing resources (e.g., meeting space, refreshments).

Members take on activities based on their availability, skills, and passions. We always have a rotating group of people who are actively completing activities and others who are “sitting on the bench” until the right opportunity arises.

#4: Mutual Respect and Open Communication. Community-based evaluation and research are inherently vulnerable to miscommunication and misunderstandings because of differences in values and goals each community sector holds. Communication breakdowns between these sectors can further stigmatize and reinforce negative stereotypes about each sector.

We encourage members to share opposing opinions or “play devil’s advocate” as it allows the group to consider the full complexity of the decisions to be made. As a result, we are able to create more effective action plans.

Rad Resources:

  • Community Tool Box (http://ctb.ku.edu/en) offers free resources in English and Spanish to assist in building and sustaining successful coalitions aimed at improving community health.

Get Involved: Contact Grisel M. Robles-Schrader at griselconsult@gmail.com to get involved!

The American Evaluation Association is celebrating Chicagoland Evaluation Association (CEA) Affiliate Week. The contributions all this week to aea365 come from CEA members.


Hello! I am Caryn Mohr, a Research Scientist at Wilder Research in St. Paul. I’m one of more than 40 researchers in the office who conduct primary research. My own work focuses on education programs addressing opportunity and achievement gaps. Our office also manages Minnesota Compass, a nationally recognized community indicators project. Opportunities to collaborate with Compass staff have shown me the power of connecting primary research to community indicators data.

Those of us who conduct primary research gather new data first-hand. We use a variety of methods to evaluate the impact of individual programs and test research hypotheses. My own work ranges from case studies to long-term, quasi-experimental studies of education programs. We administer surveys, conduct interviews, convene focus groups, and employ a variety of methods that give us deep and direct knowledge of study participants’ experiences. We work closely with individual programs and organizations to help them understand their impact.

My colleagues at Minnesota Compass help us see the big picture. Compass provides a common framework for measuring and tracking state and local progress on a range of topics, including education, the economy, health, housing, and other important social issues. In each area, an advisory committee of stakeholders identified key indicators, which are monitored over time to understand the health and progress of our community.

Lessons learned:

  • See the big picture. Considering results of individual program evaluations in the context of community indicators provides a broader perspective and meaningful context to stakeholders. For example, results of a recent STEM (science, technology, engineering, and math) education program evaluation can be considered in the context of indicators of progress along the STEM cradle-to-career continuum. This context can help program staff consider their goals in relation to benchmarks and gaps pertinent to the continuum. Connections to indicators afford exploration of questions such as: How do program goals relate to research-based benchmarks and community needs? Are resources being targeted effectively?
  • Identify themes. Considering study results in the context of community indicators can help researchers identify meaningful themes across individual program evaluations. In education, our community indicators show stark achievement gaps. It is important to consider what our first-hand knowledge of individual programs tells us about addressing these gaps, and how this relates to research literature and community needs. Moving the needle on indicators requires understanding how programs work on the ground. Likewise, effectively targeting program resources necessitates an understanding of community needs.

Hot tip:

Explore connections to community indicators to provide meaningful context to individual program evaluations.




Hello. I’m Craig Helmstetter, a Senior Research Manager at Wilder Research, a division of the Amherst Wilder Foundation in Saint Paul, Minnesota (right across the Mississippi from Minneapolis).

Remember that quartet that was playing as the Titanic sank? How good were they?

If your answer was “Who cares?! The ship was sinking!”, then you probably already get my point. Evaluators are generally good at gauging the effectiveness of specific programs on the lives of the specific individuals those programs serve. But we aren’t always so good at tracking the broader community changes that also affect those same individuals.

Maybe we should be.

Hot tip: Community indicators projects can be a great complement to more granular program-by-program evaluations. Indicators projects can provide context to other evaluations, and can provide a platform to help engage and inform broader community improvement efforts.

Lessons learned: Involve your stakeholders. Early in my own indicators work I thought that I was smart enough to pick indicators all by myself. That is a good way to get ignored. Wilder’s community indicators project, Minnesota Compass, now enjoys some success due to the involvement of over 500 project advisors. First they helped us shape the project. Now they are the project’s champions.

Hot tip: Resist the temptation to include too much data in your indicators work. A few well-chosen indicators say far more than a laundry list of every available data source. (And picking those few well-chosen indicators is a great way to engage your advisors.)

Lessons learned: Dig deeper. It is often not enough to look at overall trends. Where possible, cross-tabulate by race, income, gender, and place.

In Minnesota, for example, we take pride in our nation-leading workforce participation rate. However, that same indicator has an embarrassing underbelly: we have the biggest Black–White employment gap in the nation. Both facts are good for evaluators to know. The first is particularly relevant if you are comparing the success rates of employment programs in Minnesota to those elsewhere in the U.S.; the second is relevant if you are evaluating program participants’ success rates by race.
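The "dig deeper" advice above can be sketched in a few lines of analysis code. Here is a hypothetical illustration using pandas with made-up microdata (the names and numbers are purely illustrative, not Minnesota's actual figures), showing how an overall rate can mask a gap that a simple cross-tabulation reveals:

```python
import pandas as pd

# Illustrative, made-up microdata -- NOT actual Minnesota figures.
df = pd.DataFrame({
    "race":     ["White", "White", "Black", "Black", "White", "Black", "White", "Black"],
    "employed": [True,    True,    True,    False,   True,    False,   True,    True],
})

# The overall employment rate looks healthy on its own...
overall = df["employed"].mean()

# ...but cross-tabulating by race reveals a stark gap.
# normalize="index" turns counts into within-group proportions.
by_race = pd.crosstab(df["race"], df["employed"], normalize="index")

print(f"Overall employment rate: {overall:.0%}")
print(by_race)
```

The same pattern extends to income, gender, and place: pass a list of columns as the crosstab index (e.g., `pd.crosstab([df["race"], df["gender"]], df["employed"], normalize="index")`) for multi-way breakdowns.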

And don’t even get me started on how important it is to get information like this in front of policy-makers, planners, and the broader public.

Rad Resource: The Community Indicators Consortium’s website includes links to indicators projects throughout the nation, as well as “how to” webinars and publications.

Twin Cities Hot tip: Looking for some nightlife while you are in town for the AEA meeting? First Avenue, made famous in Prince’s movie Purple Rain, is right down the street from the conference.

The American Evaluation Association is celebrating Minnesota Evaluation Association (MN EA) Affiliate Week with our colleagues in the MNEA AEA Affiliate. The contributions all this week to aea365 come from our MNEA members.


We are a collaborative team from three research and planning firms: Loraine Park (Harder+Company Community Research), Carolyn Verheyen (MIG), and Eric Wat (SSG). As part of a larger community needs assessment, we recently conducted asset mapping in multiple communities.

Our aim was to use that process to engage residents and parents in conversations about the resources, social supports, and strengths in their community. We were interested not only in what resources exist in the community, but also how well they are utilized. Below are some tips and lessons we learned through this experience.

Hot Tip: Extensive outreach is important to ensure that a diverse cross-section of residents participates in asset mapping activities. Use multiple channels to connect with your target audience. Ours included parents with young children and service providers within the identified communities. We used varied outreach methods, including distributing flyers at community locations where families with young children gather and inviting potential participants via e-mail and telephone. We also worked with community partners, such as local non-profit organizations and outreach workers, to assist with these efforts. Be prepared to address participants’ needs, including transportation, translation, and childcare.

Hot Tip: Printed copies of maps used during the asset mapping exercise should be large, in full color and should have some information to help people orient themselves. The maps we printed were approximately 3’x4’ and included major streets, highways, rivers, and some landmarks, such as parks, hospitals, and schools.

Rad Resource: We used Geographic Information Systems (GIS) software to create base maps, then used Illustrator to add features such as icons and legends and to make the maps user-friendly and graphically pleasing.

Hot Tip: During the asset mapping session, we broke into small groups of 8 to 10 people. Each group included a facilitator to guide the discussion and a note taker to record key comments and observations. Sometimes community members were recruited as facilitators. All facilitators and note takers received training in advance of the asset mapping session. In addition to placing stickers to identify community resources, participants were encouraged to write on the maps and talk about the resource they identified.

Hot Tip: We created stickers with icons for each of the questions we asked. For example, when we asked participants to identify where they access health care, they were given stickers with a health care icon.

Hot Tip: When conducting the asset mapping exercise, it is helpful to start with an easier question so people can orient themselves to the map. For example, we began by asking participants to identify where they live and work, again using pictorial stickers to represent home and work.

Rad Resource: For more information about asset mapping, please see our AEA presentation materials on the Harder+Company website or in the AEA eLibrary.



My name is Susan Wolfe and I am the owner of Susan Wolfe and Associates, LLC, an independent consulting firm that applies Community Psychology principles to strengthening organizations and communities. Our services include program evaluation. We have found an ecological perspective useful for thinking through evaluation designs and analyzing results. One framework I have found especially useful is James Kelly’s (1968) ecological conception of preventive interventions, expanded upon in many of his later works. Kelly’s conception consists of the four principles described below.

Principle 1:  Functions within a social unit are interdependent (The Ecosystem Principle). Measuring outcomes without considering the interdependencies of participants or organizations can lead to erroneous or invalid conclusions. Interdependencies may contribute to program success or undermine it.

Hot Tip:  Include measurement of interdependencies in your evaluation design, particularly those that might affect the outcomes. Examples might be social networks or other services or education programs used by participants.

Principle 2:  The cycling of resources. Identifying efficiencies in resource use provides useful information for programs planning for sustainability.

Hot Tip:  Include measures of resource use and efficiency in your evaluation design. Assess the extent to which each component contributed to success, whether there are unnecessary positions or materials, and whether work could be reorganized more efficiently or whether resources may be available through collaborations with other entities.

Principle 3:  The environment affects styles of adaptation. Understanding environments and programs includes understanding the adaptive skills individuals and organizations use to survive and thrive.

Hot Tip:  Include an assessment of the program’s environment and how the organization or the program participants adapt to it. Determine the extent to which the program has adapted to the needs of the individuals or groups it serves and whether further adaptation (e.g., hours, content) may further program success.

Principle 4:  The succession principle (the evolution of natural communities). Understanding how change has occurred in a community or organization over time makes it possible to anticipate the rate and direction of future change.

Hot Tip:  Include an examination of changes in the program or community it serves over time, including a historical view. This can help the organization to understand the direction it may need to take, or reasons why certain tactics may not work well within a specific community.

Rad Resource: Kelly, J.G. (2006). Becoming ecological:  An expedition into community psychology. New York:  Oxford University Press.

Reference:

Kelly, J.G. (1968). Toward an ecological conception of preventive interventions. In J.W. Carter, Jr. (Ed.). Research contributions from psychology to community mental health. New York: Behavioral Publications.

The American Evaluation Association is celebrating Systems in Evaluation Week with our colleagues in the Systems in Evaluation AEA Topical Interest Group. The contributions all this week to aea365 come from our Systems TIG members and you may wish to consider subscribing to our weekly headlines and resources list where we’ll be highlighting Systems resources. You can also learn more from the Systems TIG via their many sessions at Evaluation 2010 this November in San Antonio.


Hello, I am Kimberly Kay Lopez. I have a community-based evaluation and research practice in Houston, Texas. My work concentrates on participatory evaluation methods for evaluating youth programs and services.

Hot Tip: Using Photovoice in an Empowerment Evaluation: When working with youth, I have used the Empowerment Evaluation model many times, and I have found that using Photovoice and journal writing within that model yields rich and varied evaluation data. The goals of the Photovoice process – allowing participants to document issues, engage in dialogue, and influence policy – enhance the evaluation itself. I first integrated Photovoice with Empowerment Evaluation when evaluating the long-term impact of a multi-year teen pregnancy prevention program among urban Latino youth, using Photovoice as a tool to “take stock” within the model. Youth were given cameras to capture the impact of the program, along with journals and guided writing assignments to express what the program had meant to them. We also held traditional Empowerment Evaluation discussion groups. The youth and I analyzed the visual, journal, and discussion group data together to develop the final evaluation report.

Rad Resource: PhotoVoice.org: PhotoVoice is an international organization that works with vulnerable populations. They offer several publications including a manual for using Photovoice. The methodology series gives further instruction on working with specific populations such as refugees. http://www.photovoice.org/shop/info/methodology-series

Rad Resource: PhotoVoice Manual: A comprehensive Photovoice Manual developed by Prairie Women’s Health Centre of Excellence. www.pwhce.ca/photovoice/pdf/Photovoice_Manual.pdf

Hot Tip: Using Journal Writing in Evaluations: I have found that when working with middle school students, some may be reluctant to participate in a discussion group. Offering writing opportunities gives students who are uncomfortable in group discussion a way to contribute to the evaluation process on their own terms.

Rad Resource: Guide on Engaging Youth: The National Clearinghouse for Families and Youth has a great guide on engaging youth in writing. www.ncfy.com/publications/pdf/lbd_write.pdf

There are all kinds of ways to get creative with data collection – digital storytelling, video cameras, blogs, tweets, text messages! Get creative! Use your imagination! Have fun!

Want to learn more from Kimberly? She’ll be on the program this November at Evaluation 2010, The American Evaluation Association’s Annual Conference in San Antonio, Texas.

