AEA365 | A Tip-a-Day by and for Evaluators


Hello, my name is Tiffeny Jimenez and I am an Associate Professor in the Community Psychology Doctoral program at National Louis University, Chicago. As a practitioner and educator, I continuously reflect on how well I apply the principles of the field, and a key part of this reflection involves working with others who seek to speak truth to power through a process of mutual liberation.

I hold a number of roles within the community, and to be true to the goals of promoting empowering processes and outcomes I must adhere to a level of consciousness committed to realizing freedom. This inevitably requires understanding my place, power, role, and biases in every situation related to my purpose within settings. I believe that whether we are working to change the culture of an organization to be more empowering, facilitating dialogue with community groups about their data to influence decision-making, or teaching a community psychology course in which we hope to inspire learning and action, the principles of creating empowering settings are the same.

Lessons Learned: I recently re-read Paulo Freire’s Pedagogy of the Oppressed and find it helpful when reflecting on roles where I seek to facilitate empowerment. As community psychologists, the five main conditions we must cultivate are love, humility, faith, hope, and critical thought. First, we must have authentic love for the people we work with, and for the world itself. Second, humility allows us to relate rightly with others, in that we recognize our own limitations and the potential of others. Third, we must possess faith in the potential of human beings, along with a belief that the world can be addressed in a formative way. Fourth, we need a disposition of hope that it is possible to change the world in a positive way. Last, we must enact critical thought, recognizing that the world is dynamic and that we are inextricably related to it.

Once the five primary conditions are present, we need to facilitate and foster first-hand authentic, humanizing dialogue without fear of being vulnerable in this public act of intimacy. Ultimately, it is through dialogue that everyone gains the freedom to name, or give voice to, a more comprehensive understanding of reality. Dialogue allows us to move beyond any perceived limits that constrain our actions and into a level of consciousness where we believe we can mutually reconstruct the world.

Hot Tip: Toward the goal of world reconstruction, our facilitation must acknowledge people’s realities, tear down fixed perceptions of the world, and assist in reframing limits as opportunities for growth. I would argue that any slight inauthenticity in the five conditions will erode the potential for trust to be built and reduce our effectiveness as educators and facilitators.

Rad Resource: Freire, P. (2000). Pedagogy of the oppressed (30th anniversary ed.). New York: Continuum.

 

The American Evaluation Association is celebrating CP TIG Week with our colleagues in the Community Psychology Topical Interest Group. The contributions all week come from CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org.


Greetings, AEA365’ers. I’m Susan Staggs, an independent evaluator and community psychologist, here to talk about infusing community psychology values, including respect for diversity, social justice, and prevention rather than treatment, into the evaluation synthesis process.

Cool Trick: Evaluation Synthesis, as defined by Jane Davidson, is “combining evaluative ratings on multiple dimensions or components to come to overall conclusions.” It’s a simple thing to infuse these values into the synthesis process, but it’s not often done, perhaps because evaluators are overly focused on criteria such as effectiveness, efficiency, and impact. Those are wonderful criteria, but if we’re serious about valuing diversity and creating positive social change (and our stakeholders agree), we need to explicitly evaluate performance on those criteria of merit.

Hot Tip: One way to synthesize data is to create a set of rubrics that lead to a final judgment. Below are examples of partial rubrics:

  • Rubric 1 for determining merit on each criterion:

Rubric 1: Merit Determination for Each Criterion

| Criterion | Importance Weighting | Excellent | Good | Fair | Poor |
| --- | --- | --- | --- | --- | --- |
| Emphasis on prevention over treatment | Medium | 80% of funds devoted to prevention programming | 60% of funds devoted to prevention programming | 40% of funds devoted to prevention programming | Less than 40% of funds devoted to prevention programming |
| Explicit attention to social justice concerns | High | Free babysitting provided during 90% of education sessions | Free babysitting provided during 85% of education sessions | Free babysitting provided during 80% of education sessions | Free babysitting provided during 75% of education sessions |
| Effectiveness | High | | | | |
| Efficiency | Low | | | | |

 

  • and Rubric 2 for synthesizing Rubric 1 results into a final judgment. Rubric 2 includes a soft hurdle: a requirement that must be met to obtain an exemplary final rating.

 

Rubric 2: Final Rating Determination

| Final Rating | Performance Standard |
| --- | --- |
| Exemplary performance | Merit determination of excellent required on all high-importance criteria and good on criteria of medium and low importance |
| Soft hurdle | Merit determination of excellent on the prevention and social justice criteria |
| Proficient performance | Good merit on criteria of high or medium importance; no poor merit |
| Fair performance | Good merit on high-importance criteria; all other merit determinations of good or fair; no poor merit |
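The synthesis logic in Rubric 2 can be sketched in code. This is a hypothetical illustration only, not part of Davidson's methodology: the function name, the data structure, and the "below fair" fallback are my own assumptions, and the thresholds simply restate the rubric above.

```python
# Ordered merit levels, lowest to highest.
MERIT_ORDER = {"poor": 0, "fair": 1, "good": 2, "excellent": 3}

def final_rating(criteria):
    """Apply Rubric 2 to a list of dicts with 'name', 'importance', 'merit' keys."""
    def at_least(c, level):
        return MERIT_ORDER[c["merit"]] >= MERIT_ORDER[level]

    high = [c for c in criteria if c["importance"] == "high"]
    med_low = [c for c in criteria if c["importance"] in ("medium", "low")]
    # Soft hurdle: excellent required on the prevention and social justice criteria.
    hurdle = [c for c in criteria if c["name"] in ("prevention", "social justice")]

    # Exemplary: excellent on all high-importance criteria, at least good on the
    # rest, AND the soft hurdle is cleared.
    if (all(c["merit"] == "excellent" for c in high)
            and all(at_least(c, "good") for c in med_low)
            and all(c["merit"] == "excellent" for c in hurdle)):
        return "exemplary"
    # Proficient: at least good on high- and medium-importance criteria, no poor merit.
    if (all(at_least(c, "good") for c in criteria
            if c["importance"] in ("high", "medium"))
            and not any(c["merit"] == "poor" for c in criteria)):
        return "proficient"
    # Fair: at least good on high-importance criteria; everything else fair or better.
    if (all(at_least(c, "good") for c in high)
            and all(at_least(c, "fair") for c in criteria)):
        return "fair"
    return "below fair"

criteria = [
    {"name": "prevention", "importance": "medium", "merit": "excellent"},
    {"name": "social justice", "importance": "high", "merit": "excellent"},
    {"name": "effectiveness", "importance": "high", "merit": "excellent"},
    {"name": "efficiency", "importance": "low", "merit": "good"},
]
print(final_rating(criteria))  # exemplary
```

Note how the soft hurdle bites: dropping prevention from excellent to good blocks the exemplary rating even though every importance-weighted condition is still met, and the program falls to proficient instead.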

 

Rad Resources (on evaluation synthesis methodology): the synthesis methodology chapter in Jane Davidson’s book Evaluation Methodology Basics, and her UNICEF methodological brief on evaluative reasoning.

 

*Special thanks to Amy Gullickson, University of Melbourne, for introducing me to the art and science of synthesis.


My name is Sharon M. Wasco and, as a community psychologist, I frequently adopt participatory approaches to evaluation. I often say, “Commitment to collaboration guides my work,” and this post is organized to answer three questions about collaboration.

Hot Tip: What does collaboration mean? To me, collaboration means two or more partners working together in ways that increase capacity to achieve a stated goal.

Below is my visualization of Chris Huxham’s (1996) typology of ways that organizations (and/or individuals) can work together.  Collaboration is the highest level of functioning, with capacity building as the collaborative advantage.

Cool Trick: How can we observe — or recognize — collaboration in our work? Synergy can be used as an indicator of collaboration.

The Center for the Advancement of Collaborative Strategies in Health developed a theoretical framework that specified SYNERGY as a measurable output of a collaborative relationship, and as a precondition of partnership effectiveness.

Rad Resource: Can collaboration be measured? The Partnership Self-Assessment Tool (PSAT) is a survey and a process to measure collaboration in the field. Nine items make up the measure of synergy, as shown below.

Along with partners at the Florida Coalition Against Domestic Violence and Strategic Prevention Solutions, I’ve used this tool in the evaluation of a five-year primary prevention initiative focused on strategies at the “outer layers” of the social-ecological model (SEM).

FCADV’s community and social change efforts included Coalition Building, which has been defined as “a strategy designed to increase two or more organizations’ abilities to work collaboratively on statewide or community prevention programs, policies, or resources” (read more about IPV prevention strategies here).

FCADV-Certified DV Centers convened Community Action Teams (CATs) to plan and implement prevention activities in over 40 diverse communities across Florida. The PSAT allowed us to measure increased collaboration and to document success of the CATs!

Hot Tips: Not every working group is meant to collaborate.  A partnership has been defined as a group of people and organizations that 1) continually works together to develop and modify strategies to achieve its goals, 2) has begun to take action to implement its plans, 3) has at least five active members, and 4) has existed for at least six months.

To qualify as a group-level indicator, PSAT methodology requires at least 65% of the group to complete the survey within thirty days.
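As a sketch, that qualification rule might be checked like this. The helper below is hypothetical (it is not part of the PSAT materials); it just encodes the two conditions stated above: at least 65% of the group responding, within a thirty-day window.

```python
from datetime import date

def qualifies_as_group_indicator(n_members, completions, window_open):
    """Check the PSAT group-level rule as described: >= 65% of the group
    completing the survey within 30 days of the window opening.

    completions: list of dates on which members submitted the survey.
    """
    # Count only submissions that fall inside the thirty-day window.
    in_window = [d for d in completions if 0 <= (d - window_open).days <= 30]
    response_rate = len(in_window) / n_members
    return response_rate >= 0.65

# Hypothetical example: 7 of 10 members respond within the window (70% >= 65%).
opened = date(2018, 3, 1)
submitted = [date(2018, 3, d) for d in (2, 3, 5, 8, 12, 20, 28)]
print(qualifies_as_group_indicator(10, submitted, opened))  # True
```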

 

Rad Resources: Read more about different working relationships in the Community Toolbox.

Read more about DELTA FOCUS (Domestic Violence Prevention Enhancements and Leadership Through Alliances, Focusing on Outcomes for Communities United with States), the Centers for Disease Control and Prevention’s (CDC’s) initiative to prevent intimate partner violence.

Find the Partnership Self-Assessment Tool on the registry of the National Collaborating Centre for Methods and Tools.

 


I’m Brian Hoessler, Founder of Strong Roots Consulting, a firm focusing on program evaluation and strategic planning for non-profits in Saskatoon, Canada. Over the past year I’ve been learning about principles-focused evaluation (as articulated by Michael Quinn Patton), which asks how principles contribute to the evaluation process. Recent discussions around competencies and our role in society have also shed light on the principles (implicit and explicit) guiding our evaluation work.

Lesson Learned: Evaluation isn’t the only field using principles. My “home field” of community psychology has identified five foundational principles in defining core practice competencies:

  1. Ecological Perspectives – The ability to articulate and apply multiple ecological perspectives and levels of analysis in community practice;
  2. Empowerment – The ability to articulate and apply a collective empowerment perspective to support communities that have been marginalized in their efforts to gain access to resources and to participate in community decision-making;
  3. Sociocultural and Cross-Cultural Competence – The ability to value, integrate, and bridge multiple worldviews, cultures, and identities;
  4. Community Inclusion and Partnership – The ability to promote genuine representation and respect for all community members, and act to legitimize divergent perspectives on community and social issues; and,
  5. Ethical, Reflective Practice – In a process of continual ethical improvement, the ability to identify ethical issues in one’s own practice and act to address them responsibly (e.g., articulate how one’s values, assumptions, and life experiences influence one’s work; articulate strengths and limitations in one’s own perspective; and develop and maintain professional networks for ethical consultation and support).

Although I have not explicitly referred to these five foundational principles in my evaluation work, I realize the strong alignment between those ideals and my practice.

 

Hot Tip: In evaluating programs and initiatives, I endeavour to look beyond individuals and families to ask questions about how the program (and the evaluation itself!) contributes to positive and negative outcomes for groups, organizations, and communities. I also try to take a critical lens to my practice, asking questions about who’s truly benefiting from the evaluation and planning processes and how I can better work from a position of community empowerment and inclusion. Sociocultural and cross-cultural competency is a particular focus for me right now, especially in light of the Truth and Reconciliation Commission’s Calls to Action to address the harmful and ongoing effects of colonization for Indigenous people in Canada.

The question, though, remains: do these principles contribute to achieving our desired outcomes?

Rad Resources:

 


I’m Jeff Sheldon, 2018 Chair of the Community Psychology TIG.  Today I want to discuss why I view evaluation as a social intervention in keeping with the principles of community psychology and with our 2018 conference theme, Speaking Truth to Power.

It is my contention that evaluation’s best present use is as a social intervention.  You might disagree, but my values orientation, as seen through the lens of community psychology, has always been toward using evaluation for social justice, emancipation, and the elimination of oppression and marginalization.  This is why in my own practice I naturally gravitate toward evaluation approaches in the relativist and ideological traditions.

I characterize evaluations with an ideological orientation as social intervention because they are foundational to development of individual and organizational empowerment and self-determination.  Social interventions ensure individuals and groups have the power to influence the direction of their lives and their social institutions.

Rad Resource: Bennett, E. M. (1987). Social intervention: Theory and practice. In E. M. Bennett (Ed.), Social intervention: Theory and practice (pp. 13-28). Queenston, NY: Edwin Mellen Press.

In Bennett’s (1987) construction of social intervention, the target of an intervention’s effort is social structure (e.g., law, politics, and economy) rather than individual (e.g., judge, legislator, and CEO) behavior within that structure.

Rad Resource: Crossman, A. (2017). Social structure defined: An overview of the concept. Retrieved from https://www.thoughtco.com/social-structure-defined-3026594

As defined by sociologist Ashley Crossman (2017), social structure is the organized set of social institutions and patterns of institutionalized relationships that together compose society. It is both a product of social interaction and a direct determinant of it; it is not immediately visible to the untrained observer, but it is always present and affects all dimensions of human experience within society.

Bennett concludes by noting that the success of a social intervention is ultimately judged by the degree to which there has been movement to reduce oppressive conditions, and whether the strategies employed are creating social processes and structures which provide the marginalized person with greater access to goods, services, and movement toward a psychological sense of well-being.  In other words, evaluation as social intervention is an emancipatory process which identifies and eradicates factors contributing to disempowerment (i.e., lack of knowledge, lack of influence, lack of skills, lack of self-efficacy, and fear) and increases individual self-determination through a sense of connectedness to a larger social collective.   

Hot Tip: Given our zeitgeist, the dual rationales of empowerment and self-determination compel the use of evaluation as a social intervention.  Yes, evaluation used in this way has an overt political agenda of changing power differentials, but if you are concerned with enhancing people’s decision-making power and helping them restore control over their lives, then consider collaborative evaluation, participatory evaluation, participatory action research, community-based participatory research, or empowerment evaluation as your preferred approach within the relativist and ideological evaluation traditions.  Help them speak truth to power.

 


I’m Ann Price, an evaluator and community psychologist based in the Atlanta Metro Area and President of Community Evaluation Solutions. As we get ready for Eval 2018 I want to share my reflections on our conference theme, speaking truth to power, one of four community psychology principles.

Rad Resource: The community psychology principles, developed by the Society for Community Research and Action, guide our work as community psychologists and should resonate with evaluators. One says:

Community research and action requires explicit attention to and respect for diversity among peoples and settings

I am particularly struck by the words “requires explicit attention to.” Recently, I was in New York City and visited the Museum of Modern Art, spending considerable time in the Adrian Piper exhibit. I was not familiar with her work, but one series of photographs stopped me in my tracks: it was emblazoned with these words in red ink: “Pretend not to know what you know.” Some photos depicted violence against African Americans; another showed a white woman and her healthy, smiling little boy between two photos of women with their starving immigrant children. In another section of the exhibit, Ms. Piper posed a series of multiple-choice questions, one of which was: “Do you have at least one black friend? If yes, how often do you have contact with him or her? Daily, weekly, monthly, yearly, none of the above.”

Our conference theme has already generated a lot of conversation and some strong emotions. I think speaking truth to power has to start with ourselves. As a practitioner who works in communities, self-reflection is a necessary exercise. One of my clients is going through a year-long process of race, equity, and inclusion training. I suspect this work has started an awakening in many of the participants who, like me, thought they were more attentive than they actually are.

So today I am inviting you to ask yourself: What do I need to attend to? What am I pretending not to know that I really do know? The work evaluators do has to start with some serious self-reflection and, if necessary, making amends. Then, listening more than talking is a great next step. Enriching your practice through education, reading, self-reflection, and experience is also needed. In addition to Piper’s exhibit, here are some things I have been attending to lately:

Rad Resources: Some documentaries to generate discussions:


Hi, we are Olya Glantsman, a visiting professor of psychology at DePaul University, and Judah Viola, dean of the College of Professional Studies and Advancement at National Louis University in Chicago. Last month we celebrated the release of our book, Diverse Careers in Community Psychology.

The idea for Diverse Careers in Community Psychology was born as a response to a question often posed by our students: “What can I do with a degree in community psychology?” Each time we got this question, we became more convinced that a text like this was long overdue for students (undergraduate or graduate), for professionals looking to expand, shift, or change their careers, and for anyone mentoring or advising community-minded students or employees.

Below are some hot tips we culled from the results of an extensive career survey of more than 400 participants and from 23 chapters written by over 30 different community psychologists (CPs) with various backgrounds.

Hot Tips: When looking for a job

  • Keep in mind that those with community-oriented degrees do not have a problem finding a job
  • Many professionals successfully market their job skills and competencies rather than their degree and really find a “niche” – using unique interests and talents that the organization or team needs
  • A large number of survey participants cited using Practice Competencies to help them secure a job, and reported using the same competencies throughout their work
    • All respondents reported using between five and fourteen competencies
  • When searching for a practice job, start preparing while still in training, cast a broad net, and search multiple disciplines
  • Obtaining mentorship and networking are two of the most important activities one can participate in.
    • More than half (59%) of survey respondents reported that they found out about their current job through networking

Rad Resources: Job Search

  • Participants use multiple search techniques when looking for employment (e.g., job postings, networking, listservs, etc.)
  • Sites for finding practice-related jobs:
    • The AEA, APA, and American Public Health Association (APHA) websites; Indeed.com; Idealist.org; npo.net; simplyhired.com; and careerbuilder.com

Rad Resources: Job Training

Whether you are beginning your career or trying to expand or shift into a new arena, there are lots of options and opportunities. Whatever your journey, we hope you will find more helpful tips and hints in our book.

The American Evaluation Association is celebrating Community Psychology TIG Week with our colleagues in the CP AEA Topical Interest Group. The contributions all this week to aea365 come from our CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Our names are Susan M. Wolfe and Kyrah K. Brown, and we are consultants at CNM Connect, where we provide evaluation and capacity-building services to nonprofit organizations.  Our work also includes evaluating community collaborations and coalitions. Effectively addressing most health, education, and other social issues at a systems level requires that communities address inequity and injustice.

RAD RESOURCE: In January 2017, an article titled “Collaborating for Equity and Justice: Moving Beyond Collective Impact” was published in the Nonprofit Quarterly.

The authors presented the following six principles to promote equity and justice and each has implications for how coalitions and community collaboratives are evaluated.

Principle 1: Explicitly address issues of social and economic injustice and structural racism.

  • HOT TIP: Nearly all human problems, especially where there are disparities in outcomes, can be traced to social and economic injustice and/or structural racism. As an evaluator, examine whether these issues are being discussed and directly addressed.

Principle 2: Employ a community development approach in which residents have equal power in determining the coalition’s or collaborative’s agenda and resource allocation.

  • HOT TIP: Ask who has the actual power to make decisions and set agendas for the collaborative.

Principle 3: Employ community organizing as an intentional strategy and as part of the process. Work to build resident leadership and power.

  • HOT TIP: Closely examine the membership and leadership to determine the extent to which residents, or those who are directly affected by the issue at hand, are members and leaders.

Principle 4: Focus on policy, systems, and structural change.

  • HOT TIP: Review the agendas and activities to determine whether they are promoting more programs, or facilitating change in policies, systems, and structures.

Principle 5: Build on the extensive community-engaged scholarship and research over the last four decades that show what works, that acknowledge the complexities, and that evaluate appropriately.

Principle 6: Construct core functions for the collaborative based on equity and justice that provide basic facilitating structures and build member ownership and leadership.

RAD RESOURCE: The Collaborating for Equity and Justice Toolkit provided by The Community Tool Box can be accessed at: https://www.myctb.org/wst/CEJ/Pages/home.aspx

HOT TIP: Many nonprofits and health agencies are engaged in collaborative work and are often looking for effective frameworks to model. Evaluators can use the Collaborating for Equity and Justice Toolkit to facilitate discussions, coalition development, and planning efforts. When you introduce nonprofits and collaboratives to the framework, it may be helpful to provide brief presentations or facilitate interactive planning sessions, and to prepare guided questions that help them think about applying the six principles in their work. As mentioned, this framework can be useful for refining or developing intentional coalition goals and for evaluating a coalition’s efficiency and effectiveness in meeting those goals.


Hi! I’m Tara Gregory, Director of the Center for Applied Research and Evaluation (CARE) at Wichita State University. Like most evaluators, the staff of CARE are frequently tasked with figuring out what difference programs make for those they serve. So we tend to be really focused on outcomes and to see outputs as the relatively easy part of evaluating programs. However, a recent experience reminded me not to overlook the importance of outputs when designing, and especially when communicating about, evaluations.

In this instance, my team and I had designed what we thought was a really great evaluation that covered all the bases in a particularly artful manner – and I’m only being partially facetious. We thought we’d done a great job. But the response from program staff was, “I just don’t think you’re measuring anything.” It finally occurred to us that our focus on outcomes in describing the evaluation had left out a piece of the picture that was particularly relevant for this client: the outputs, or accountability measures, that indicated the programs were actually doing something. It wasn’t that we didn’t identify or plan to collect outputs; we just didn’t highlight how they fit into the overall evaluation.

Lesson Learned: While the toughest part of an evaluation is often figuring out how to measure outcomes, clients still need to know that their efforts are worth something in terms of the things that are easy to count (e.g., number of people served, number of referrals, number of resources distributed). Although delivering a service doesn’t necessarily mean it was effective, it’s still important to document and communicate the products of these efforts. Funders typically require outputs for accountability, and programs place value on tangible evidence of their work.

Cool Trick: In returning to the drawing board for a better way to communicate our evaluation plan, we created a graphic that focuses on the path to achieving outcomes, with the outputs offset to show that they’re important but not the end result of the program. In an actual logic model or evaluation plan, we’d name the activities, outputs, and outcomes more specifically based on the program, but this graphic helps keep the elements in perspective. [Figure: example graphic of outputs and outcomes]


Hi folks! I’m Jill Scheibler, a community psychologist and Senior Research Analyst at Carson Research Consulting, a women-led firm whose mission is to help clients thrive by using data to measure impact, communicate, and fundraise. We’re passionate about storytelling with data to make a difference.

At CRC I’m the “word nerd”, implementing our qualitative projects. Like many evaluators, I’ve had to translate academically-honed skills to the often faster-paced world of evaluation. A recent project for a county health department’s substance abuse initiative provides an example of how I tailor qualitative methods to meet clients’ needs.

Hot Tips

Allot ample time for clarifying goals. As with all good research, the choice of methods flows from the question at hand. In this case, our client wanted to understand the impact of substance abuse on their county and to identify new resources to tap. Like many clients, they lacked research savvy and thought they required services exceeding their budget and available time. We gradually learned they had access to lots of quantitative data, along with support from the state to help interpret it. What they were missing was community stakeholder feedback. So we provided a qualitative needs assessment component.

Build in more meetings than you think you’ll need, and bring checklists. Be prepared to leave meetings thinking you have all the answers you need, only to learn afterwards that you’ve been (well-meaningly) misinformed! (Quantitative sidebar: after building a data dashboard for another client in Excel 2013, based on their word, we learned they had Excel 2007. A costly reminder to always ask more questions!)

Choose tool(s) carefully to maximize usefulness. I generally opt for interviews where probes can offset “one-shot” data collection situations. Here, I instead designed a qualitative survey, using mostly open-ended questions, for efficient gathering of perspectives. The client collected surveys themselves, disseminating hard copies and a SurveyMonkey.com link, and accessed a targeted sample from within a community coalition.

Familiar guidelines for interview and survey design apply to qualitative surveys, but I advise keeping questions very focused and surveys as short as possible to mitigate their higher skip rates.

Cool Trick

You may think your reporting options are limited compared to quantitative results. Not so! Instead of writing text-heavy reports that eat up valuable time and that folks are disinclined to read (#TLDR), consider telling “data stories” using bullet points and visualizations. This client received a two-pager for internal, local stakeholder, and state use. I’ll also provide an in-depth explanation of results and action steps in a webinar.

Rad Resources

Jansen’s “The Logic of Qualitative Survey Research and its Position in the Field of Social Research Methods.”

Great tips on qualitative surveys from Nielsen Norman.

Awesome tips from CRC colleagues for larger community surveys.

Achievable qual visualization ideas from Ann Emery.

Some tools for qual analysis and visualization from Tech for Change.

I genuinely enjoy working creatively with clients because it makes evident how well suited qualitative methods are for linking research to action. I’d love to hear how others do this work; please get in touch!

