AEA365 | A Tip-a-Day by and for Evaluators


Hey there!  Liz Zadnik here, Outreach Coordinator for the tip-a-day blog and sometimes Saturday poster.  As you may know from some of my previous posts, my primary job is not as an evaluator.  I like to consider myself an Evaluation Enthusiast – bringing my love of evaluation to others in the anti-sexual violence movement.  

Now, for some people (not us, of course), “evaluation” and “fun” are not words often used in the same sentence. I’ve made it my mission over the past few years to infuse all my trainings with fun activities that explore the variety of ways evaluation can enrich practice and capture meaningful work.

Rad Resource: I can’t say enough about Hallie Preskill and Darlene Russ-Eft’s book Building Evaluation Capacity: 72 Activities for Teaching and Training.  I’m also a big fan of “thinking with things” and encouraging adult learners to play as a way of getting to creative problem-solving.  These approaches have also helped training participants associate a fun atmosphere with learning about evaluation, data, and research.

Part of being an Evaluation Enthusiast also means connecting with researchers and professional evaluators to fill in gaps and stay informed.  I know there is much to be learned and not enough time in the day to learn it all!  

Rad Resource:  The National Sexual Violence Resource Center recently hosted an xCHANGE forum with two brilliant evaluators.  Discussion threads included topics on alternatives to the pre/post-test approach, evaluating prevention with young children, and assessing community-level prevention efforts.  The forum was an opportunity for practitioners to connect with one another and hear from practice-minded evaluators.  Bridging that gap is incredibly important as we strive to inform practice with research and infuse practice into research.

The xCHANGE forum got me thinking about other ways evaluators and organizations can engage practitioners in a more widespread way using social media. What about hosting a Twitter talk or town hall to provide introductory technical assistance? Using themes and hashtags like #writinggoodsurveys or #evalbasics could help share best practices while also connecting practitioners and evaluators.

So, dear AEA365 readers, how do you bring fun and enthusiasm to your work?   

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi there, Liz Zadnik here, bringing you another Saturday post focused on practitioner experiences and approaches. Today I’m going to focus on my personal journey to stay up-to-date and relevant in all things evaluation.

I was not formally trained as an evaluator – everything I know has come through hands-on, on-the-job experience and mentorship (I’m very lucky to have been able to work with a few brilliant evaluators and researchers!). Self-study, reading, and ongoing training have become intentional parts of my personal and professional schedule.

Rad Resource: Coursera is an excellent resource for online learning. You can even earn certificates in specializations after completing a sequence of courses. They offer a number of courses on data analysis and data science!

Rad Resource: iversity coordinates and archives some really interesting and innovative massive open online courses (MOOCs). The “Future of Storytelling” course gave me a number of ideas and skills for crafting accessible and engaging trainings and resources, as well as some insights for capturing stories for program evaluation. Recent and upcoming courses focus on idea-generation methods and gamification theory.

Lesson Learned: Follow your gut! At first I thought I needed to select courses, books, and resources that were explicitly “evaluation-y,” but I found that the courses that made me say “Oooh! That looks interesting!” were the ones that helped me think creatively and find ways to enhance my evaluation and program development skills.

Rad Resource: MIT OpenCourseWare is much more structured and academic, since these are actual courses taught at MIT. They require – for me – a bit more organization and scheduling.

Rad Resource: edX is another great place to take online courses and MOOCs. Right now they have two courses on my “to-take” list: Evaluating Social Programs and The Science of Everyday Thinking.

Are there other online course providers or resources you rely on to stay current? How do you stay up-to-date and innovative as you balance other obligations and projects?

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi there, Liz Zadnik here, bringing you another Saturday post focused on practitioner experiences and approaches. Today I’m going to focus on a recent (and recurring) experience of getting others excited about evaluation and capturing information.

It is a source of pride that many of my colleagues have said, “Liz, you bring such an enthusiasm for evaluation – it really helps get people engaged and interested.” Now, I’m not the most knowledgeable or experienced person, but I do know that evaluation and assessment hold an important place in the present and future of the anti-sexual violence movement.

Hot Tip: During a recent webinar I was facilitating, I talked about sharing data and building trust with community members. Searching for a way to explain the idea, I landed on the analogy of constellations: we do not “own” the stars, but we have drawn connections among them to tell stories about the past, present, and future.

Lesson Learned: Look up! Sometimes literally, sometimes figuratively. Getting some perspective and being creative can go a long way in engaging people in conversations about evaluation. In my experience, folks often see only numbers, equations, and statistics (which is fair, and partly true), and that keeps them from seeing how evaluation can help tell a story. Their story.

Rad Resource: The Texas Association Against Sexual Assault released a new toolkit presenting activity-based assessment as a strategy for collecting evaluation data while also implementing a prevention and education program. I’ve found this to be a great way to open people’s minds to how evaluation can work for them.

I hope this post has helped illuminate the inner workings of a practitioner passionate about evaluation. My time with aea365 has been incredible so far – I have learned so much and look forward to hearing your thoughts and comments!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi, I’m Jen Przewoznik, Director of Prevention and Evaluation at the North Carolina Coalition Against Sexual Assault. I have been working with and within lesbian, gay, bisexual, transgender, queer, and intersex (LGBTQI+) communities for 15 years. I’d like to share some lessons I’ve learned about conducting research with and within LGBTQI+ communities, using a current study I am co-investigating as an example.

Research with and within LGBTQI+ communities has happened for decades. More and more of this research is conducted by people who are well trained in data collection and analysis regarding people who claim non-normative sexual and gender identities. Unfortunately, a lot of this research still misses the mark. Some agenda-driven researchers “miss the mark” because they are actively trying to defame LGBTQI+ people. Most studies, however, seem to miss the mark due to fundamental design flaws. There are still measurement tools being created (maybe right now?!?! Let’s hope not right now) that conflate sexual orientation and gender identity.

Hot Tip: Friends don’t let friends conflate sexual orientation and gender identity. I know you wouldn’t do this, but if you see a researcher doing this, please tell them to stop.

Hot Tip: Engage BOTH LGBTQI+ people and researchers in the process of creating instruments to better understand LGBTQI+ lives and experiences. Juliette Grimmett, NC Sexual Violence Prevention Team member, and I are collaborating with Drs. Paige Hall Smith and Leanne Royster of UNC Greensboro on a study about LGBTQI+ people’s experiences with sexual violence on NC college campuses. The results will help campuses create inclusive and affirming sexual violence prevention programming. We began by holding a daylong semi-structured qualitative discussion group to engage folks in conversations about sexual violence and LGBTQI+ communities. Participants were chosen for their experience in sexual violence or LGBTQI+ campus work, with an emphasis on inviting people we knew to be allies and/or themselves LGBTQI+-identified.

Lessons Learned: The output from the meeting heavily informed the survey, which includes questions about sexual violence without using normative terms for body parts and allows participants to choose “all that apply” for identity questions. Our colleagues reminded us that this work can’t be as neat and tidy as it sometimes seems researchers and statisticians would like.
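
For readers who end up analyzing responses like these, that design choice has a practical consequence: a “choose all that apply” item cannot be summarized as one category per person. Below is a minimal sketch in Python with pandas – the column and category names are hypothetical, not from our actual instrument – of one common way to expand multi-select identity responses into per-category indicator columns:

    import pandas as pd

    # Hypothetical multi-select responses: each participant may check
    # several identities, separated here by semicolons.
    responses = pd.DataFrame({
        "participant_id": [1, 2, 3],
        "gender_identity": ["woman;genderqueer", "man", "nonbinary;questioning"],
    })

    # Expand the multi-select field into one 0/1 indicator column per
    # category, rather than collapsing anyone into a single "primary" identity.
    indicators = responses["gender_identity"].str.get_dummies(sep=";")
    coded = responses[["participant_id"]].join(indicators)
    print(coded)

    # Categories are not mutually exclusive, so per-category percentages
    # can sum to more than 100 -- report each category on its own.
    print(indicators.mean().mul(100).round(1))

The data stay “not neat and tidy,” but in a form that standard cross-tabs and models can still work with.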

When we exclude necessary research elements because we lack the knowledge or are too concerned with whether the data will be publishable (statistical significance, the enemy of robust LGBTQI+ research. Kidding. Sort of.), we are left with results that are largely unreliable. While this shouldn’t hold us back from doing this work, it is incredibly important that we continue to explore ways to ask difficult questions and analyze complex responses in order to truly understand people’s lived experiences.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi there, Liz Zadnik here, new(ish) member of the aea365 curating team and sometimes Saturday poster. Last year Sheila posed the question: “What is it that YOU would like to read about on this blog?”

One of the responses resonated with me, as it represented my relationship with evaluation as a professional:

I would love to see a post, or series of posts about evaluation from the perspective of practitioners for whom their primary job is not evaluation. Perhaps tips on how to best integrate evaluation into the myriad of other, seemingly more pressing, tasks without pushing it to the back burner.

I work in the anti-sexual violence movement at a state coalition, focusing on prevention strategies, training, and making community-based rape crisis centers accessible to people with disabilities. These three areas are my priorities – each comes with deliverables and activities that don’t always include evaluation and assessment. Many times – given my love of evaluation – I am the sole voice at the table asking about an evaluation plan. Most of the time we can weave evaluation in from the ground floor; other times it happens a little late(r).

Hot Tip: Ask this (or a similar) question: “How will we know we’ve been successful?” This is the most effective way I have found to help get people thinking about evaluation. It has started some of the most engaging and enlightening conversations I’ve ever had, both about a project and the work of the movement.

Lesson Learned: Sometimes, evaluation takes a backseat to program implementation and grant deliverables. This can be disappointing (to say the least), but I do see a change. Funders are more frequently asking for research, “evidence,” or assessment findings, giving evaluation enthusiasts (like myself) an opening to engage our colleagues in this work.

Lesson Learned: Practice and challenge yourself, even if no one is ever going to see it. One of the ways I “integrate evaluation into the myriad of other, seemingly more pressing, tasks” is evaluating myself and my own performance. I regularly incorporate evaluative questions into training feedback forms, look for ways to assess the effectiveness of my technical assistance provision, and record my professional progress throughout the year. I sit in on as many AEA Coffee Break webinars and other learning opportunities as I can, always practicing the skills discussed and looking for ways to apply them to my work.

I would so appreciate hearing from other practitioners (and evaluators!) about their experiences infusing evaluation into their work. I’d also be happy to answer any questions you might have or write about specific projects in the future. Let me know – the aea365 team is here to please!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Deshonna Collier-Goubil and I am a young scholar (newly minted PhD) who has had the wonderful opportunity to collaborate with seasoned evaluators and practitioners throughout my graduate education. Most recently I collaborated with a group of practitioners and evaluators to contribute a chapter to a newly released volume written specifically for young scholars. Our book chapter highlights the importance of collaborating with practitioners.

Lessons Learned: Collaboration has many benefits for both evaluators and practitioners. For the evaluator, practitioners can help decrease barriers to rich data, improve the quality of the research, and strengthen the overall research process with the input and assistance of frontline workers. For practitioners, evaluators can aid in obtaining research funding, clarify research goals and expectations, and either highlight the need for institutional change or put sound research behind an excelling program.

Research collaborations can be transformative for both the evaluator and the practitioner. One should approach a collaboration with deliberation, willing both to teach and to learn. An array of cognitive, technical, and interpersonal skills is needed to develop and maintain an effective collaboration. A firm grasp on communication, trust, honesty, respect, commitment, and flexibility can make or break a collaborative relationship.

Keep in mind, however, that just as benefits exist, barriers may also arise during a collaborative effort. Evaluators should be as open and honest as possible with practitioners in initial negotiations to keep issues from popping up later. Discussing, for example, the division of labor, the purpose of the collaboration, timelines for completion, how the research will be conducted, data ownership, and how results will be communicated and disseminated can eliminate misunderstandings down the road. Other barriers to be cognizant of are blurred roles, divergent perspectives, differences in degree of institutional support, competing and conflicting goals, and communicating difficult results. When barriers do arise, evaluators should embrace the adversity and persist in the collaborative relationship. Overcoming these barriers can strengthen the collaboration.

Overall, in a model collaboration, evaluators and practitioners develop shared goals, with consensus on a few key practice and research standards. It requires an investment of time, resources, effort, and flexibility, and a willingness to think outside the box. Members of the collaboration learn to enter each other’s world and appreciate the other’s perspective. This is where the true learning begins.

Would you like to discuss evaluator-practitioner collaboration more with Deshonna and her colleagues? She’ll be contributing to a roundtable on the topic this November at Evaluation 2010, AEA’s Annual Conference.

