AEA365 | A Tip-a-Day by and for Evaluators


Hello! We are Natalie Wilkins and Courtney Barnard, and we are members of the Community Psychology TIG leadership team. Community and stakeholder engagement is a core principle of both evaluation and community psychology. As evaluators and community psychologists, we are naturally very interested in learning how stakeholder engagement strengthens our work and how we can continuously improve the ways we include communities in our work.

We have compiled some of our favorite resources for evaluating community collaborations and coalitions. We hope they will be helpful in your work too!

Rad Resources:

  • The National Cancer Institute has developed the Level of Collaboration Scale. The scale is a composite of other existing models and instruments, and its purpose is to assess collaboration among partners.
  • Check out these great tools from Fran Butterfoss and Coalitions Work:
    • Coalition Member Survey
      A 49-item survey for members to rate the coalition on aspects related to planning, implementation, leadership, local and statewide involvement, communication, participation, progress and outcomes.

The American Evaluation Association is celebrating CP TIG Week with our colleagues in the Community Psychology Topical Interest Group. The contributions all week come from CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi, my name is Courtney Barnard and I am a social worker at a children’s health care system in north Texas. I spend half my time out in the community coordinating a coalition and the other half evaluating seven other community-based coalitions and programs.

You can use evaluation to help plan for the sustained impact of a coalition’s efforts. The long-term goal is not always to sustain the coalition itself, or even its main activities, but to sustain the change in knowledge, attitudes, skills, and behaviors that have occurred because of the community’s work.

Our sustainability plans are based on three core concepts of sustainability, as outlined by Monte Roulier with Community Initiatives Network.

Hot Tip: Begin the sustainability conversation with a visioning activity, using this prompt: “Imagine that it is [three years in the future] and your coalition has been extraordinarily effective for the past three years. Write a letter to a colleague describing what’s now going on as a result of a fruitful past 3 years.”

  1. Building an effective coalition and “backbone” support – Successful coalitions share a common vision based on mutual benefits and prioritized strategies and outcomes aimed at shared goals. A strong backbone is needed to guide the group in articulating their goals and in planning how they will get there.

Lessons Learned: Backbone supports can offer structure (e.g., bylaws, coordinators), lead strategic planning processes, and engage others in the community. Although this structure takes time to develop, it is essential for long-term functioning.

Courtney's pyramid

 

  2. Employing the right mix of strategies – Use easily achievable actions to engage members and build momentum. These quick wins will help the coalition springboard into longer-term, more complex actions.

Rad Resource: Use the Spectrum of Prevention from the Prevention Institute to assess where your strategies fit. If your coalition chooses not to take action at one of the levels, make sure someone else in the community is acting on that level.

Hot Tip: Use this “formula” for Impact when prioritizing strategies/action steps to see how you can get the biggest bang for your buck.

Courtney's formula

  3. Securing diverse resources – Braid and pool separate sources of funding when possible and leverage non-financial resources (communication, staff, volunteers, office space, food, research, fiscal management, etc.).

Lessons Learned: Make your propositions compelling so no one can say “no”. Be clear and specific about real needs. Demonstrate results and your plan for future outcomes. And frequently praise contributors, no matter how small the contribution.

The American Evaluation Association is celebrating CP TIG Week with our colleagues in the Community Psychology Topical Interest Group. The contributions all week come from CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m Norma Martinez-Rubin, an independent evaluation consultant with a practice focused on evaluating disease prevention and health promotion projects. Small independent evaluation consultancies can offer clients the accessibility and nimbleness absent in larger organizations mired in their own bureaucracies. Our multiple roles as Chief Executive Officer, Chief Operations Officer, Chief Information Officer, and Chief Creative Officer require us to stay informed about managing ourselves, our content, our services, and our client relationships. I hope the following lessons and resources will help you succeed as a small independent evaluation firm.

Lesson Learned: Shifting career gears to launch a solo practice entails exploration, soul searching, and rethinking how to best balance personal life and professional interests. Hence, the attraction to sole proprietorship and its rewards: autonomy, schedule flexibility, and choice over the selection of work projects that coincide with personal values. Did you notice my exclusion of endless financial reward? That was intentional.

Going solo is among the riskiest career endeavors. I have learned that I must be ready to ride the financial waves associated with economic shifts and their effects on prospective clients. Contract work may not be as abundant when prospective clients are facing economic woes. The financial safety net (steady income, pre-packaged insurance and retirement benefits, paid travel, and professional development) offered by employers is absent. So, what to do? Short of obtaining a Master of Business Administration degree, identifying a business mentor is essential. Passion alone will not yield enough income to cover the costs of doing business.

Rad Resources:

  • Perhaps not so “rad” because of its governmental roots, yet a helpful orientation to the ins and outs of establishing and growing a business, the U.S. Small Business Administration is a resource for starting and managing a business, contracting, and obtaining loans and certifications as the business progresses.
  • SCORE – previously known as the Service Corps of Retired Executives and supported by the SBA – offers counseling and mentoring. Through the SBA or SCORE websites, one may sign up for free email news tailored to one’s location.
  • Social marketing options can provide visibility and broad reach for small businesses. As a service organization, it is essential to identify your value in order to build a strong online presence. Susan Chritton’s book, Personal Branding for Dummies, presents a fun, provocative approach to identifying your personal value as a service provider. After all, along with your content expertise, the core of your business is you. To thrive during the ups and downs of the business of independent consulting, developing that core strength is essential.

The American Evaluation Association is celebrating the Independent Consulting TIG (IC) Week. The contributions all week come from IC members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


We are Hilary Loeb and Kelly Bay of the Research and Evaluation Department at the College Success Foundation. Many of our scholarship and support programs host events at which we collect data from students and educators. As internal evaluators, we often rely on colleagues to collect and enter survey data from these groups. The results are used for internal staff learning and external reporting. To help evaluators increase survey relevance, decrease demands on respondents’ time, and ultimately boost data quality and response rates, below are tips on instrument design and data collection.

Lessons Learned:

Look for ways to make surveys easier for staff to administer up front and more useful to stakeholders at the back end. The key is keeping the main focus on your programs while building support for data collection and analysis efforts.

Hot Tips:

Survey Design:

  • Ensure that survey content is relevant: Meet with the entire program team and start with the question, “What do we want to learn about our program?” before discussing what’s needed for grant-reporting requirements.
  • Draft a survey using previously tested questions:  You don’t have to reinvent the wheel. By using previously tested survey questions from existing “banks” of items, you can save time and often improve the quality of the data collected (see Rad Resources).
  • Pilot test surveys with your program team and other stakeholders. This exercise never fails to elicit important feedback and takes only a modest amount of time. It’s amazing what fresh eyes can find! Where possible, use trainings and even Board meetings as opportunities to pilot and discuss surveys.

Survey Data Collection:

  • Be strategic about paper versus online surveys: When event participants can’t readily access computers, paper surveys may help increase response rates.  Online surveys are more appropriate when participants are able and willing to access technology.
  • Designate sufficient time and staff to collect survey data: Ensure that there is a specific time slot dedicated for survey completion. It should be near to but not at the very end of the event.  We suggest providing a script to help staff describe the survey’s purpose and value.
  • Consider using scanning software for paper surveys: Scanning software automates data entry by reading the optical marks on paper survey forms, which can reduce errors and save time. Before purchasing, it’s best to test. We piloted a free demo of Remark Office OMR to confirm that it was the right software for our organization.

Rad Resources:

A Bing search of survey item banks yields over 60 million results. Our favorites in the education and youth development field include the Ansell-Casey Life Skills Assessments, the Youth Risk Behavior Surveillance System, and National Center for Education Statistics resources.

The American Evaluation Association is celebrating Internal Evaluators TIG Week. The contributions all week come from IE members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m Ryan Watkins and I am an associate professor at George Washington University.  Among other things, I maintain the needs assessment resource website Gap In Results.  My topic today is how I remain current with research and practice in the many fields associated with needs assessment.

Hot Tips:

  • Start a Virtual Book Club. Book clubs help me to stay motivated to continually read the latest literature, and starting a club is simple.  My club meets four times a year.
    • We use Freeconferencecall.com to host calls since their service is free and downloadable MP3 recordings are available to share with those who cannot attend.
    • To manage the group, I use a spreadsheet on Google Docs to maintain the participant list and track potential books that the members of the club might enjoy.
    • We also use free Doodle polls to (a) select the books (I limit the choice to four or five and then the members select their preference) and (b) schedule times when the most members can join the book discussion. Based on this same model, a graduate student at another university also started an “articles club” to read and discuss six research articles each year.
  • Create a Personal Learning Network. In today’s world, finding all of the latest information on research and practice is too much for any one person. I found six colleagues around the globe who are interested in the topics I find most valuable, and we agreed to simply share the articles, books, blog postings, and other things we are reading. We also agreed that when you receive an email recommending a resource, there is no pressure to reply or comment; we are just sharing what we find, not starting dialogues (though individual members who want to discuss a resource are more than welcome to do so without copying the whole group).
  • Routinely Review Journals and Magazines. There are numerous publications on nearly every topic these days, and it is hard to keep up with all the information available. Once a year, however, I review the table of contents from each issue of many publications to identify articles of potential interest. My current list of journals and magazines is around 35, but you can start with just 5 or 10 that interest you most. Most publishers provide the table of contents from each issue on their website, and from there you can read the abstracts. You can also subscribe to RSS feeds that push the contents of each issue to your email every month, though I recommend setting up a separate email account for storing those messages. (A small scripting sketch for this kind of review appears below.)
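
If you would rather script part of that table-of-contents review, the sketch below shows one way to pull the latest entry titles from a journal’s RSS feed. It is a minimal illustration, not a tool the author describes using: it assumes Python with the third-party feedparser package installed, and the feed URL is a placeholder to be replaced with a real publisher feed.

    # Minimal sketch: list recent table-of-contents entries from a journal's RSS feed.
    # Requires the third-party "feedparser" package (pip install feedparser).
    # FEED_URL is a placeholder; substitute the publisher's actual feed address.
    import feedparser

    FEED_URL = "https://example.com/journal/rss"  # hypothetical feed address

    def latest_titles(url, limit=10):
        """Return up to `limit` (title, link) pairs from the feed."""
        feed = feedparser.parse(url)
        return [(entry.title, entry.link) for entry in feed.entries[:limit]]

    if __name__ == "__main__":
        for title, link in latest_titles(FEED_URL):
            print(title)
            print("  " + link)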

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

 


Susan Kistler on Joining AEA

I’m Susan Kistler, the American Evaluation Association’s Executive Director. Once each year I write explicitly to those of you who are not AEA members in hopes of encouraging you to join the association.

Hot Tip – Join AEA: Membership in AEA is only $80 per year, $60 if your primary membership is with the Canadian Evaluation Society, or $30 if you are a full-time student. It quickly pays for itself. You can join online.

Hot Tip – AEA members have access to four of the leading evaluation journals: As a member, you’ll receive hardcopy and electronic subscriptions to AEA’s own journals, The American Journal of Evaluation and New Directions for Evaluation, as well as electronic access to Evaluation Review and Evaluation and the Health Professions. You’ll receive not only the latest articles, but also access to 20+ years of archival content.

Hot Tip – AEA members can attend over 40 free Coffee Break Demonstrations each year: And, members can access the recordings in the members-only Coffee Break Archive. Want to get a taste for them? Check out the list of upcoming CBDs as well as the public list of those that are already in the archives.

Hot Tip – AEA members receive discounts on professional development (online and in person): Full members receive $50 off each one-day workshop, whether offered as an AEA eStudy (David Fetterman is offering Empowerment Evaluation this week; Tom Chapel is offering Intro to Evaluation, and Michelle Kobayashi is offering Survey Development next month) or in-person at the AEA annual conference, where you’ll find over 50 pre- and post-conference workshops.

Hot Tip – AEA members engage with the field’s thought leaders: For one week each month AEA hosts a leader in the field on its members-only Thought Leaders Forum. The discussant shares insights, explores issues, and answers your questions. You can lurk in the background and take it all in, or dive in and discuss. Next up? Gail Barrington in March.

Hot Tip – AEA members promote their business: Whether you are an independent consultant or part of a 100-person firm, AEA membership gives you a free listing of your evaluation-related services in AEA’s Find-an-Evaluator database. Directly accessible from the homepage, the FAE listings are among the most highly used sections of the AEA website.

Hot Tip – AEA members build their professional networks and get questions answered: AEA membership includes membership in up to five of AEA’s 50+ Topical Interest Groups (TIGs). From Data Visualization and Reporting to Multiethnic Issues in Evaluation, Independent Consulting, Quantitative Methods, and Evaluating the Arts and Culture, there is a TIG or two (or five) for everyone.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org.


Our names are Wendy Viola, Lindsey Patterson, Mary Gray, and Ashley Boal and we are doctoral students in the Applied Social and Community Psychology program at Portland State University.  This winter, we took a course in Program Evaluation from Dr. Katherine McDonald.  We’d like to share three aspects of the seminar that we felt made it so useful and informative for us.

  1. Classroom Environment. The format of the course encouraged open and interactive dialogue among the students and the instructor. The atmosphere was conversational and informal, allowing students the space to work through sticky issues and raise honest questions without fear of judgment. Regular course activities allowed us to consider creative approaches to program evaluation and develop activities that we brought to class for other students. For example, Dr. McDonald incorporated program evaluation activities, such as Patton’s activities to break the ice with stakeholders, and Stufflebeam’s (2001) “Program Evaluation Self-Assessment Instrument,” into our classroom activities.

Hot Tip: Engage students by facilitating an open and interactive environment that fosters discussion and creativity.

  2. Course Content. The course covered both evaluation practice and theory, including the historical and philosophical underpinnings of evaluation theories. Because gaining expertise in the theory and practice of program evaluation in a 10-week course is not possible, Dr. McDonald provided us with a tremendous number of resources to peruse on our own time and refer back to as we begin working on evaluations more independently.

Hot Tip: Provide students with templates, examples, and additional references for any activities or topics covered so that they have access to the resources they will need once the course is over.

  3. Applications. One of the most valuable aspects of the course was its emphasis on the application of theory to the real world. During the course, we developed and received extensive feedback on logic models, data collection and analysis matrices, and written and oral evaluation proposals. Additionally, we participated in a “career day” in which Dr. McDonald arranged a panel of evaluators who work in a variety of contexts to meet with our class to discuss careers in evaluation.

Hot Tip: Allow students to practice skills they will need in the real world and expose them to the diverse career opportunities in the world of program evaluation.

Our seminar only scratched the surface of program evaluation, but these features of the course provided us with a strong foundation in the field, and elicited excitement about our futures in evaluation.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi! I’m Brian Yates, Professor in the Department of Psychology, and Director of the Program Evaluation Research Laboratory (PERL), at American University in Washington, DC. I’ve also been the AEA Treasurer for the past 3 years, and am looking forward to serving for 3 more.

I’ve included cost as well as outcome measures in my quantitative and qualitative evaluations since the mid-1970s.

Lesson Learned – 1) Costs are not money. Money’s just a way to get access to the resources that make programs work. What matters for programs, and what I measure when I’m evaluating costs, are people’s time (clients’ as well as staff’s), space used, and transportation (often of clients to and from programs) … and not just the total time spent working in the program, but the amount of time spent in the different activities that, together, are the program.

Hot Tip: When asking stakeholders about program costs, I make a table listing the major activities of the program (therapy, groups, education, for example) in columns and the major resources used by the program (staff and client time, office space, transportation, for example) in rows. Different stakeholders put the amount of each resource that they use in each activity, and then compare others’ entries with their own. Insights into program operations often ensue!
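
To make the Hot Tip concrete, here is a small, purely illustrative sketch of such a resource-by-activity table. The activities, resources, and quantities are hypothetical placeholders, not figures from any actual program, and the sketch assumes Python with the pandas package.

    # Hypothetical resource-by-activity table: resources as rows, activities as columns.
    # All quantities are illustrative placeholders; units differ by row, so totals
    # are only meaningful within a row.
    import pandas as pd

    activities = ["Therapy", "Groups", "Education"]
    resources = {
        "Staff time (hours/week)":     [30, 12, 8],
        "Client time (hours/week)":    [40, 25, 15],
        "Office space (square feet)":  [400, 600, 300],
        "Transportation (trips/week)": [20, 10, 5],
    }

    cost_table = pd.DataFrame(resources, index=activities).T
    cost_table["Total"] = cost_table.sum(axis=1)
    print(cost_table)

Each stakeholder could fill in a blank copy of such a table, and comparing entries row by row is what surfaces differing views of where time, space, and transportation actually go.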

Lesson Learned – 2) The most valuable resources may not have a price. Many programs rely on volunteered time and donated space and materials; these often don’t come with a monetary price attached. One can assign a monetary value to these resources according to what the same time from the same person would be paid in a job, but the most important things to measure are the amount of time, the capabilities of the person, and the ways they spent their time.
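
For recording that valuation, a simple hours-times-rate calculation keeps the hours visible alongside the dollar figure. The numbers below are hypothetical, and the wage equivalent is an assumption you would choose and document yourself.

    # Sketch of valuing volunteered time: keep hours as the primary unit and attach
    # a monetary value using a documented wage equivalent. All figures are hypothetical.
    volunteer_hours = 120      # hours donated this quarter (placeholder)
    wage_equivalent = 22.50    # hourly rate for comparable paid work (placeholder)

    monetary_value = volunteer_hours * wage_equivalent
    print(f"{volunteer_hours} hours, valued at ${monetary_value:,.2f} "
          f"(assuming ${wage_equivalent:.2f}/hour)")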

Lesson Learned – 3) When measured only as money, cost findings are instantly obsolete and do not aid replication. Inflation can quickly make specific monetary values for program costs out of date and, all too soon, laughably low. Translating 1980 dollars into 2011 dollars is possible, but still does not inform planners as to what specific resources are needed to replicate a program in another setting.
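
For the dollar translation itself, one common approach is to scale by the ratio of consumer price indices. The sketch below uses approximate annual-average CPI-U values for 1980 and 2011 (roughly 82.4 and 224.9); these are assumptions for illustration and should be checked against Bureau of Labor Statistics figures before use.

    # Sketch of converting 1980 program costs to 2011 dollars using a CPI ratio.
    # CPI-U annual averages below are approximate; verify against BLS data.
    CPI_1980 = 82.4
    CPI_2011 = 224.9

    def adjust_to_2011_dollars(amount_1980):
        """Convert a 1980 dollar amount to approximate 2011 dollars."""
        return amount_1980 * (CPI_2011 / CPI_1980)

    print(f"$10,000 in 1980 is roughly ${adjust_to_2011_dollars(10_000):,.0f} in 2011 dollars")

Even with the conversion, the point stands: the dollar figure alone does not tell a planner how many staff hours or how much space a replication would need, which is why the resource amounts themselves are worth reporting.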

Lesson Learned – 4) When presenting costs, keep resources in their original units. Yes, time is money … but it comes in units of hours to begin with. Report both, and your audience will learn not just price but what it takes to make the program happen.

Rad Resource: Here’s a free, downloadable online manual I wrote on formative evaluation of not only cost, but also cost-effectiveness and cost-benefit … and not just for substance abuse treatment! http://archives.drugabuse.gov/impcost/IMPCOSTIndex.html

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

