AEA365 | A Tip-a-Day by and for Evaluators

TAG | free tools

Hi there! We’re Anjie Rosga, Director, and Natalie Blackmur, Communications Coordinator, at Informing Change, a strategic consulting firm dedicated to increasing effectiveness and impact in the nonprofit and philanthropic sectors.  In working with clients large and small, we’ve found that organizations are in a better position to learn if they take the time to prepare and build their capacity to evaluate. To facilitate this process, Informing Change developed the Evaluation Capacity Diagnostic Tool  to measure an organization’s readiness to take on evaluation.

Rad Resource: The extent to which evaluation translates into continuous learning depends in large part on the organizational culture and the level of evaluation experience among staff. These are the two primary categories—themselves divided into six smaller areas of capacity—in the Evaluation Capacity Diagnostic Tool. The tool is a self-assessment survey that organizations can use on their own, in preparation for working with an external evaluator, or alongside an external evaluator. A lower score indicates that an organization should, for example, focus on developing outcomes and indicators, tracking a few key measures, or creating simple data collection forms to use over time. The higher the score, the higher the evaluation capacity; staff may then be able to collect more types and a greater volume of data, design more sophisticated assessments, and integrate and commit to making changes based on lessons learned.
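
To make the scoring logic concrete, here is a minimal sketch of how a summary score like this could be computed and mapped to a recommendation. The 1-to-5 scale, the 3.0 cutoff, and the example area names are illustrative assumptions, not the tool’s actual scoring rules.

    from statistics import mean

    # Illustrative only: the real tool groups six capacity areas under two
    # categories (organizational culture and evaluation experience); the
    # scale, cutoff, and area names here are assumptions for this sketch.
    def summary_score(area_ratings):
        """`area_ratings` maps each capacity area to a list of 1-5 self-ratings."""
        area_means = {area: mean(vals) for area, vals in area_ratings.items()}
        overall = mean(area_means.values())
        if overall < 3.0:
            advice = "Start small: define outcomes and track a few key measures."
        else:
            advice = "Build out: collect richer data and design deeper assessments."
        return overall, area_means, advice

    # Hypothetical example areas, not the tool's real category names:
    print(summary_score({"leadership support": [3, 4], "staff experience": [2, 3]}))

The per-area breakdown is what fuels the collective discussion described next: it shows where capacity is strong and where to focus first.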

However, there’s more to the Evaluation Capacity Diagnostic Tool than the summary score. It is a powerful way to catalyze a collective discussion and raise awareness about evaluation. Taking stock and sharing individuals’ perceptions of their organization’s capacity can jumpstart the process of building a culture that’s ready to evaluate and implement learnings.

Hot Tip: Make sure everyone is on the same page. Especially if an organization is inexperienced in evaluation, it’s important to discuss the vocabulary in the Tool and how it compares with individuals’ own definitions.

Hot Tip: Assessing evaluation capacity can be a tough sell. Organizations come to us because they’ve made the decision to begin evaluation, but gauging their capacity to do so can feel like a setback. To get organizations on board, we frame evaluation capacity as an investment in building a learning culture and the infrastructure that can make the most of even relatively limited data collection efforts.

We’d love to hear from folks who have implemented or reviewed the tool! Feel free to reach out to us at news@informingchange.com.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· ·

I am Anna Douglas, and I conduct evaluation and assessment research with Purdue University’s Institute for Precollege Engineering Research, also known as INSPIRE. This post is about finding and selecting assessments to use in evaluating engineering education programs.

Recent years have seen an increase in science, technology, engineering, and mathematics (STEM) education initiatives and emphasis on bringing engineering learning opportunities to students of all ages. However, in my experience, it can be difficult for evaluators to locate assessments related to learning or attitudes about engineering. When STEM assessment instruments are found, oftentimes they do not include anything specifically about engineering. Fortunately, there are some places devoted specifically to engineering education assessment and evaluation.

Rad Resource: INSPIRE has an Assessment Center website, which provides access to engineering education assessment instruments and makes the evidence for validity publicly available. In addition, INSPIRE has links to other assessment resources, such as Assessing Women and Men in Engineering, a program affiliated with Penn State University.

Rad Resource: ASSESS Engineering Education is a search engine for engineering education assessment instruments.

If you don’t find what you are looking for in the INSPIRE, AWE, or ASSESS databases, help may still be available.

Lesson Learned #1: If it is important enough to be measured for our project, someone has probably measured it (or something similar) before. Even though evaluators may not have access to engineering education or other educational journals, one place to search is Google Scholar, using keywords related to what you are looking for. This helps to 1) locate research being conducted in a similar area of engineering education (the researchers may have used some type of assessment) and 2) locate published instruments, which one would expect to have a degree of validity evidence.

Lesson Learned #2: People who develop surveys generally like others to use them. It’s a compliment. It is OK to contact the authors for permission to use the survey and the validity evidence collected, even if you cannot access the article. At INSPIRE, we are constantly involved in the assessment development process. When someone contacts us about using an instrument, we view that as a “win-win”: the evaluator gets a tool, our instrument gets used, and with the sharing of data and/or results, we gain further information about how the instrument functions in different settings.

Lesson Learned #3: STEM evaluators are in this together. Another great way to locate assessment instruments is to post to the STEM RIG on LinkedIn, or pose the question to the EvalTalk listserv. This goes back to Lesson Learned #1: most of the important outcomes are being measured by others.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· ·

Greetings! This is June Gothberg, Senior Researcher at Western Michigan University. A few years ago, I became involved with AEA’s Potent Presentations Initiative and worked with Stephanie Evergreen to incorporate universal design principles. I currently hold a position on the p2i advisory board. What I didn’t anticipate when I started working with the group is how much p2i would change my presentation worldview. At conferences, or when watching any presenter, I find myself reflecting on key p2i principles. I’ll say things like, “too many bullets” or “that picture doesn’t bleed off the page.” Today, I thought I’d share my own lessons learned.

Lessons Learned:

Evaluators need presentation skills.  As professional evaluators we are often called upon to provide an overview of evaluation results.  Our presentation skills and message can directly impact an organization’s evaluation use.

Often, you don’t know what you don’t know. I think we’ve all sat through mind-numbing presentations. I’ve always blamed it on a boring speaker with poor delivery. What I didn’t know was that delivery is just one component of a potent presentation. Potent presenters need to attend to:

  • Message
  • Design
  • Delivery

I highly recommend these two p2i tools: the Presentation Assessment Rubric and The Messaging Model.

If you are unsure where to begin, start with eliminating bullets. As a former classroom teacher, this was difficult for me. I thought that if I didn’t put my content in bullets, the students wouldn’t learn what I intended. The problem with bullets is that your audience can read your slides faster than you can read them aloud. When you use bullet points, you risk reducing your presentation to a read-aloud session (BORING!). Research shows that text-heavy slides not only correlate with boring presentations but also reduce learning. Cognitive researcher Chris Atherton found that “sparse slides” increased memory and attention.

If I don’t give bullet points, then how will they remember what I said?

  • Find an image to represent your point.
  • If you feel you must use bullets, use only one per slide. Here is an example from my own slide deck:
[p2i example slides]

  • Give handouts. One thing I’ve adopted from the field of Universal Design (UD) is handouts with key points. For audience members with visual or hearing challenges, handouts increase the ability to participate. They also give your whole audience a space to take notes and follow along with key points without distracting attention from the presenter.

The devil is in the details, and details take time. Through our work with p2i, we’ve found that you need to begin at least three months in advance to create a potent presentation. A good planning tool with a timeline for preparing presentations is the p2i Presentation Preparation Checklist.

Ensure your presentations are accessible to all.  For ideas to include all people please refer to Creating Presentations Potent for All.

Rad Resources: All of the p2i tools mentioned above (the Presentation Assessment Rubric, The Messaging Model, the Presentation Preparation Checklist, and Creating Presentations Potent for All) are available through AEA’s Potent Presentations Initiative.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · · ·

Hi, I’m Annaliese Calhoun, Project Manager at the Center for Public Health Systems Science at Washington University in St. Louis. My team has been developing the Program Sustainability Assessment Tool, a new tool designed to help programs rate their sustainability capacity. We define sustainability capacity as the ability to maintain programming and its benefits over time.

After reliability testing and refinement, the 40-item Program Sustainability Assessment Tool is now available in a free online format, along with other great resources on sustainability planning.

Lessons Learned: Program sustainability capacity is about more than just funding. Through our research, we’ve identified eight key domains that influence a program’s ability to continue providing services and their benefits over time. One of these key domains is program evaluation. The full list includes:

[Program sustainability framework graphic]

  • Political Support: Internal and external environments that support your program
  • Funding Stability: Establishing a consistent financial base for your program
  • Partnerships: Cultivating connections between your program and its stakeholders
  • Organizational Capacity: Having the internal support and resources needed to effectively manage your program
  • Program Evaluation: Assessing your program to inform planning and document results
  • Program Adaptation: Taking actions that adapt your program to ensure its ongoing effectiveness
  • Communications: Strategic communication with stakeholders and the public about your program
  • Strategic Planning: Using processes that guide your program’s directions, goals, and strategies

Building program sustainability capacity requires assessment and planning. The Program Sustainability Assessment Tool was designed to identify a program’s areas of sustainability strength and challenge. Program staff and stakeholders can then use results from this assessment to inform sustainability planning.
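
As a rough illustration of how assessment results like these can feed planning, here is a minimal Python sketch that averages item ratings by domain and flags the weakest domains as planning priorities. The number of items per domain and the seven-point rating scale are assumptions for the sketch, not the tool’s published scoring rules.

    from statistics import mean

    # The eight domains named above; item counts per domain and the
    # seven-point rating scale are illustrative assumptions only.
    DOMAINS = [
        "Political Support", "Funding Stability", "Partnerships",
        "Organizational Capacity", "Program Evaluation",
        "Program Adaptation", "Communications", "Strategic Planning",
    ]

    def domain_scores(responses):
        """Average each domain's item ratings. `responses` maps a domain
        name to a list of ratings, from one respondent or pooled from many."""
        return {d: round(mean(responses[d]), 2) for d in DOMAINS}

    def planning_priorities(scores, n=3):
        """Return the n lowest-scoring domains as candidates to address first."""
        return sorted(scores, key=scores.get)[:n]

Pooling every respondent’s ratings before averaging gives roughly the kind of group summary report the website described below produces.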

Rad Resource: I’m excited to announce the launch of a new website, www.sustaintool.org, featuring the Program Sustainability Assessment Tool and sustainability planning resources. You can use this site to:

  • Deepen your understanding of the factors affecting sustainability capacity.
  • Create a free account and assess your program’s sustainability capacity. When you’re finished, you’ll get a summary sustainability report.
  • Invite other people to assess the same program and have all the responses totaled in a group summary sustainability report.
  • Explore tools and resources for developing a program sustainability plan.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

·

Greetings from the University of Chicago. I am Sarah Rand, Associate Project Director and Research and Evaluation Associate at the Center for Elementary Math and Science Education (CEMSE).

A challenge for every evaluator is to present findings in an informative way that will be useful to the client. This summer our team evaluated Google’s computer science camp for high school students, and this project was a great opportunity to create our first online evaluation report. Our intention was to create a report that was easy to share with others, interactive, engaging, and readable, and that, of course, shared our data and findings. There were many issues to consider in creating an online report, such as confidentiality, privacy, what information to make available as a PDF, and the layout.

Lessons Learned:

  • It takes a team to put together an online report. Our evaluation team worked together to create a sketch of our ideas for the layout, which we then shared with our center’s web developers. We worked closely with the web team to put our ideas onto the webpage and then iterated on the design and content as the site was developed.
  • We included video clips and direct quotes from interviews with students and faculty members in the online report, which is a more objective way to present findings. Viewing and reading data in this form was particularly powerful for our client.
  • A report in this format does not necessarily require a presentation. It may be best to share the online report with your client, allow them to explore it on their own, and then answer any questions they may have.

Rad Resource:

  • Creating an online report from scratch can feel like a big job. Asana is a great free online tool for project management, and it helps break the big task down into smaller, more manageable pieces. There are many tasks involved in creating an online report, including choosing photos, creating the layout, deciding which content should go where, and uploading video and audio. Asana helps you manage all of the tasks and assign them to the correct person. Asana is also helpful in keeping track of the status of various pieces of the project, so you can complete the project by your deadline.

Rad Resource I wish I had:

  • Examples of other online evaluation reports. I did some searching for previously created reports and could not find any. This may be because they are private, as is the case for the report we created, or because they just aren’t that common.

Feel free to get in touch if you want to learn more: email me or visit the Center for Elementary Math and Science Education to learn about our evaluation work!

The American Evaluation Association is celebrating Chicagoland Evaluation Association (CEA) Affiliate Week with our colleagues in the CEA AEA Affiliate. The contributions all this week to aea365 come from our CEA members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


· ·

My name is Kate Rohrbaugh and I am Co-Chair of the Business, Leadership, and Performance TIG along with Michelle Baron.  I’m a Research Team Leader at a consulting firm in Virginia leading a group studying capital project organizations and teams in the process industries.  Today I’d like to talk about the renaming of our TIG and the tools we used to conduct this work.

When I accepted my current position five years ago, I had to rethink my AEA TIG membership because I had been a faithful member of education-related TIGs, which were no longer relevant. The number of TIGs at AEA can be overwhelming at times, but it also offers a wide variety of “homes” to evaluators, regardless of content area. In my new position I turned to the Business and Industry TIG, where I found a small but dedicated group of professionals. I “lurked” with this group for a year, and within a short time (since it was a smaller group), I was able to take an active role in the leadership of this TIG.

In discussions with the leadership of the TIG and of AEA, we determined that the name of the TIG was unnecessarily limiting both presenters and audience – evaluation issues in for-profit organizations are relevant to a wide variety of evaluation professionals in both the private and public sectors. For this reason, we canvassed the membership and, working closely with the AEA staff and board, identified a new name for our TIG.

Rad Resources

  • AEA maintains a list of members in each TIG and faithfully protects the membership from unnecessary contact, but this list was a great resource for contacting our members about the desire to change the TIG’s name and for soliciting ideas for a new one.
  • To canvass our membership, we turned to old faithful SurveyMonkey, which met our simple needs for collection and analysis.
  • To discuss the results with the TIG leadership located across and outside the United States, we turned to FreeConferenceCall.com, which is exactly what you think it is.

We are excited about AEA 2012 in Minneapolis and hope to see lots of new faces at our presentations and business meeting!

The American Evaluation Association is celebrating the Business, Leadership, and Performance TIG (BLP) Week. The contributions all week come from BLP members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · · ·

We are Hilary Loeb and Kelly Bay of the Research and Evaluation Department at the College Success Foundation. Many of our scholarship and support programs host events at which we collect data from students and educators. As internal evaluators, we often rely on colleagues to collect and enter survey data from these groups. The results are used internally for staff learning and externally for reporting. To help evaluators increase survey relevance, decrease demands on respondents’ time, and ultimately boost data quality and response rates, below are tips on instrument design and data collection.

Lessons Learned:

Look for ways to make surveys easier for staff to administer up front and more useful to stakeholders at the back end. The key is keeping the main focus on your programs while building support for data collection and analysis efforts.

Hot Tips:

Survey Design:

  • Ensure that survey content is relevant: Meet with the entire program team and start with the question, “What do we want to learn about our program?” before discussing what’s needed for grant-reporting requirements.
  • Draft a survey using previously tested questions:  You don’t have to reinvent the wheel. By using previously tested survey questions from existing “banks” of items, you can save time and often improve the quality of the data collected (see Rad Resources).
  • Pilot test surveys with your program team and other stakeholders. This exercise never fails to elicit important feedback and takes only a modest amount of time. It’s amazing what fresh eyes can find! Where possible, use trainings and even Board meetings as opportunities to pilot and discuss surveys.

Survey Data Collection:

  • Be strategic about paper versus online surveys: When event participants can’t readily access computers, paper surveys may help increase response rates.  Online surveys are more appropriate when participants are able and willing to access technology.
  • Designate sufficient time and staff to collect survey data: Ensure that a specific time slot is dedicated to survey completion, near to but not at the very end of the event. We suggest providing a script to help staff describe the survey’s purpose and value.
  • Consider using scanning software for paper surveys: Scanning software automates data entry by reading the optical marks on paper survey forms, which can reduce errors and save time. Before purchasing, it’s best to test: we piloted a free demo of Remark Office OMR to confirm that it was the right software for our organization.

Rad Resources:

A Bing search for survey item banks yields over 60 million results. Our favorites in the education and youth development field include the Ansell-Casey Life Skills Assessments, the Youth Risk Behavior Surveillance System, and National Center for Education Statistics resources.

The American Evaluation Association is celebrating Internal Evaluators TIG Week. The contributions all week come from IE members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· ·

I’m Ryan Watkins and I am an associate professor at George Washington University.  Among other things, I maintain the needs assessment resource website Gap In Results.  My topic today is how I remain current with research and practice in the many fields associated with needs assessment.

Hot Tips:

  • Start a Virtual Book Club. Book clubs help me to stay motivated to continually read the latest literature, and starting a club is simple.  My club meets four times a year.
    • We use Freeconferencecall.com to host calls since their service is free and downloadable MP3 recordings are available to share with those who cannot attend.
    • To manage the group, I use a spreadsheet on Google Docs to maintain the participant list and track potential books that members of the club might enjoy.
    • We also use free Doodle polls to (a) select the books (I limit the choice to four or five, and then the members select their preference) and (b) schedule times when the most members can join the book discussion. Based on this same model, a graduate student at another university also started an “articles club” to read and discuss six research articles each year.
  • Create a Personal Learning Network. In today’s world, finding all of the latest information on research and practice is too much for any one person. I found six colleagues around the globe who are interested in the topics I find most valuable, and we agreed to simply share articles, books, blog posts, and other things we are reading. We also agreed that when you receive an email recommending a valuable resource, there is no pressure to reply or comment; we are just sharing what we find, not starting dialogues (though individual members who want to discuss a resource are welcome to do so without copying the whole group).
  • Routinely Review Journals and Magazines. There are numerous publications on nearly every topic these days, and it is hard to keep up with all the information available. Once a year, however, I review the Table of Contents of each issue of many publications to identify potential articles of interest. My current list of journals and magazines is around 35, but you can start with just 5 or 10 that interest you most. Most publishers provide the Table of Contents of each issue on their website, and from there you can read the abstracts. You can also subscribe to RSS feeds that will push the contents of each issue to your email every month, though I recommend setting up a separate email account for storing those emails. A minimal scripted alternative is sketched below.
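
For those who prefer pulling tables of contents rather than filling an inbox, here is a minimal sketch using the third-party feedparser library; the feed URLs are hypothetical placeholders, so substitute the real TOC feeds of the journals you follow.

    # pip install feedparser
    import feedparser

    # Hypothetical placeholder URLs; replace with each journal's real TOC feed.
    TOC_FEEDS = [
        "https://example.com/journal-one/toc.rss",
        "https://example.com/journal-two/toc.rss",
    ]

    for url in TOC_FEEDS:
        feed = feedparser.parse(url)
        print(feed.feed.get("title", url))   # journal name, or the URL if missing
        for entry in feed.entries[:10]:      # latest ten items per feed
            print("  -", entry.get("title", "(untitled)"))

Run it once a month and you get a skimmable list of new article titles without touching your email.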

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

 

· · · ·

We’re Amy Schaller and Bryna Koch, Evaluation Specialists at the University of Arizona, Cooperative Extension Services.

How do you support an inaugural cross-site evaluation for a national initiative while incorporating technology and building grantee capacity? This was the charge of the CYFERnet (Children, Youth, and Families Evaluation & Research Network) Evaluation Team at the University of Arizona. Our vision was to create a website that would serve as a central platform to house a range of evaluation resources, tools, and training materials for grantees. And so, CYFERnetSEARCH.org was born. The evaluation capacity-building website, designed to support the efforts of the Children, Youth, and Families At-Risk (CYFAR) initiative funded by USDA-NIFA, is also available to the public.

The resources we developed for the site included content for eight learning modules (including quizzes and videos), a searchable database of vetted evaluation instruments, a user-account feature, and online logic model and survey builders. All of these features were developed from the ground up, and much energy was invested in generating original content, design, and features: no small undertaking. The process was, and remains, highly collaborative, with the features on the website evolving numerous times since they were first developed.

Lessons Learned – DIY web development is fairly common these days, but doing it successfully requires strategic vision and input from outside sources. Here are some “behind the scenes” tips from our experience to help those of you whose evaluation work intersects with website development:

  • If you can, work with web-design professionals.  Working across disciplines allows fresh thinking and combined perspectives. Solicit input from professionals with varying backgrounds.
  • Do your design research.  Select websites you like to use and consider what makes those sites appealing to you. Is it the feel or the look? How do they provide information? How do they make that information accessible?
  • Choose the tone for your site.  What feeling do you want to convey to your user?  What colors, images, and features communicate the essence of your project/company?
  • Do your platform research. Drupal, WordPress, Joomla? Should you consider open-source? How much modification is needed so the platform works for you?
  • Be the user. Interact with your site as a user would on a daily basis.
  • Organization and presentation matter. How can the content be intuitively grouped into themes that enable easier navigation for your users?
  • Consider the pace of technology and industry.  Will your design, platform and content be relevant in one or two years?

Rad Resources –

cyfernetsearch.org – Our website

Trello.com – A free online collaboration tool for project management

The American Evaluation Association is celebrating Arizona Evaluation Network (AZENet) Affiliate Week with our colleagues in the AZENet AEA Affiliate. The contributions all this week to aea365 come from our AZENet members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · · · · ·

I’m Kelly Hannum. I’ve been evaluating leadership development programs for almost two decades. I am convinced that effective leaders and effective evaluators have similar mindsets and employ similar skills. I encourage leaders to think like evaluators, and via this post I’m encouraging evaluators to think of, and develop, themselves as leaders.

At the Center for Creative Leadership, we say “effective leadership results in shared direction, alignment, and commitment.” Leaders help focus people on defining and achieving something of shared value, but effective leadership is often a collective act. How often have you worked with diverse stakeholders to create shared direction, alignment, and commitment related to an evaluation? Stakeholders often have different values and perspectives. Our role as evaluators is to lead effectively and respectfully in these complex situations, in a manner that reflects our Guiding Principles. What does “value” look like from different perspectives? What types of evidence of “value” are appropriate? Our training and experience are powerful assets, but if left unchecked our assumptions can be a liability. Thinking of, and developing, ourselves as leaders can help us improve our evaluation practice.

Lessons Learned:

Be curious about yourself. Self-awareness is the foundation for being a good leader and for being a good evaluator. Understand your assets and limitations, plan accordingly, and continue to develop yourself. Challenge assumptions that may get in the way of understanding value from different perspectives.  Seek, consider, and apply feedback about yourself.

Be curious about others. Pay attention to other perspectives; that is the foundation for respect and for understanding complex situations. Examine and reexamine your perceptions, beliefs, assumptions, and stereotypes about individuals, groups, and even programs and processes. Seek different perspectives and listen with curiosity and openness.

Hot Tip:

Reflect on how you create shared direction, alignment, and commitment. Think about keeping a journal or having informal conversations or debriefs after key meetings.

Most successful development experiences contain elements of assessment, challenge, and support – are you balanced? What do you need to add or reduce?

  • Assess yourself from different perspectives to uncover areas of excellence as well as areas for growth
  • Challenge yourself by learning about and trying new things
  • Get the support you need to be effective

Rad Resources:

Track your reflections using a free online journal like Penzu.

The Leadership Learning Community offers a collection of free leadership development resources, including resources on evaluating leadership development.

The Center for Creative Leadership offers free articles and podcasts. The white papers are particularly helpful.

AEA’s Statement on Cultural Competence in Evaluation provides an overview of cultural competence, why it is important, and how to develop it.

Flipping the Script: White Privilege and Community Building is useful to my work.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · · ·
