AEA365 | A Tip-a-Day by and for Evaluators

Hi, this is Tania Jarosewich of Censeo Group, a program evaluation firm in northeast Ohio, and Linda Simkin of Action Research Associates of Albany, New York. We worked on different aspects of the evaluation of KnowHow2GO, an initiative funded by Lumina Foundation to strengthen college access networks. We are excited to share with you the College Access Network Survey, a resource that Linda helped to create as part of the Academy for Educational Development (AED) evaluation team. The network survey is a tool for gathering network members’ perspectives on their engagement with a network and on the network’s effectiveness and outcomes.

During implementation of KnowHow2GO, the AED technical assistance team, with Linda’s help, identified five dimensions of an effective network: network management, sustainable services systems, data-driven decision-making, policy and advocacy, and knowledge development and dissemination. This framework helped guide the development of the survey, the technical assistance, and the evaluation of network-building efforts.

As part of the evaluation, KnowHow2GO grantees invited members of their statewide or regional networks to respond to the survey. The Network Survey provided useful information for the foundation, initiative partners, technical assistance providers, network leaders, and network members to plan technical assistance and professional development, and allowed networks to monitor network health. With minor changes, the survey can be applied to network efforts focused on different content or service areas.

Lesson Learned: Support grantees’ use and analysis of the Network Survey. Network leaders focused on their work – not on evaluation. Letters introducing the survey, an informational webinar, support in monitoring response rates, and individual troubleshooting helped encourage grantees to engage network members in the survey.

Lesson Learned: Provide targeted technical assistance and professional development based on survey findings. The survey results allowed technical assistance providers to target their support and reinforced the usefulness of the survey instrument and process.

Lesson Learned: Use Network Survey results to show progress toward network outcomes. Information about the strengths of each network was useful for the funder and participating networks. The survey results were triangulated with other evaluation data to provide a comprehensive analysis of growth in the network-building process.

Rad Resource: You can obtain a copy of the College Access Network Survey and guidelines for its use from Carrie Warick, Director of Partnerships and Policy, National College Access Network (NCAN), WarickC@CollegeAccess.org, 202-347-4848 x203. The survey can be adapted for use with networks focused on various content areas.

Rad Resource: Keep an eye out for a longer article about the Network Survey that will appear in an upcoming issue of the Foundation Review. You can also access additional resources about the Network Survey here – handouts (free from the AEA Public eLibrary) and a Coffee Break webinar recording (free for AEA members only).

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · ·

My name is Maran Subramain and I am a graduate student at Western Michigan University. I served as a session scribe at Evaluation 2010 and attended session number 577, Communication: At What Level Does It Help or Hinder Evaluation Capacity? I chose this session because, as a fairly new graduate student in the evaluation program, I am interested in understanding the competencies an evaluator needs, and communication, or interpersonal skill, is one of them.

Lessons Learned: This session covered how volunteer board members can better serve an evaluation association, as well as the challenges and benefits of program evaluation in a small school district. Here are some of the takeaways I gained:

  • Small evaluation associations should limit the number of projects they are involved in and should form strong committees for some of those projects. By doing so, the association can better focus on and manage the projects.
  • Attending a lot of board meetings and replying to a large volume of emails could be a burden to volunteer board members. Closer attention must be given to communication during the initial stages of board meetings or in emails so that ineffective communication between volunteer board members and information seekers can be avoided.
  • The second presentation explained the challenges and benefits of a program evaluation conducted in a small town in central Florida. The evaluators benefitted from direct communication with major stakeholders, easy access to schools, the program, and families, and less bureaucratic decision making.
  • Challenges included the evaluators being viewed as ‘outsiders’ to the program while collecting data, and changes being hard to make because of various parties’ personal interests.
  • Small-scale program evaluation can be as challenging as, or more challenging than, evaluation in a large school setting.

Hot Tip: I think there is great potential for this topic to be developed further. For example, would particular skills such as positivity and openness improve evaluator-stakeholder communication? Or can nonverbal cues, such as ‘toning down’ the evaluators’ dress code when they visit very poor families, help evaluators gain more information? These ‘soft skills’ are not explored widely in evaluation, and they may bring many benefits if studied properly.

At AEA’s 2010 Annual Conference, session scribes took notes at over 30 sessions and we’ll be sharing their work throughout the winter on aea365. This week’s scribing posts were done by the students in Western Michigan University’s Interdisciplinary PhD program. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice.

· · ·

Hi, my name is Susan Geier and I am a doctoral student at Purdue University studying research methods, measurement and evaluation. I employ a participatory evaluation approach with the GEMscholar project and have learned much from the Native American college students and the dedicated program staff.

Lessons Learned: I would like to share my three R’s for participatory evaluation:

1. Build Rapport: In addition to conducting formal interviews and assessments, I interacted informally with the students and mentors when time allowed, during meals and in between activities. I spent time learning about Native American history and culture from the project team and students.

2. Demonstrate Relevance: I discussed with the stakeholders and participants possible benefits of the evaluation process and their unique roles in the improvement and success of the program components. For example, when the students expressed interest in helping future GEMscholars, a peer-mentoring option was added to the program. Consequently, students began to see the evaluation process as a mechanism for sharing their experiences and suggestions instead of an outside critique of their lives and activities.

3. Maintain Responsiveness: I provided the stakeholders with information in a timely and accessible format. Often these were oral reports followed by brief documents outlining the changes discussed. We had conversations about the issues that could not be resolved in a timely manner and their possible effects on the program. In turn, the project team made ongoing changes, adding components where needed and modifying those elements that were not serving the objectives of the program. Assessments were modified as needed and the process continued.

Hot Tip: Journaling is a useful technique for capturing real-time reactions to interventions. This is particularly important when working with groups who are being introduced to unfamiliar and/or uncomfortable experiences as part of an intervention. I worked closely with the researcher and program coordinator to develop pertinent guiding questions for the students’ and mentors’ daily reflection journals. This is also a good time to develop an analysis rubric, if applicable. Journals can be handwritten or online (I provide a link to an online journal using Qualtrics). The journal entries provide a project team with valuable insights into how the program elements are perceived by all involved.
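
For teams that export journal entries and code them against an analysis rubric, here is a minimal sketch of one way the codes might be tallied. It is only an illustration in Python; the file name, column names, and rubric codes are hypothetical, not from the GEMscholar project.

```python
# Illustrative sketch only: the rubric codes, file name, and column names are
# hypothetical. It tallies rubric codes applied to daily reflection-journal
# entries exported from an online tool such as Qualtrics.
import pandas as pd

# One row per coded journal entry: who wrote it, which program day, and the
# rubric code the analyst assigned (e.g., "engagement", "discomfort").
entries = pd.read_csv("journal_codes.csv")   # columns: author, day, code

# Overall frequency of each rubric code.
print(entries["code"].value_counts())

# Code counts by program day, to watch how reactions shift over time.
print(entries.pivot_table(index="day", columns="code", aggfunc="size", fill_value=0))
```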

If you want to learn more from Susan, check out the Poster Exhibition on the program for Evaluation 2010, November 10-13 in San Antonio.

· · ·

My name is Susan Kistler and I am AEA’s Executive Director. Although I normally contribute each Saturday’s aea365 post, I am very excited to hand off those duties this week to the Chair of the College Access Programs TIG. One of AEA’s newest TIGs, CAP is building a strong group and program for the conference and they are going to be sponsoring the coming week on aea365. All week long you’ll see great contributions from our CAP colleagues!

My name is Rita O’Sullivan. I teach at the University of North Carolina at Chapel Hill and am the Executive Director of EvAP (Evaluation, Assessment, & Policy Connections). Within AEA, I also serve as the Chair of the College Access Programs TIG.

Evaluating college access programs can be challenging: a) Program participation can differ greatly among students in the same program; b) Measuring the ultimate desired program outcome (i.e., advancement to college) can be difficult, as data become harder to gather after students leave high school.

Lessons Learned – Tracking Program Participation: Often there are many different opportunities for students to participate in a college access program. For example, they may enroll in tutoring programs and/or take part in organized college visits. They may be part of an intensive Freshman Academy program that meets daily for a year or only attend a one-hour career counseling session. Evaluators need to remember that outcomes are usually proportional to program participation, so they need to estimate participation levels. They also need to maintain a balance in terms of the evaluation resources that will be used to gather data about program participation. One college access program had its program coordinators spending 20% of their time (one day per week) collecting and entering program participation data. Even with all this effort, participant turnover made it impossible to draw any conclusions about the relationship of program participation to program outcomes. On the other hand, asking students only annually about program participation can result in serious underestimates. A possible compromise is to ask students quarterly to identify the program-related activities in which they have participated.
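
For evaluators who keep participation data in a spreadsheet or database, the sketch below shows one way quarterly self-reports might be rolled up and compared against an outcome by participation level. It is only an illustration in Python; the file names, column names, and grouping choices are hypothetical, not drawn from the programs described here.

```python
# Illustrative sketch only: file and column names are hypothetical.
import pandas as pd

# Quarterly self-reports: one row per student per quarter, with the number of
# program activities (tutoring sessions, college visits, etc.) reported.
reports = pd.read_csv("quarterly_participation.csv")   # student_id, quarter, activities

# Outcomes: one row per student with a 0/1 college-enrollment flag.
outcomes = pd.read_csv("outcomes.csv")                 # student_id, enrolled

# Total reported participation per student across quarters.
totals = reports.groupby("student_id", as_index=False)["activities"].sum()

# Bin students into low / medium / high participation groups.
totals["level"] = pd.qcut(totals["activities"], q=3, labels=["low", "medium", "high"])

# A rough dose-response check: enrollment rate by participation level.
merged = totals.merge(outcomes, on="student_id", how="inner")
print(merged.groupby("level", observed=True)["enrolled"].mean())
```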

Lessons Learned – Tracking Students’ Entrance into and Persistence in College: Most college access programs operate through local public school districts in middle schools and/or high schools. Gathering data from high school seniors just before graduation about their “intentions” for the upcoming fall is common practice in many places. Unfortunately, this practice usually overestimates the desired outcomes, yet a more accurate alternative can be much more difficult to pursue. The National Student Clearinghouse keeps such data but charges for its services. Beyond that, there is growing evidence that getting to college is just a first, albeit important, step toward finishing college. It is incumbent on the evaluator to understand the college-going patterns within a given state context so that a reasonable estimate of college-going can be made. In some states, the vast majority of students attend public colleges and universities, so forging partnerships with these colleges can yield extremely useful estimates by which to measure program outcome accomplishments and even persistence rates. Where this isn’t the case, other strategies, such as the National Student Clearinghouse, need to be explored.
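
Where enrollment records can be obtained from partner public colleges or purchased from the National Student Clearinghouse, a rough college-going and persistence estimate might be computed along the lines sketched below. This is only an illustration in Python; the file layouts, column names, and term labels are hypothetical.

```python
# Illustrative sketch only: file layouts, column names, and term labels are
# hypothetical. It matches a program roster of graduating seniors against
# enrollment records to estimate college-going and one-year persistence.
import pandas as pd

roster = pd.read_csv("program_seniors.csv")      # columns: student_id, hs_grad_year
enroll = pd.read_csv("enrollment_records.csv")   # columns: student_id, term

# Students with an enrollment record in the fall right after graduation.
first_fall = set(enroll.loc[enroll["term"] == "first_fall", "student_id"])
roster["enrolled_first_fall"] = roster["student_id"].isin(first_fall)

college_going_rate = roster["enrolled_first_fall"].mean()
print(f"Estimated college-going rate: {college_going_rate:.1%}")

# Of those who enrolled, how many also appear in the following fall's records?
second_fall = set(enroll.loc[enroll["term"] == "second_fall", "student_id"])
enrolled = roster[roster["enrolled_first_fall"]]
persistence_rate = enrolled["student_id"].isin(second_fall).mean()
print(f"Estimated one-year persistence rate: {persistence_rate:.1%}")
```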

· · ·
