AEA365 | A Tip-a-Day by and for Evaluators


Hello! We are Debi Lang and Judy Savageau from the Center for Health Policy and Research at UMass Medical School. Earlier this year, Debi published a post on how program-specific learning objectives can help measure student learning to demonstrate program impact. Today’s post shows how to measure whether training or professional development programs are meeting learning objectives using a retrospective pre-post methodology.

Start at the End!

A traditional pre-and-then-post approach to measuring student learning can suffer when students over- or underestimate their knowledge/ability on the pre-test, because we often “don’t know what we don’t know.” As a result, the difference between pre- and post-program data may inaccurately reflect the true impact of the program.

Instead of collecting data at the beginning and end of the program, the retrospective pre-post approach measures students’ learning only at the end by asking them to self-assess what they know from two viewpoints – BEFORE and AFTER participating. The responses can be compared to show changes in knowledge/skills.

Below is an example of the retrospective pre-post design excerpted from the evaluation of a class on American Sign Language (ASL) interpreting in health care settings. Students are self-assessing their knowledge based on statements reflecting the learning objectives.

Hot Tips:

Here are some recommendations for designing a retrospective pre-post survey (as well as other training evaluation surveys):

  • Write a brief statement at the top of the form explaining the purpose of the evaluation, along with general instructions on when, how, and to whom to return completed forms, a confidentiality statement, and how responses will be used.
  • Include space at the end to ask for comments on what worked and suggestions for improvement.
  • Since many learners may not be familiar with the retrospective approach, use plain language so instructions are easily understood. This can be especially important for youth programs and when written or verbal instruction is not given in a student’s native language.

And Now for the Statistics…

Generally, a simple paired t-test is used to compare mean pre and post scores. However, when the sample is too small to assume the score differences are normally distributed, the non-parametric equivalent of the paired t-test is typically computed instead. To analyze the data from the ASL class, with a sample size of 12, we used the Wilcoxon signed-rank test. Below are the average class scores for the three measures.
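
To make this concrete, here is a minimal sketch of the test in Python using SciPy's wilcoxon function. The 5-point self-ratings below are hypothetical illustrations for a class of 12, not the ASL class's actual data:

    # Minimal sketch: Wilcoxon signed-rank test on retrospective
    # pre-post self-ratings (hypothetical data, not the actual class scores)
    from scipy.stats import wilcoxon

    # "BEFORE" and "AFTER" self-ratings on one learning objective (1 = low, 5 = high)
    before = [2, 1, 3, 2, 2, 1, 3, 2, 1, 2, 3, 2]
    after = [4, 3, 5, 4, 3, 3, 4, 4, 3, 4, 5, 4]

    stat, p = wilcoxon(before, after)
    print(f"Wilcoxon statistic = {stat}, p-value = {p:.4f}")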

Lessons Learned:

Using a retrospective pre-post design allows for analysis of anonymous paired data, whereas the traditional pre-post approach requires linking the paired data to each student, which may compromise anonymity.

If follow-up data is collected (e.g., 6 months post-training) to measure sustainability of knowledge, additional analytic testing would require a plan to merge the two data files by some type of ID number.
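
As a simple illustration of such a merge, here is a minimal sketch in Python with pandas. The ID values and column names are hypothetical, invented for the example:

    import pandas as pd

    # Hypothetical paired records keyed by an ID number (invented for illustration)
    post = pd.DataFrame({"student_id": [101, 102, 103],
                         "rating_post": [4, 3, 5]})
    followup = pd.DataFrame({"student_id": [101, 103],
                             "rating_6mo": [4, 5]})

    # An inner join on the ID keeps only students present in both waves,
    # so the paired records stay matched for follow-up analysis
    merged = post.merge(followup, on="student_id")
    print(merged)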

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi, we are Tim Sheldon and Jane Fields, Research Associates at the Center for Applied Research and Educational Improvement (CAREI) at the University of Minnesota. We serve as external evaluators for EngrTEAMS, a five-year, $8 million project funded by the National Science Foundation. The project is a partnership involving the University of Minnesota’s Science, Technology, Engineering, and Mathematics Education Center (the STEM Center) and Center for Compact and Efficient Fluid Power (CCEFP); Purdue University’s Institute for P-12 Engineering Research and Learning (INSPIRE!); and several school districts. EngrTEAMS is designed to increase students’ learning of science content, as well as mathematical concepts related to data analysis and measurement, by using an engineering design-based approach to teacher professional development and curriculum development.

Context:

As the external evaluators for this project, we based our evaluation framework on Guskey’s five levels of professional development (PD) evaluation (Guskey, 2002). He suggests evaluating (1) participant perceptions of the PD; (2) the knowledge and skills gained by participants; (3) the support from, and impact on, the organization; (4) participants’ use of their new knowledge and skills; and (5) the impact on student outcomes. In Guskey’s model, the aspects to be evaluated begin after delivery of the PD; that is, the framework does not specifically suggest assessing differences in participants or organizations prior to the delivery of the PD.

In the case of EngrTEAMS and other PD we have evaluated, we have noticed that even though participants receive the same training (i.e., the same “treatment”), their capacity to apply the new knowledge and skills (Guskey level 4) is not the same. What might explain this? We suggest that one way to better understand and explain these differences in implementation (and eventually student outcomes) is to also better understand participants and their organizations prior to the PD. Not all participants start the PD in the same place; for example, participants come to the PD with different levels of prior knowledge, different attitudes about the PD, different classroom management abilities, and different levels of organizational support.

Lesson learned:

When possible, assess implementation readiness of participants and their organizations prior to the delivery of the PD. This may include obtaining information about organizational readiness to support novel approaches, as well as participants’ prior content knowledge and classroom experience, their perception of school or district buy-in, and participants’ attitudes about the training and future adoption of what they will be learning.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Hi! This is Sheena Horton again, President-Elect and Board Member for the Southeast Evaluation Association (SEA). I wanted to close out SEA’s AEA365 week by providing you with a few tips for stimulating and maintaining your professional growth. Just as evaluation is an everyday activity, our own professional growth should be approached as an everyday opportunity. Isaac Asimov once said, “People think of education as something they can finish.” Learning is a lifelong commitment.

The extent to which we seek growth opportunities should not be limited by our current positions, schedules, finances, networks, or fears and hesitations, but should instead be defined by the depth of our intellectual curiosity, aspirations, and commitment to evaluating and bettering ourselves.

Hot Tips:

  • Search YouTube regularly for quick tips or full lessons to develop your knowledge or skills in a specific area, such as Excel. There are also many free virtual courses and trainings offered at Coursera, edX, MIT OpenCourseWare, FindLectures, and Udemy.
  • Follow the professional development strategy that George Grob suggested at a past AEA Conference: Every year, pick one hard skill and one soft skill to develop over the course of the year.
  • Choose a few bloggers to follow to pick up daily tips and stay up to date on the latest evaluation news. Take it a step further and volunteer to write for a blog or newsletter! AEA365 blog posts are short and allow you to perform a high-level review of a topic of interest or share your experiences and tips with others. SEA’s seasonal newsletter accepts a variety of submissions on evaluation and professional development topics, and article length can vary from a sidebar to a feature article.
  • Volunteer for AEA and SEA short- or long-term projects, or sign up for programs, conferences, and workshops. AEA’s next conference is scheduled for November 6th-11th, 2017 in Washington, DC. SEA will hold its 2-day Annual Workshop on February 27th-28th, 2017 in Tallahassee, FL, and, in addition to its normal programming, will offer a secondary track featuring Essential Skills training sessions such as “Evaluation Planning and Design,” “Relating Costs and Results,” and “Effective Presentations.”

The American Evaluation Association is celebrating Southeast Evaluation Association (SEA) Affiliate Week with our colleagues in the SEA Affiliate. The contributions all this week to aea365 come from SEA Affiliate members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi there, Liz Zadnik here, new(ish) member of the aea365 curating team and sometimes Saturday poster. Last year Sheila posed the question: What is it that YOU would like to read about on this blog?

One of the responses resonated with me, as it represented my relationship with evaluation as a professional:

I would love to see a post, or series of posts about evaluation from the perspective of practitioners for whom their primary job is not evaluation. Perhaps tips on how to best integrate evaluation into the myriad of other, seemingly more pressing, tasks without pushing it to the back burner.

I work in the anti-sexual violence movement at a state coalition, focusing on prevention strategies, training, and making community-based rape crisis centers accessible to people with disabilities. These three areas are my priorities – there are deliverables and activities that don’t always include evaluation and assessment. Many times – given my love of evaluation – I am the sole voice at the table asking about an evaluation plan. Most of the time we can weave evaluation in from the ground floor; other times it happens a little late(r).

Hot Tip: Ask this (or a similar) question: “How will we know we’ve been successful?” This is the most effective way I have found to help get people thinking about evaluation. It has started some of the most engaging and enlightening conversations I’ve ever had, both about a project and the work of the movement.

Lesson Learned: Sometimes, evaluation takes a backseat to program implementation and grant deliverables. This can be disappointing (to say the least), but I do see a change. Funders are more frequently asking for research, “evidence,” or assessment findings, giving evaluation enthusiasts (like myself) opportunities to engage our colleagues in this work.

Lesson Learned: Practice and challenge yourself, even if no one is ever going to see it. One of the ways I “integrate evaluation into the myriad of other, seemingly more pressing, tasks” is evaluating myself and my own performance. I regularly incorporate evaluative questions into training feedback forms, look for ways to assess the effectiveness of my technical assistance provision, and record my professional progress throughout the year. I sit in on as many AEA Coffee Break webinars and other learning opportunities as I can, always practicing the skills discussed and looking for ways to apply them to my work.

I would so appreciate hearing from other practitioners (and evaluators!) about their experiences infusing evaluation into their work. I’d also be happy to answer any questions you might have or write about specific projects in the future. Let me know – the aea365 team is here to please!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Leah Goldstein Moses, founder and CEO of the Improve Group. As a lifelong learner and cheerleader for evaluation, I was completely geeked out when I learned that the 2012 AEA conference was going to be right in my city. And as the leader of a growing team of evaluators, I knew I wanted to get the most out of the conference for myself and the whole company. Some of the things we did might also benefit you if you are planning to attend the 2013 conference in Washington DC with your colleagues.

Hot tip #1: Set team or personal goals early. The Improve Group’s goals were varied: we had existing relationships we wanted to nurture; we had specific topics we wanted to explore more deeply by attending sessions; and, when giving presentations, we wanted to try new formats (such as Ignite sessions) and get new ideas for our work.

Hot tip #2: Check in regularly and support communication with technology. In the weeks preceding and during the conference, we talked as a team about our presentation preparations, sessions we planned to attend, and social events we were planning. We also planned ahead for how we would use Twitter and other social media during the conference. That meant that during the conference we had easy access to information and to each other as needed.

Hot tip #3: Capture lessons quickly and continue to nurture ideas. In the week after the conference, everyone in our company captured a few key lessons and related resources on a simple spreadsheet. In November, as we were making our learning and communications plans for 2013, we went back to the spreadsheet to see what we wanted to dig into more deeply and what resources we should access during the year. Finally, staff contributed to our blog throughout the year, describing interesting things they learned and how they used that information in our evaluation work.



Our enthusiasm and planning led to some tangible benefits from the conference. In the months since, we’ve been able to pull out ideas for new projects and share lessons with multiple organizations and colleagues.

The American Evaluation Association is celebrating Minnesota Evaluation Association (MNEA) Affiliate Week with our colleagues in the MNEA AEA Affiliate. The contributions all this week to aea365 come from our MNEA members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

I’m Chad Green, Program Analyst at Loudoun County Public Schools in Ashburn, VA. For over seven years I’ve served as an internal evaluator of instructional initiatives sponsored by central office administrators.

Do you have an interest in understanding school-based professional development from a sociocultural learning perspective? Read on! Years ago I evaluated two school-wide improvement initiatives using an integrated conceptual framework. The first component was Learning Forward’s original context standards, which today serve as its first three standards for professional learning. The purpose of this component was to constrain the data to essential long-term staff development outcomes. The second component (Honig, 2008) operationalized the first into six overlapping sociocultural learning practices, two for each context standard (see below).

(Source: http://learningforward.org/standards/standards-list)

Framework for High-Quality, School-Based Professional Development

I.   Skillful leadership is evidenced when school and central office staff:

  1. Model high quality teaching and learning practices
  2. Boundary span to connect staff with new sources of expertise

II.  Professional learning communities are evidenced when school and central office staff:

  1. Interact at a high level of collaborative inquiry
  2. Engage in joint work on authentic tasks that are meaningful and sustained over time

III. Dedicated resources are evidenced when school and central office staff:

  1. Provide access to ongoing, job-embedded learning opportunities that increase the level of participation in shared work practices (i.e., from novice to expert)
  2. Develop common conceptual and practical tools (e.g., principles, frameworks, routines, language, protocols, templates, materials)

Lesson Learned:  The patterns that emerged from the data were surprising on two levels.  At a superficial level they revealed a continuum of leadership approaches to program implementation ranging from a top-down, hierarchical structure on one end to a more subtle, heterarchical structure on the other. Coincidentally, these leadership structures aligned with the level of diversity (i.e., complexity) of the school’s student populations.  At a deeper level, the findings suggested a connection between each school’s sources of power and knowledge (i.e., truth).  In the top-down structure, tacit knowledge was concentrated in the principal and specialist roles (i.e., authority) whereas in the heterarchical setting knowledge was more explicit in the form of online repositories of co-created tools and resources.

Hot Tip:  Since then, I have learned that I am much more effective when I help central office administrators integrate their prepackaged conceptual frameworks (i.e., programs) into coherent strategic thinking portfolios which facilitate increased experimentation and interconnectedness system-wide.

Rad Resource: Check out Honig’s journal article on district central offices as learning organizations.

Final Word: Both schools’ staff development programs were equally effective in the short run with respect to implementation and outcomes.  Which school structure do you think will be more sustainable in the long run?

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


I’m Cheryl Poth and I am an assistant professor at the Centre for Applied Studies in Measurement and Evaluation in the Department of Educational Psychology, Faculty of Education at the University of Alberta in Edmonton, Canada. My research focuses on how developmental evaluators build evaluation capacity within their organizations. My use of mixed methods is pragmatically driven; that is, I use it when the research/evaluation question(s) require the integration of both qualitative themes and quantitative measures to generate a more comprehensive understanding. Most recently, my work within research teams has provided the impetus for research and writing about the role of a mixed methods practitioner within such teams.

Lessons Learned:

  • Develop and teach courses. In 2010, I developed (and offered) a doctoral mixed methods research (MMR) course in response to demand from graduate students for opportunities to gain MMR skills. The course was oversubscribed, and at the end of the term we formed a mixed methods reading group, which continues to provide support as students work through their research. I am fortunate to be able to offer this course again this winter, and already it is full!
  • Offer workshops. To help build MMR capacity, I have offered workshops in a variety of locations, most recently at the 9th Annual Qualitative Research Summer Intensive held in Chapel Hill, NC in late summer and at the 13th Thinking Qualitatively Workshop Series offered by the International Institute for Qualitative Methodology in Edmonton, AB in early summer. These workshops remind me that many researchers’ graduate programs required completion of an advanced research methods course that was either qualitatively or quantitatively focused, and of the need to build a community of MM researchers, a community that can exist locally or, using technology, globally! It has been a pleasure to watch new and experienced researchers begin to learn about MMR designs and integration procedures.
  • Join a community. I have begun to find my community of MM researchers through a group currently working on forming the International Association of Mixed Methods, through the International Mixed Methods conference, and among the mixed methods researchers on Methodspace.


Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is David Brewer, Senior Extension Associate for the Employment and Disability Institute at Cornell University. Professional development isn’t always about listening to a presenter, watching a slide presentation, and asking a few questions. True learning with measurable student impact in mind involves relationship building.

Lesson Learned:

  • Creating Our Own Evidence. You don’t have to be a trained researcher to create evidence for use in improving services for students. The following example highlights how a diverse group of stakeholders can collect data, identify a common area of concern, set targets, plan activities, and reflect on results for future planning and improvements. The Southern Tier Transition Leadership Group (STTLG) was created in 2001 to improve the number and quality of youth referrals to the New York State Vocational Rehabilitation Agency (VR) for services leading to employment and postsecondary outcomes. This group, which continues to meet, is made up of VR senior staff, school district representatives, State Education Department officials, a Transition Specialist, and other agency representatives. Three times a year, the regional VR office provided youth referral data by school district to measure progress and plan activities. Below are the annual increases before (245 students in 2001-02) and five years after STTLG began meeting (593 in 2006-07).

  • Looking at these numbers as a trend line, the three-year average of student referrals to VR before the STTLG was 275 students. For the three years beginning in 2004-05, the average increased to 557 students – a 103% increase (a quick check of this arithmetic follows below).
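
For readers who want to verify that figure, here is a quick check of the arithmetic using only the averages reported above:

    # Quick check of the reported percent increase in average annual referrals
    before_avg = 275  # three-year average before the STTLG
    after_avg = 557   # three-year average for the three years beginning 2004-05

    pct_increase = (after_avg - before_avg) / before_avg * 100
    print(f"{pct_increase:.0f}% increase")  # prints "103% increase"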

Hot Tip:

  • Shared Learning. These sustainable results were achieved without grant funding, but with shared ownership of both process and results. This is not about VR referrals. This is a description of a shared learning and improvement process among professionals, resulting in measurable change for transitioning students.
  • Applying Lessons Learned. The purpose behind this initiative was to improve organizational capacity through a process of emergent learning.  Given the complexities of improving student achievement and post-school outcomes, finding the right answers is a collaborative process of reviewing data, setting targets, implementing research-based practices, learning from results — and applying lessons learned to future interventions. The process of emergent learning is a journey that requires a long-term commitment to measurable change.

Rad Resources:

  • TransitionSource.org: Designed to support educational programs and agencies in advancing post-school outcomes of secondary students with disabilities.
  • New York State Program on Transition to Adulthood for Youth with Disabilities

The American Evaluation Association is celebrating Professional Development Community of Practice (PD CoP) Week. The contributions all week come from PD CoP members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi! We are Marguerite Dimgba and Sheila Robinson Kohn, Professional Learning Center Director and Policy Board Chair, from Greece Central School District in Rochester, NY.

While Guskey indicates that this level of evaluation (Level 4: participants’ use of new knowledge and skills) cannot be completed at the end of a professional development (PD) session or even a multi-session course, we do begin to gather data from participants at this time, mostly because this is when we have easy access to them. We use an online PD tool with a required anonymous feedback form that participants complete after they have finished their courses.

Lesson Learned:

  • Utilize Participant Feedback. We collaboratively analyze data with policy board members (teachers, administrators, a university professor, a parent, a student, and a private school teacher), understanding that the intended users of these data are individual instructors and the district as a whole. We then modify feedback forms (e.g., questions, response options) in hopes of collecting even better data to help us continue to make high-quality programmatic decisions regarding PD for our district.

Hot Tips:

  • Discover Participants’ Learning Use. In addition to questions on participant satisfaction and learning, one question we ask for Level 4 is, “Will you be able to USE what you’ve learned?” As you can see from the graph below, 97% of our PD participants over the last year indicated that they have already applied the content, that the content was relevant, or that they feel they will be able to apply some of the content to their practice.

It is important to note here that our online PD tool is not just for teachers. These respondents include any district employee who engages in any type of PD.

  •  Discover Participants’ Learning Needs. Our next question asks, “What do you NEED to further APPLY what you have learned today?” As you can see from the graph below, about one-quarter of our participants indicate they do not need further learning to apply the content, but about 70% feel they do need additional learning opportunities.

  •  Discover Participants’ Learning Application. We then ask, “MOVING FORWARD: HOW will you APPLY new learning in your practice?” The graph below indicates our participants’ varied answers to this.

  •  Discover Participants’ Learning Impact. Finally, we ask participants to indicate how they will measure the impact of what they have learned in the PD experience. Data sets like these serve as important formative evaluation measures for our continued district-wide PD programming.

Rad Resource:

  •  We use MyLearningPlan, a professional development management and evaluation system (PDMES) that features “planning, tracking, and evaluation of all forms of professional learning in one comprehensive online system.” MLP offers customizable evaluation forms and user-friendly output (such as these graphs) to help us analyze the effectiveness of our professional learning opportunities.

The American Evaluation Association is celebrating Professional Development Community of Practice (PD CoP) Week. The contributions all week come from PD CoP members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· ·

I’m Donna Campbell, Director of Professional Development Capacity Building at the Arizona Department of Education (ADE). The Professional Development Leadership Academy (PDLA) is a three-year curriculum of training and back-home application for school and district teams, based on the research-derived Learning Forward Professional Learning Standards.

Lesson Learned:

  • Legislation supports evaluation. I’ve learned it’s easier to train school teams to conduct Guskey Level 3 evaluations of organizational support than to scale this evaluation step to a state level.  The advent of the Common Core Standards (CCS) is raising awareness of the need for ADE to gather Level 3 data.  We are seizing this golden opportunity.
  • Understand significant shifts.  The CCS instructional shifts seem to be a catalyst for education leaders to challenge their assumption that if teachers just attend training sessions their instructional practice will change.
  • Building capacity is often top-down. An ADE cross-divisional team is designing processes to build school leaders’ capacity to provide organizational support to teachers, including opportunities for collaboration, time to practice new skills, follow-up, and feedback. Our challenge: apply lessons learned from PDLA to every school and district in Arizona.
  • Teams set the stage. Teams’ attention to strengthening cultures of collegial support sets the stage for monitoring transfer of knowledge to the classroom, Guskey’s Level 4. If complex and large-scale instructional change is to be implemented and sustained, organizational support is essential.  Level 3 has been the missing link in previous standards-based reform efforts.

Hot Tips:

  • Teams develop their capacity to design, implement, and evaluate results-driven professional development (PD) to improve student learning. After focusing the first year on data analyses, goal-setting, theories of action, and planning PD to achieve a well-defined instructional change, teams are introduced to Guskey’s five-level evaluation model in year two.
  • School teams tend to focus Level 3 data gathering on school-level data. For instance, we invite teams to annually administer two surveys: Learning Forward’s Standards Assessment Inventory (SAI) for teachers, and Education for the Future’s perception surveys for teachers, students, and parents. Teams analyze teacher survey data to assess perceived collegial and principal support over time. They also compare the amount of time designated at their school for professional learning from their start to their finish of PDLA. Some routinely review written records of various teams at their school, checking for shared focus and follow-through. Results show examples of Level 3 progress through markers of increased candor and openness among faculty members or increased teacher participation in the PDLA team work.

The American Evaluation Association is celebrating Professional Development Community of Practice (PD CoP) Week. The contributions all week come from PD CoP members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

