AEA365 | A Tip-a-Day by and for Evaluators


My name is David Brewer, Senior Extension Associate for the Employment and Disability Institute at Cornell University. Professional development isn’t always about listening to a presenter, watching a slide presentation, and asking a few questions. True learning with measurable student impact in mind involves relationship building.

Lesson Learned:

  • Creating our own Evidence. You don’t have to be a trained researcher to create evidence for use in improving services for students. The following example highlights how a diverse group of stakeholders can collect data, identify a common area of concern, set targets, plan activities, and reflect on results for future planning and improvement. The Southern Tier Transition Leadership Group (STTLG) was created in 2001 to improve the number and quality of youth referrals to the New York State Vocational Rehabilitation Agency (VR) for services leading to employment and postsecondary outcomes. This group, which continues to meet, is made up of VR senior staff, school district representatives, State Education Department officials, a Transition Specialist, and other agency representatives. Three times a year, the regional VR office provided youth referral data by school district to measure progress and plan activities. Annual referrals rose from 245 students in 2001-02, the year STTLG began meeting, to 593 in 2006-07, five years later.

  • Looking at these numbers as a trend line, the three-year average of student referrals to VR before the STTLG was 275 students. For the three years beginning in 2004-05, the average rose to 557 students, a 103% increase.
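The reported increase is easy to check with a few lines of arithmetic. A quick sketch in Python, using only the two three-year averages quoted above:

```python
# Three-year averages of annual student referrals to VR, as quoted above.
before_avg = 275  # three years before the STTLG
after_avg = 557   # three years beginning 2004-05

pct_increase = (after_avg - before_avg) / before_avg * 100
print(f"{pct_increase:.1f}% increase")  # ~102.5%, reported as roughly 103%
```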

Hot Tip:

  • Shared Learning. These sustainable results were achieved without grant funding, but with shared ownership of both process and results.  This is not about VR referrals. This is a description of a shared learning and improvement process between professionals, resulting in measurable change for transitioning students.
  • Applying Lessons Learned. The purpose behind this initiative was to improve organizational capacity through a process of emergent learning.  Given the complexities of improving student achievement and post-school outcomes, finding the right answers is a collaborative process of reviewing data, setting targets, implementing research-based practices, learning from results — and applying lessons learned to future interventions. The process of emergent learning is a journey that requires a long-term commitment to measurable change.

Rad Resources:

  • TransitionSource.org: designed to support educational programs and agencies in advancing the post-school outcomes of secondary students with disabilities.

New York State Program on Transition to Adulthood for Youth with Disabilities

The American Evaluation Association is celebrating Professional Development Community of Practice (PD CoP) Week. The contributions all week come from PD CoP members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · ·

My name is Guili Zhang. I am an Assistant Professor of Research and Evaluation Methodology at East Carolina University. During the last ten years, I have evaluated the National Science Foundation’s SUCCEED program and developed and analyzed the SUCCEED longitudinal database, which includes data from nine universities and spans 20 years. Our research team’s publications based on this database have received two Best Paper Awards from the American Society for Engineering Education and the Frontiers in Education conference. Today I’d like to share some information about longitudinal data management and analysis.

Lessons Learned: There are two very different organizations for longitudinal data—the “person-level” format and the “person-period” format. A person-level data set, also known as the multivariate format, has as many records as there are people in the sample. As additional waves of data are collected, the file gains new variables, not new cases. A person-period data set, also known as the univariate format, has multiple records for each person—one for each person-period combination. As additional waves of data are collected, the file gains new records, but not new variables.
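The two organizations can be illustrated with a small sketch. This uses Python/pandas (rather than the SAS or Stata tools discussed in this post), and the two-person, three-wave data and variable names are invented for illustration:

```python
import pandas as pd

# Person-level ("multivariate") format: one record per person;
# each new wave of data adds a VARIABLE (column), not a case.
person_level = pd.DataFrame({
    "id":      [1, 2],
    "score_1": [10, 12],   # wave 1
    "score_2": [14, 13],   # wave 2
    "score_3": [18, 15],   # wave 3
})

# Person-period ("univariate") format: one record per person-period
# combination; each new wave adds RECORDS (rows), not variables.
person_period = pd.DataFrame({
    "id":    [1, 1, 1, 2, 2, 2],
    "wave":  [1, 2, 3, 1, 2, 3],
    "score": [10, 14, 18, 12, 13, 15],
})

print(person_level.shape)   # 2 rows (people), 4 columns
print(person_period.shape)  # 6 rows (person-waves), 3 columns
```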

Besides the derived-variable approach to longitudinal data analysis, which reduces the repeated measurements to a summary variable, there are two classical approaches: ANOVA and MANOVA. Both represent well-understood methodology, and software for them is widely available. Unfortunately, both have limited use in longitudinal data analysis due to their restrictive and often unrealistic assumptions and the effect of missing data on the statistical properties of their estimates. Currently, there are several alternative approaches that overcome the limitations of the traditional approaches, variously known as: the mixed-effects regression model, the covariance pattern model, the generalized estimating equations model, the individual growth model, the multilevel model, the hierarchical linear model, the random regression model, survival analysis, event history analysis, failure time analysis, and the hazard model.
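As one concrete example of these alternatives, an individual growth model (a mixed-effects/multilevel model) can be fit directly to person-period data. This is a hedged sketch in Python using statsmodels, with simulated data rather than anything from the SUCCEED database:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Simulated person-period data: 30 people, 4 waves, true growth rate of 2
# per wave, plus a random intercept per person and residual noise.
n, waves = 30, 4
ids = np.repeat(np.arange(n), waves)
wave = np.tile(np.arange(waves), n)
score = 10 + 2 * wave + rng.normal(0, 1, n)[ids] + rng.normal(0, 0.5, n * waves)
df = pd.DataFrame({"id": ids, "wave": wave, "score": score})

# Individual growth model: fixed effect for wave (average growth rate),
# random intercept for each person.
fit = smf.mixedlm("score ~ wave", df, groups=df["id"]).fit()
print(fit.params["wave"])  # recovers a slope close to the simulated 2
```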

Hot Tip #1 – The person-period format most naturally supports meaningful analysis of change over time.

Hot Tip #2 – Most statistical software packages can convert a longitudinal data set from one format to the other. For example, in SAS, Singer (1998, 2001) provides simple code for the conversion; in Stata, the “reshape” command can be used.
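For readers working in Python/pandas (not mentioned in the tip above, so offered as an additional sketch with invented variable names), the same round trip looks like:

```python
import pandas as pd

# Hypothetical person-level (wide) data: one row per person.
wide = pd.DataFrame({
    "id":      [1, 2],
    "score_1": [10, 12],
    "score_2": [14, 13],
})

# Wide -> long (person-period), analogous to Stata's `reshape long`.
long = pd.wide_to_long(wide, stubnames="score", i="id", j="wave", sep="_")
long = long.reset_index()

# Long -> wide again, analogous to Stata's `reshape wide`.
wide_again = long.pivot(index="id", columns="wave", values="score")
```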

Rad Resources:

Two introductory books that I have found useful are:


· · · ·

My name is Brad Coverdale, and I am a doctoral student at the University of Maryland, College Park. I am interested in researching postsecondary access and success initiatives for students, particularly first-generation and/or low-income students. One initiative that is especially dear to me is Upward Bound. As such, I conducted a program evaluation for my Master’s thesis using data from the National Education Longitudinal Study of 1988-2000 (NELS 88:2000).

Rad Resource: Because NELS 88:2000 is a longitudinal study, it met my data needs perfectly. The survey started with a cohort of 8th graders in 1988 and tracked their academic pursuits through 2000. Because students were asked many questions, including whether they participated in pre-college programs like Gear Up and Upward Bound, I was able to create treatment and comparison groups by matching on similar characteristics through propensity score matching. The dataset has also been useful for analyzing psychological responses and educational objectives, finding the strongest predictors for particular subjects, and other research questions. Best of all, the dataset is FREE to use. All you have to do is send an email to Peggy Quinn, the Publication Disseminator (peggy.quinn@ed.gov), with your request for an unrestricted copy of the data and the electronic codebook. NCES is in the process of putting together an online application for analysis, but for now, if you are familiar with the Data Analysis System, a product developed for NELS analysis, you can use it by going to http://nces.ed.gov/dasol/ and selecting the NELS 88/2000 data.
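The matching step described above can be sketched in a few lines. This is a simplified illustration in Python with scikit-learn; the covariates, the simulated sample, and the greedy one-nearest-neighbor matching rule are all invented for demonstration and are not the thesis's actual specification:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical covariates and a treatment flag (e.g., program
# participation); names and values are invented for illustration.
n = 200
df = pd.DataFrame({
    "gpa":    rng.normal(2.8, 0.5, n),
    "income": rng.normal(40, 10, n),
})
df["treated"] = (rng.random(n) < 0.3).astype(int)

# Step 1: estimate propensity scores, P(treated | covariates).
model = LogisticRegression().fit(df[["gpa", "income"]], df["treated"])
df["pscore"] = model.predict_proba(df[["gpa", "income"]])[:, 1]

# Step 2: greedy nearest-neighbor matching on the propensity score
# (with replacement, for simplicity).
treated = df[df["treated"] == 1]
control = df[df["treated"] == 0]
matches = {}
for idx, row in treated.iterrows():
    matches[idx] = (control["pscore"] - row["pscore"]).abs().idxmin()
```

Each treated case is paired with the control case whose estimated probability of treatment is closest, giving the treatment and comparison groups used in the evaluation.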

Hot Tip: Remember to use the panel weights if you are tracking students over time, or the cross-sectional weights if you are only interested in a particular wave (1988, 1990, 1992, or 2000). Also, be aware of which students are included in or excluded from your analysis. Data from students who dropped out of school or were removed from the study are not included in the overall results; you may want to consider appending them to your data source explicitly.
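Applying a weight is straightforward once the right one is chosen. A minimal sketch in Python/NumPy, with invented outcomes and weight values (not actual NELS weights):

```python
import numpy as np

# Hypothetical respondent outcomes and survey weights. Use the panel
# weight when following students across waves, or the cross-sectional
# weight when analyzing a single wave.
enrolled = np.array([1, 0, 1, 1])          # e.g., enrolled in college by 2000
weight   = np.array([120.5, 80.2, 95.0, 150.3])

weighted_rate = np.average(enrolled, weights=weight)
unweighted_rate = enrolled.mean()
# The two rates differ because respondents represent different numbers
# of students in the population.
```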

Want to learn more from Brad? He’ll be presenting as part of the Evaluation 2010 Conference Program, November 10-13 in San Antonio, Texas.

· · · ·
