AEA365 | A Tip-a-Day by and for Evaluators


Hello, I am Edith Gozali-Lee, a research scientist at Wilder Research. I work primarily on research and evaluation projects related to education. I am currently working on a multi-site, longitudinal study of an early childhood initiative. The study includes three cohorts of children in school-based preschool programs at ten schools, five cohorts of children in community-based child care homes and centers, and comparison children with and without prior preschool experience. The study follows children from preschool through third grade. That’s a lot to track, which makes good data collection tracking critical from the start.

Hot Tips:

Here are a few coding tips that help ensure good data collection tracking (a brief coding sketch in Python follows the list):

  • Anticipate the different groups ahead of time and create intuitive codes for them; this makes data tracking and analysis easier in the following years
  • Use the categories and codes that schools already use, so that merging the data you collect with other student data collected by schools (demographics and student outcomes) is easier
  • Label all instruments (survey and assessment forms) with these codes before data collection to reduce post-collection coding work and data entry errors
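
To make these tips concrete, here is a minimal sketch (in Python, using pandas) of what intuitive, school-aligned coding might look like. Everything in it is assumed for illustration: the ID format, the column names, and the file name are hypothetical, not the actual scheme used in the study.

  # Minimal sketch of intuitive participant coding for a multi-site,
  # multi-cohort longitudinal study. All names and values are hypothetical.
  import pandas as pd

  def make_study_id(cohort: int, setting: str, site: str, seq: int) -> str:
      """Build a readable ID such as 'C2-SB-022-001':
      cohort 2, school-based setting, site 022, child 001."""
      return f"C{cohort}-{setting}-{site}-{seq:03d}"

  roster = pd.DataFrame({
      "cohort": [1, 1, 2],
      "setting": ["SB", "CB", "SB"],           # SB = school-based preschool, CB = community-based child care
      "site_code": ["015", "CC1", "022"],      # reuse the schools' own site codes where they exist
      "district_student_id": ["100234", None, "100871"],  # the ID schools use in their own records
      "seq": [1, 2, 1],
  })
  roster["study_id"] = [
      make_study_id(r.cohort, r.setting, r.site_code, r.seq)
      for r in roster.itertuples()
  ]

  # Pre-label instruments: print study_id on every survey and assessment form
  # before data collection so nothing needs to be re-coded afterward.
  # Because district_student_id follows the schools' own scheme, merging with
  # school-collected demographics and outcomes is a simple join, e.g.:
  # district = pd.read_csv("district_records.csv")   # hypothetical file
  # merged = roster.merge(district, on="district_student_id", how="left")
  print(roster)

The payoff is that the code itself tells you cohort, setting, and site at a glance, and the schools' own identifiers travel with your data from the first day of collection.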

Lesson Learned:

It is helpful to hold regular project debriefs to reflect on what works well and what does not. This will make the evaluation process go more smoothly and quickly the next time around.

Rad Resources:

For practical, research-based information, visit CYFERnet, the Children, Youth and Families Education and Research Network.

Resources for research in early childhood:

We are looking forward to seeing you in Minnesota at the AEA conference this October. Results of this study (along with other Wilder Research projects and studies) will be presented during a poster session: Academic Outcomes of Children Participating in Project Early Kindergarten Longitudinal Study.

The American Evaluation Association is celebrating with our colleagues from Wilder Research this week. Wilder is a leading research and evaluation firm based in St. Paul, MN, a twin city for AEA’s Annual Conference, Evaluation 2012. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

·

Hello from Ithaca, NY. I’m Rick Bonney, director of program development and evaluation at the Cornell Lab of Ornithology. I am also on the board of the Visitor Studies Association, and I’m thrilled by three projects that the organization is developing with a multitude of partners who all conduct research and evaluation in the field of informal learning. All three of these projects are funded by the Informal Science Education (ISE) program of the National Science Foundation (NSF).

Rad Resource: First, we’ve recently heard that we’re receiving another year of funding to continue our partnership with CAISE, the Center for Advancement of Informal Science Education (http://caise.insci.org/). Our expanding role in CAISE involves bridging the gap between visitor research and the practice of program development through workshops, reports, and online resources. For example, a recent article by Beverly Serrell, “Paying More Attention to Paying Attention,” provides an excellent overview of tracking and timing techniques (see http://caise.insci.org/resources/vsa-articles).

Rad Resource: Second, we’ve learned that we will be receiving funding for a project called “Building ISE through informalscience.org,” which will be conducted in partnership with the University of Pittsburgh Center for Learning in Out-of-School Environments (UPCLOSE) and the Science Museum of Minnesota. This ambitious project will facilitate the growth and use of informalscience.org by enhancing the site’s already useful databases and integrating them with a broader set of web-based resources. The project will also conduct a synthesis of evaluation reports covering all available data across all sectors of informal science education, and the synthesis will produce a framework for coding and organizing both current and future evaluation data. This effort will provide an opportunity for database mining for further research and program planning. In addition, the grant will allow us to create a new section of the VSA website that will help project developers locate evaluators to partner with in their work. Evaluators will be able to use the site to post profiles and examples of their work.

Rad Resource: Finally, VSA will be a major partner in a new project that has just been awarded to the Cornell Lab of Ornithology called DEVISE (Developing, Validating, and Implementing Situated Evaluation Instruments for Informal Science Education). Recently the ISE field has seen growing calls for new evaluation instruments, tools, and techniques that can be customized for use across ranges of similar projects. Such resources would bring common measures to evaluation practice, facilitate cross-project comparisons, and, most importantly, provide evaluation guidance to project developers who are inexperienced or lack major resources for evaluation. VSA will play several roles in this project, including hosting webinars to train people to use the new tools and techniques.

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. We are pleased to welcome colleagues from the Visitor Studies Association – many of whom are also members of AEA – as guest contributors this week. Look for contributions preceded by “VSA Week” here and on AEA’s weekly headlines and resources list.
