My name is Randahl Kirkendall. I work part-time as an Evaluator with Ellen Iverson, Director of Evaluation, for the Science Education Resource Center (SERC) at Carleton College, which works to improve education through projects that support educators. Our work is funded primarily through NSF grants. SERC has expertise in geoscience education, workshop leadership, website development, and program and website evaluation.
A primary aim of SERC is to help faculty adopt evidence-based teaching behaviors that will enhance student learning. In evaluating the websites at SERC, our interest is in the role of website use in faculty professional development. We use a variety of web analytic tools such as Google Analytics, server-based website statistics, and web page visit logs in combination with data from surveys, interviews, focus groups, and observations to get as complete a picture as possible of how faculty use websites and the impact that use has on teaching behavior.
Lesson Learned: One of the things we have learned from user interviews is that people generally have poor recall of how they found a website and used it. While they can explain why they go to a website (motivation), they have difficulty recalling where on the website they started, which pages they viewed, and the search strategy they used. Website use analytics and web server logs of individual visits provide a richer picture of user behavior and interests because they record the actual pages visited.
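To illustrate, here is a minimal sketch of reconstructing a single visitor's page path from a web server access log in Common Log Format. The log file name, the visitor IP, and the asset-filtering rule are hypothetical placeholders, not SERC's actual tooling.

```typescript
// A minimal sketch (not SERC's actual tooling): reconstruct the ordered list of
// pages one visitor viewed from a web server access log in Common Log Format.
// The log path and the visitor IP below are hypothetical placeholders.
import { readFileSync } from "fs";

const LOG_PATH = "access.log";      // hypothetical log file
const VISITOR_IP = "192.0.2.15";    // hypothetical visitor identifier

// Common Log Format: host ident authuser [date] "method path protocol" status bytes
const LINE_RE = /^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3})/;

const visitPath: { time: string; page: string }[] = [];

for (const line of readFileSync(LOG_PATH, "utf8").split("\n")) {
  const match = LINE_RE.exec(line);
  if (!match) continue;
  const [, host, time, method, page, status] = match;
  // Keep successful GET requests from this visitor and skip static assets,
  // so what remains approximates the pages the person actually read.
  if (host === VISITOR_IP && method === "GET" && status === "200" &&
      !/\.(css|js|png|jpg|gif|ico)(\?|$)/.test(page)) {
    visitPath.push({ time, page });
  }
}

console.log(visitPath.map(v => `${v.time}  ${v.page}`).join("\n"));
```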
Lesson Learned: The SERC websites often don’t work in isolation. Our survey of 2,000+ faculty found that a significant number of users were using the websites to complement other professional development activities such as attending workshops, exchanging ideas with colleagues, or reviewing literature. Thus, it has been prudent to collect data on these other possible influences on teaching behavior as well.
Cool Trick: We sequence or build evaluations incrementally, partially basing data collection and/or analyses on findings from other data collection methods. For example, we use the findings from user interviews to describe predominant motivations for using a website and any changes in behavior (such as changes in teaching practice) that users attribute, at least partially, to website use. Those descriptions then guide how we use website analytic data to map particular patterns of use and to identify web use logs that can provide insight into how users navigate the website.
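As one illustration of that sequencing, here is a minimal sketch under an assumption: interview coding has produced a short list of pages that users connect to changes in their teaching practice, and sessions have already been reconstructed from server logs. The page URLs and session data below are hypothetical examples, not SERC data.

```typescript
// A minimal sketch: flag log sessions that touched pages identified as important
// in user interviews, so their navigation paths can be examined more closely.
// The page URLs and session data below are hypothetical examples.
type Session = { visitor: string; pages: string[] };

const pagesOfInterest = new Set([
  "/teaching_methods/jigsaws/index.html",
  "/assessment/classroom_observation.html",
]);

function sessionsTouching(sessions: Session[], targets: Set<string>): Session[] {
  return sessions.filter(s => s.pages.some(p => targets.has(p)));
}

// Toy usage: only visitor "a" viewed a page of interest.
const sessions: Session[] = [
  { visitor: "a", pages: ["/index.html", "/teaching_methods/jigsaws/index.html"] },
  { visitor: "b", pages: ["/about.html"] },
];
console.log(sessionsTouching(sessions, pagesOfInterest).map(s => s.visitor)); // ["a"]
```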
Cool Trick: We use pop-up surveys to identify users we might not otherwise reach. The pop-up asks for an email address that we can use to follow up with them for future surveys and interviews.
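A pop-up of this kind can be quite simple. The sketch below shows one way it might look; the element IDs, wording, and the /survey-optin endpoint are hypothetical, not SERC's actual implementation.

```typescript
// A minimal sketch of a pop-up that invites a visitor to leave an email address
// for follow-up surveys or interviews. The element IDs, wording, and the
// /survey-optin endpoint are hypothetical placeholders.
function showFollowUpPopup(): void {
  if (localStorage.getItem("surveyOptInShown")) return; // ask at most once per browser

  const box = document.createElement("div");
  box.innerHTML = `
    <p>May we contact you about how you use this site?</p>
    <input id="optin-email" type="email" placeholder="you@example.edu">
    <button id="optin-submit">Yes, contact me</button>
    <button id="optin-dismiss">No thanks</button>
  `;
  document.body.appendChild(box);

  box.querySelector("#optin-submit")!.addEventListener("click", () => {
    const email = (box.querySelector("#optin-email") as HTMLInputElement).value;
    // Send the address and the page it was offered on to a collection endpoint.
    fetch("/survey-optin", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ email, page: location.pathname }),
    });
    box.remove();
  });
  box.querySelector("#optin-dismiss")!.addEventListener("click", () => box.remove());

  localStorage.setItem("surveyOptInShown", "1");
}

showFollowUpPopup();
```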
Want to learn more about Randahl and Ellen’s work? Join over 2,500 colleagues at the AEA Annual Conference this November in San Antonio and check out their session in the conference program.
How respondents view and respond to web surveys is often contingent on their OS, screen resolution, and browser. We use para-data/web analytics to determine respondents’ device and network capabilities and to improve future survey experiences.
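As a rough illustration, para-data of this sort can be gathered in the browser at the moment a survey is submitted. The sketch below shows one way to do it; the /survey-paradata endpoint is a hypothetical placeholder, and the connection field relies on an API that not all browsers expose.

```typescript
// A minimal sketch of collecting para-data alongside a web survey response:
// browser and OS (via the user-agent string), screen and viewport size, language,
// and, where the browser exposes it, the effective connection type.
// The /survey-paradata endpoint is a hypothetical placeholder.
interface ParaData {
  userAgent: string;       // browser and OS information
  screenWidth: number;
  screenHeight: number;
  viewportWidth: number;
  viewportHeight: number;
  language: string;
  connectionType: string;  // e.g. "4g"; not available in all browsers
}

function collectParaData(): ParaData {
  return {
    userAgent: navigator.userAgent,
    screenWidth: screen.width,
    screenHeight: screen.height,
    viewportWidth: window.innerWidth,
    viewportHeight: window.innerHeight,
    language: navigator.language,
    connectionType: (navigator as any).connection?.effectiveType ?? "unknown",
  };
}

// Attach the para-data to the survey submission so future surveys can be
// tuned to the devices respondents actually use.
fetch("/survey-paradata", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(collectParaData()),
});
```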