Oscar Chavez on Lessons Learned in Evaluating High School Mathematics Curricula

I am Oscar Chavez, assistant professor at the University of Missouri and co-PI of the COSMIC project. We are examining secondary students’ mathematics learning from textbooks that embody two distinct approaches to content organization: an integrated approach (e.g., Core-Plus Mathematics) and a subject-specific approach (an Algebra I, Geometry, Algebra II sequence). The study was conducted in schools that used both approaches, but with different groups of students.

The challenge for curriculum comparison research is to determine to what extent the different curriculum materials are responsible for differences in student achievement. A causal relationship can be inferred only when certain conditions are documented: that the curriculum materials under study are the primary resource used by the teacher, and that the assessment instruments measure student outcomes that can be related to the content of the textbooks.

How are the programs implemented? We must gather data on how frequently the textbooks are used and how much of the content taught in the classroom is based on the textbook. Furthermore, textbooks are written under certain assumptions about how the content should be taught.

Hot Tip: Evaluators need to make these assumptions explicit and document the extent to which teachers’ instructional strategies were based on them.

How should we measure what students learned from each program? One could use standardized tests, but results from these tests may give an incomplete picture of student learning. Moreover, multiple-choice items provide insufficient evidence of students’ capacity for mathematical problem solving and reasoning, and they may not address the topics that would allow researchers to compare curriculum programs.

Hot Tip: Identifying the topics common to the programs is crucial for comparing how each one helps students learn the same concept or skill. This analysis should guide the development of appropriate assessment instruments.

Hot Tip: Scrutinize the curriculum materials. We interviewed textbook authors to learn what they expected a lesson based on their textbooks to look like. We examined teacher guides to understand how teachers were meant to use the different sections and components of each lesson. Based on our findings, we developed instruments to document textbook use and content taught: teacher diaries and classroom observation protocols.

Hot Tip: Conduct content analyses of the textbooks to determine the common topics and how each program addresses them. Based on our results, we developed tests consisting mostly of open-response items. Asking students to explain their reasoning on these items allowed us to examine student thinking more deeply. Our scoring rubrics took into account all possible correct answers to each problem, so that we could interpret student work and explanations accurately and ensure that our tests were sensitive to different solution strategies.

To learn more about COSMIC, visit our website.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
