Hello everyone! We are Indira Phukan and Rachel Tripathy, graduate students at the Stanford Graduate School of Education. This past summer, we worked with a local environmental education organization to pilot several tools for evaluating learning outcomes related to stewardship and environmental behavior change. When it comes to understanding students’ perceptions of and relationships with nature, traditional assessment strategies often fall short, so we turned to creative, alternative evaluation techniques to understand student learning. Alongside the pilot, we conducted interviews and observations to see how these tools might inform our research. One of the tools we explored this summer was an art-based embedded assessment.
In our pilot, five- to seven-year-old students participating in a weeklong camp at a coastal California campus were asked to make drawings before and after a scheduled activity where learning took place. The drawing prompts provided by their educators were broad enough to let students make choices about what they drew, but were also designed to direct their thinking toward the target activity. We collected the student art and analyzed it with a rubric that considered thematic, analytic, informational, and contextual details. It was incredible to see the kinds of observations six-year-olds were making! Their drawings captured learning details that a written assessment would not, and the children had fun in the process. Moving forward, we are excited to see how this tool works with other age groups, and how it might be adopted as an embedded assessment strategy by other organizations.
Hot Tip: Site observations and interviews with educators can help researchers and practitioners design embedded assessments that fit seamlessly into existing curriculum and programming. The educators will thank you, and your data will reflect a more representative student experience.
Lesson Learned 1: When analyzing subjective student work, like art, the rubric you use is exceedingly important. It should be well thought out and designed to tease out information that will answer your research questions. Expect an effective rubric to go through several iterations during the pilot phase before you settle on a final version.
Lesson Learned 2: Informative student art takes time. Initially, we planned to give students 5–10 minutes to produce a drawing, but we quickly learned that they needed at least 15 minutes to create something we could meaningfully analyze for insights about their learning.
Rad Resource 1: Rubric Library is a great online resource for browsing rubrics that others have used, and for finding inspiration for creating your own.
Rad Resource 2: Ami Flowers and her colleagues wrote a great article on using art elicitation for assessment; we drew a lot of inspiration from their findings.
AEA is celebrating GSNE Week with our colleagues in the Graduate Student and New Evaluators AEA Topical Interest Group. The contributions all this week to aea365 come from our GSNE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to firstname.lastname@example.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.