AEA365 | A Tip-a-Day by and for Evaluators

TAG | Visitor Studies

Hi! I’m Sarah Cohn, Association Manager for the Visitor Studies Association (VSA). VSA and I are a part of the Building Informal Science Education project (BISE). In this post, I’ll share information about VSA and how the Association supported the BISE project.

Lessons Learned: VSA served as a platform for the BISE project to engage with a community of evaluators and researchers throughout the field of informal learning, united around a set of common goals:

  • to ensure the project’s work was grounded in the experiences of these audiences,
  • to ensure that the resulting resources were useful and relevant to the wide range of evaluators that might use them.

At VSA annual meetings, the BISE project team sought input regarding:

  • the development of the BISE Coding Framework,
  • the direction of the project, and
  • the findings from the synthesis papers.

This sort of member-checking is important in qualitative research to ensure that the findings are truly reflective of the audience’s perspectives or information. Evaluators can be just as tricky and diverse in their ideas, needs, and opinions as any other audience! So what did we learn from this process?

  • When conducting research on evaluation, provide multiple venues and points at which evaluators can reflect on the study’s data, ideas, and findings. Find different check-in points over the course of a year or the life of the project, and offer different modes of engagement, whether digital, in-person, or asynchronous.
  • Be as specific as possible in your requests for feedback. We are reflective by nature, so your fellow evaluators will provide feedback on every aspect of a project if you let them!

Rad Resources: The Visitor Studies Association is a global network of professionals dedicated to understanding and enhancing learning experiences in informal settings wherever they may occur—in museums, zoos, parks, visitor centers, historic sites, and the natural world—through research, evaluation and dialogue. VSA’s membership and governance encompass those who design, develop, facilitate, and study learning experiences. We offer a number of resources for evaluators to learn more about evaluation in informal settings.

  • An annual summer conference that brings together over 200 professionals to talk about new advances in the field, current projects, and major issues they are facing.
  • The Visitor Studies journal, a twice-yearly, peer-reviewed journal that publishes high-quality articles focusing on research and evaluation in informal learning environments, reflections on the field, research methodologies, and theoretical perspectives. The journal covers subjects related to museums and learning in out-of-school settings.
  • Online webinars, produced in partnership with other museum-related associations, such as the Association for Science-Technology Centers.
  • Regional meet-ups, workshops, and an active listserv.

The American Evaluation Association is celebrating Building Informal Science Education (BISE) project week. The contributions all this week to aea365 come from members of the BISE project team. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

Hi! I’m Amy Grack Nelson, Evaluation & Research Manager at the Science Museum of Minnesota. I’m part of a really cool National Science Foundation-funded project called Building Informal Science Education, or as we like to refer to it – BISE. The BISE project is a collaboration between the University of Pittsburgh, the Science Museum of Minnesota, and the Visitor Studies Association. This week we’ll share what we learned from the project and what project resources are freely available for evaluators to use.

Within the field of evaluation, there are a limited number of places where evaluators can share their reports. One such resource is informalscience.org, which gives evaluators access to a rich collection of reports they can use to inform their practice and to learn about the wide variety of designs, methods, and measures used in evaluating informal education projects. The BISE project team spent five years diving deep into the 520 evaluation reports uploaded to informalscience.org through May 2013 to begin to understand what the field could learn from such a rich resource.

Rad Resources:

  • On the BISE project website, you’ll find lots of rad resources we developed. The BISE Coding Framework was created to code the reports in the BISE project database; its coding categories and related codes align with key features of evaluation reports and the coding needs of the BISE authors. You’ll also find the BISE NVivo Database and a related Excel file in which we coded all 520 reports against the framework, a tutorial on how to use the NVivo database, and a worksheet to help you think about how you might use the resource in your own practice. You can also download a zip file of all of the reports to easily have them at your fingertips.
  • This project wouldn’t be possible without the amazing resource informalscience.org. If you haven’t checked out this site before, you should! And if you conduct evaluations of informal learning experiences, consider sharing your report there.
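To see what slicing a coded-report database can look like, here is a minimal Python sketch using a hypothetical CSV export; the column names, codes, and values are illustrative, not the BISE framework’s actual categories:

```python
import csv
import io

# Hypothetical sample mimicking a CSV export of coded evaluation reports;
# the real BISE Excel file's columns and codes may differ.
sample_csv = """\
report_id,setting,evaluation_type,methods
101,science center,summative,"survey; observation"
102,zoo,formative,interview
103,museum,summative,"timing and tracking"
"""

def filter_reports(csv_text, column, value):
    """Return the rows whose coded column matches the given value."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row for row in reader if row[column] == value]

summative = filter_reports(sample_csv, "evaluation_type", "summative")
print([r["report_id"] for r in summative])  # ['101', '103']
```

The same filtering idea applies to the real file once it is exported to CSV (or read with a spreadsheet library).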

Lessons Learned:

  • So what did we learn through the BISE project? That you can learn A LOT from others’ evaluation reports. In the coming week you’ll hear from four authors who used the BISE database to answer a question they had about evaluation in the informal learning field.
  • What lessons can you learn from our collection of evaluation reports? Explore the BISE Database for yourself and post comments on how you might use our resources.


I am Rucha Londhe, a Research Associate/Project Manager at Goodman Research Group, Inc., a research company that specializes in evaluation of programs, materials, and services. I recently evaluated the Black Holes Exhibit Gallery (BHEG), a traveling exhibit on black holes produced by the Smithsonian Astrophysical Observatory (SAO), and its accompanying materials, which aimed to engage museum visitors and youth collaborators on the topic of black holes.

Rad Resource: One of the innovations of the project evaluated was the use of networked exhibit technologies to personalize and enhance the visitor experience of science inquiry, both within and beyond the exhibit gallery. Visitors became black hole explorers: they made predictions, gathered evidence, and drew their own conclusions. Using identification tags (the Black Holes Explorer Card below), they could also extend their learning beyond the exhibit experience, personalizing their observations, creating a personal journal and website, and accessing it from home. The technology proved to be an evaluation tool as well. As part of the sign-in process, visitors answered a question related to their attitude toward, knowledge about, or interest in black holes; at the sign-out station at the end of the exhibition, they answered another question from the same pool. Answers at the sign-in station provided the pre-experience data, and answers at the sign-out station provided the post-experience data for analysis. For more information on the exhibit, visit:

http://web-bh.cfa.harvard.edu/BlackHoleExhibit/About_the_exhibit.aspx

Black Holes Explorer Card
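The sign-in/sign-out design described above amounts to a paired pre/post comparison keyed on the explorer card. A minimal Python sketch, with made-up card IDs and a made-up 1–5 rating scale:

```python
# Hypothetical sign-in (pre) and sign-out (post) ratings keyed by explorer-card
# ID; the actual BHEG questions and scales were different.
signin = {"card-01": 2, "card-02": 4, "card-03": 3}   # pre ratings (1-5)
signout = {"card-01": 4, "card-02": 5}                 # card-03 skipped sign-out

def paired_changes(pre, post):
    """Match pre/post ratings by card ID; visitors without both are dropped."""
    return {cid: post[cid] - pre[cid] for cid in pre if cid in post}

changes = paired_changes(signin, signout)
print(changes)                                  # {'card-01': 2, 'card-02': 1}
print(sum(changes.values()) / len(changes))     # mean change: 1.5
```

Dropping unmatched visitors is the simplest choice here; a real analysis would also report how many sign-ins never reached the sign-out station.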

Hot Tip: The use of innovative network technology, which turned out to be the highlight of the BHEG project, had three advantages:

  • The technology had novelty appeal and added a personal touch: you could take a part of your experience with you after you left the museum.
  • The technology enhanced visitors’ outcomes. Use of the card made a difference in visitors’ time spent, their interest, and their learning at the exhibit, helping optimize the exhibit experience. In the future, the Explorer’s Card could be adapted for various types of exhibits, with the look of the card varying by topic.
  • Finally, the network technology proved to be an extremely useful evaluation tool, providing an opportunity for the embedded assessment of visitor outcomes at the exhibit.

To learn more about this evaluation visit www.grginc.com



Hello everyone. My name is Ioana Munteanu, and I am a Social Science Analyst with the Smithsonian Institution’s Office of Policy and Analysis. The Smithsonian consists of 19 museums and the National Zoo; 9 research centers; and many other centers, programs, and projects. Central to the Office of Policy and Analysis’s mission is assisting upper management at all levels in making sound decisions about improving the Institution’s exhibitions and programs for physical and virtual visitors and for stakeholders. Dr. Carole Neves directs our Office, which is composed of 12 skilled staff with diverse backgrounds, assisted by fellows and interns from the United States and other countries. Upon request, we conduct formative, process, and summative evaluations of both formal and informal programs and exhibitions offered on-site, off-site, and online; the studies may be Institution-wide or focused on a particular Smithsonian unit, or on a department or program within a unit. The wonderful news is that over 100 of our studies are available online for FREE. The link to our website is discussed below.

Rad Resource: Studies of visitors to the Smithsonian provide a glimpse into who comes for general museum visits and publicly available offerings, and why; how satisfied they are with their visit and what experiences they had; and what factors contributed to their satisfaction and experiences. These studies include formative assessments conducted during preparatory phases, as well as those looking at the output and impact of offerings. Staff employ a wide range of methodologies including, but not limited to: quantitative surveys, in person and online; qualitative interviewing; focus groups; observations; visitor tracking; and other methods such as card sorting or concept mapping.



Hey there. My name is Kathleen Tinworth and I am the Director of Visitor Research & Program Evaluation at the Denver Museum of Nature & Science. It’s my pleasure to finish up the week of visitor studies posts.

For many of you AEA365 readers, this week was the first time you’d ever heard of visitor studies. I know the feeling! I didn’t always work in a museum; I have only been in the sector for 3 years. I remember when I first read the job description for my position. I thought, “Evaluation in a museum? Incredible! That sounds like the coolest job ever.” Turns out I was right! At the time, I hadn’t considered the valuable role evaluation can and should play in cultural institutions.

Before jumping into visitor studies, I designed and conducted evaluations for criminal justice agencies and child welfare organizations. I got linked into AEA through that work and have stayed connected. One of the things I love about AEA is the exposure to and connections across different disciplines. I am always inspired by how adaptable and relevant methodologies, instruments, and data collection techniques can be across diverse fields. I hope this week’s AEA365 provided you with some new, fun, and creative ways to think about evaluation in your own work through the lens of visitor studies.

The VSA AEA group is really looking forward to meeting many of you at the AEA conference this November in San Antonio! Make sure you look out for us and say hello!

Here are some of my favorite “rad resources” online to further introduce you to the world of visitor studies:

Rad Resource: The Visitor Studies Association (VSA, http://visitorstudies.org/) is committed to understanding and enhancing visitor experiences in informal learning settings through research, evaluation, and dialogue. All the AEA365 bloggers this week are active members (including the current VSA president herself, Kirsten Ellenbogen). An absolute essential on the website is the full searchable archive (http://www.visitorstudiesarchives.org/) where you can access and download some fantastic articles and papers—for FREE!

Rad Resource: CARE (which stands for Committee on Audience Research and Evaluation, http://www.care-aam.org/) is a standing professional committee within the American Association of Museums. Check out their directory of evaluators (http://www.care-aam.org/documents/directory_of_evaluators/directory_of_evaluators_2009.pdf) to find someone in your area!

Rad Resource: Informal Science online (http://informalscience.org/) isn’t all about science. It’s a phenomenal resource and online community for informal learning projects, research and evaluation. You will find news, interviews from the field, funding opportunities, upcoming conferences and events—not to mention a virtual clearinghouse of published and unpublished studies and results. Like the VSA archives, it’s FREE!

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. We are pleased to welcome colleagues from the Visitor Studies Association – many of whom are also members of AEA – as guest contributors this week. Look for contributions preceded by “VSA Week” here and on AEA’s weekly headlines and resources list.


My name is Kirsten Ellenbogen. I’m Director of Research & Evaluation at the Science Museum of Minnesota and President of the Visitor Studies Association.  I hope you’re enjoying VSA Week on AEA365.

Rad Resource: Development of individual identity has for some time been considered an outcome of informal learning environment experiences. But identity has recently become more central in the field with the 2009 National Academy of Sciences (NAS) report Learning Science in Informal Environments. The report identifies and provides evidence for six “strands of learning” that occur in informal learning environments. What’s so rad about this? NAS reports are based on systematic reviews of literature that use strict criteria for what counts as good evidence. This report is unique in the strength and systematic nature of the evidence it provides for learning in informal environments. You can read (and search) the entire book online or purchase a copy: http://www.nap.edu/catalog.php?record_id=12190

Cool Trick: Two evaluation approaches that are particularly useful for gathering data about identity development in informal learning environments are embedded evaluation and reflective interviews. Embedded evaluation integrates “invisible” tools for evaluation into the existing program activities. For example, in a youth program with a focus on interactive media, the projects produced by youth are posted to an online environment (http://info.scratch.mit.edu/ScratchR:_The_Online_Community). The projects can be downloaded by others, modified, and reposted. All activity in the online community can be tracked, and the ongoing development of the youth’s projects can be analyzed in more detail.

Cool Trick: Another evaluation approach useful for gathering data on identity development in informal learning environments is the video-based reflective interview. For example, you might videotape a museum visitor using an exhibition (with IRB-approved informed consent as appropriate). In the post-visit interview, after the initial set of questions, show the visitor a video segment of his or her interactions with the exhibition, taped just moments before. Use a semi-structured interview approach and ask the visitor to narrate the video and tell you more about what they were doing. This approach can become somewhat automated using technologies like Video Traces (http://depts.washington.edu/pettt/projects/videotraces.html).

Hot Tip: There’s an app for that. There are reflective tools that support annotation of images, audio or video diaries, and other approaches that support the evaluation of identity development. Take a look at Everyday Lives or Storyrobe as a great starting point. These apps are useful for you as the evaluator, or they can be added to a participant’s phone, iPod, iPad, or other device. Adding a tool like this to a device that a participant regularly carries allows ongoing data collection that is reflective and, in some instances, embedded. This makes such tools ideal for monitoring identity development.



Hi! I’m Carey Tisdal, Director of Tisdal Consulting, an independent firm that evaluates informal learning environments. Informal learning environments include museums (art, history, science, and children’s museums), science-technology centers, zoos, aquaria, parks, television, and radio. I worked as an internal evaluator for nine years and have worked for six as an external evaluator. Recently, field-building and professional development have been the focus of several projects funded by the National Science Foundation. I am evaluating one of these projects, ExhibitFiles, an online community for exhibit designers and exhibition developers. One goal of the site is to provide a place where exhibition developers find out about each other’s work. Members can upload case studies, reviews of exhibits they have visited, and useful “bits” about exhibit design processes and materials; evaluation reports may be attached to case studies. A related goal is the development of professional networks for the sharing of expertise. Registered members post profiles and contact information. My Visitor Studies Week blog for AEA365 shares an important insight about continuing to learn as we do our work.

Lessons Learned: Actually, lessons re-learned! In this project, the client and I have found formal theory very helpful in thinking about the site and understanding how people use it. I was reminded of Kurt Lewin’s wonderful 1951 pronouncement that “there is nothing so practical as a good theory.” We found theories comparing and contrasting communities of practice and communities of interest in their use of digital information (Hoadley & Kilner, 2005) especially helpful in understanding how exhibition developers incorporated the site experience into their work. For example, specific reviews sometimes serve as boundary objects that let people working in different disciplinary areas, and with different training and experiences, develop a common language about a design topic. Since this site is only one element in a range of professional development activities, we have used concepts about the ecology of learning (Brown, 1999) to start understanding the role of ExhibitFiles as one among a set of professional development activities in which exhibition developers participate. Using a theoretical lens as part of the evaluation has helped the project team (clients) and the evaluators develop a common language and set of ideas to support their decisions about updating the site and planning its future. Formal theory can sometimes be a boundary object for evaluators and clients.

Rad Resources:

Brown, J. S. (1999). Presentation at the Conference on Higher Education of the American Association for Higher Education. Retrieved August 15, 2010, from http://serendip.brynmawr.edu/sci_edu/seelybrown/.

Hoadley, C. M., & Kilner, P. G. (2005). Using technology to transform communities of practice into knowledge-building communities. SIGGROUP Bulletin, 25(31).



My name is Cheryl Kessler and I am an independent consultant doing visitor studies. I conduct program and exhibit evaluations in museums and libraries to understand what and how visitors/users learn from their experiences, what they might do with that information, and how new information is integrated with existing knowledge. My Visitor Studies Week blog is about some of my favorite methods.

Visitor studies utilizes a number of familiar evaluation and research methodologies. I lean toward and enjoy qualitative methods such as focus groups, drawings, Personal Meaning Mapping, and timing and tracking. Recently, I have been conducting telephone focus groups, which have disadvantages but get the job done when working from a distance with limited travel funds. I have used drawings with elementary school children to document immediate impact, using a rubric created collaboratively with the program coordinator to score vocabulary and concept learning. Personal Meaning Mapping (PMM) is a concept-map-like methodology developed by John H. Falk and colleagues (1998) to assess individual learning in informal settings across four dimensions: extent, breadth, depth, and mastery. Originally designed for summative evaluation, PMM is very adaptable for prototyping and for topic and label testing. Timing and tracking, the ultimate in observation in my opinion, is useful for understanding how the public uses or attends to entire exhibitions or individual exhibits within a larger exhibition. I have done timing and tracking studies for formative and summative evaluations in a natural history museum, reflective tracking in a front-end study of living history sites, and tracking as part of strategic planning for reinterpretation of a historic cultural space to understand visitor pathways and engagement.
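To make two of PMM’s dimensions concrete, here is a toy Python sketch of tallying extent and breadth from a coded concept map; real PMM scoring uses trained raters and rubrics, especially for depth and mastery, so treat this strictly as illustration:

```python
# Invented example: a visitor's "black holes" concepts, each assigned to a
# conceptual category by a coder. Names and categories are hypothetical.
concept_map = {
    "gravity": "physics",
    "event horizon": "physics",
    "telescope": "instruments",
    "Einstein": "people",
}

extent = len(concept_map)                   # how many concepts were listed
breadth = len(set(concept_map.values()))    # how many categories they span
print(extent, breadth)  # 4 3
```

Comparing these tallies before and after an experience is one simple way to see whether a visitor’s thinking grew richer or wider.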

Rad Resources: Some favorite and time-tested resources include:

  • Krueger, R. A., & Casey, M. (2000). Focus Groups: A Practical Guide for Applied Research (3rd ed.). Thousand Oaks, CA: Sage Publications.
  • Yalowitz, S., & Bronnekant, K. (2009). Timing and tracking: Unlocking visitor behavior. Visitor Studies, 12(1), 47–64.
  • Serrell, B. (1998). Paying Attention: Visitors and Museum Exhibitions. Washington, DC: American Association of Museums.
  • Falk, J. H., Moussouri, T., & Coulson, D. (1998). The effect of visitors’ agendas on museum learning. Curator, 41(2), 107–120. A Google search for “personal meaning mapping” turns up a number of studies using this methodology.
  • Informalscience.org, a repository of research and evaluation reports searchable by institution, methods, author, etc.
  • The Institute for Learning Innovation, which has conducted many studies using all of these methods for over 20 years. Most studies are unpublished but may be available by request.
  • Science Museum of Minnesota, Research and Evaluation, The Big Back Yard Study Series (2006), which includes a nice timing and tracking report.



Hello from Ithaca, NY. I’m Rick Bonney, director of program development and evaluation at the Cornell Lab of Ornithology. I also am on the board of the Visitor Studies Association, and I’m thrilled by several projects that the organization is developing with a multitude of partners who all conduct research and evaluation in the field of informal learning. All three of these projects are funded by the Informal Science Education program of the National Science Foundation (NSF).

Rad Resource: First, we’ve recently heard that we’re receiving another year of funding to continue our partnership with CAISE, the Center for Advancement of Informal Science Education (http://caise.insci.org/). Our expanding role in CAISE involves bridging the gap between visitor research and the practice of program development through workshops, reports, and online resources. For example, a recent article by Beverly Serrell, “Paying More Attention to Paying Attention,” provides an excellent overview of timing and tracking techniques (see http://caise.insci.org/resources/vsa-articles).

Rad Resource: Second, we’ve learned that we will be receiving funding for a project called “Building ISE through informalscience.org,” which will be conducted in partnership with the University of Pittsburgh Center for Learning in Out-of-School Environments (UPCLOSE) and the Science Museum of Minnesota. This ambitious project will facilitate the growth and use of informalscience.org by enhancing the site’s already useful databases and integrating them with a broader set of web-based resources. The project will also conduct a synthesis of evaluation reports covering all available data across all sectors of informal science education. The synthesis will produce a framework for coding and organizing both current and future evaluation data, providing an opportunity to mine the database for further research and program planning. In addition, the grant will allow us to create a new section of the VSA website to assist project developers in locating evaluators to partner with in their work. Evaluators will be able to use the site to post profiles and examples of their work.

Rad Resource: Finally, VSA will be a major partner in a new project that has just been awarded to the Cornell Lab of Ornithology called DEVISE—Developing, Validating, and Implementing Situated Evaluation Instruments for Informal Science Education. Recently the ISE field has seen growing calls for new evaluation instruments, tools, and techniques that can be customized for use across ranges of similar projects. Such resources would bring common measures to evaluation practice, facilitate cross-project comparisons, and, most importantly, provide evaluation guidance to project developers who are inexperienced or lack major resources for evaluation. VSA will play several roles in this project, including hosting webinars to train people to use the new tools and techniques.



Hi! I’m Joe E. Heimlich; some days I’m not sure what I do. I’m a professor at Ohio State University, an Extension Specialist based at COSI, a large science center, and Director for the Institute for Learning Innovation in Edgewater, MD.

Why I’m not sure what I do relates specifically to the topic of this blog: Visitor Studies and Evaluation—Looking for the Intersection. My work has historically been studying visitors in zoos, parks, nature centers, botanical gardens, etc. In the last decade, I’ve done a lot more with museums, science centers, historical centers, and performing arts. I study visitors. And I do evaluation. And sometimes they overlap and sometimes they’re not at all the same.

Lessons Learned: Hallie Preskill’s wonderful visual of the hourglass as a representation for how research and evaluation intersect and diverge is a useful tool for framing evaluation and visitor studies. Although sometimes the methods are identical, and sometimes the philosophy and use are the same, there are inherent differences shaped by context. This blog is the first in a series being written by AEA members who are also members of the Visitor Studies Association. Through this short series, we hope to introduce fellow evaluators to some of the interesting differences in our work. (Hourglass reprinted with permission below and available full size in AEA Public eLibrary.)

Using Hallie’s hourglass… First, in terms of philosophy: visitor studies advocates for the visitor. In most of our settings, visitors come by choice, and in most of these settings their choices are not driven by “learning.” The settings in which visitor studies works nearly uniformly have education as part of their mission, however. Visitor studies focuses in part on who comes, why they come, what they desire, what they are open to, and how we can integrate messages into the visitors’ desired outcomes. Much of our work is around shaping messages to meet visitors where they are. And we do program evaluation for the institutions.

Second, in terms of methods: our evaluation methods are severely constrained by the nature of the visit. We often have a maximum of 3-5 minutes of a visitor’s time to get from them the data we need. Visitors are dealing with time budgets, family issues, social pressures and more. We often must track visitors to fully understand their visit and what they got from it. More on this in a later post and in a couple of very cool sessions for AEA!
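As a rough illustration of what tracking a visitor yields, here is a toy Python sketch; the exhibit element names and the 10-second stop threshold are invented for the example:

```python
# Hypothetical timing-and-tracking record for one visitor: (element, seconds).
track = [("entry panel", 12), ("interactive sim", 95),
         ("video wall", 8), ("explorer station", 140)]

STOP_THRESHOLD = 10  # seconds; pauses shorter than this don't count as stops

stops = [name for name, secs in track if secs >= STOP_THRESHOLD]
total_time = sum(secs for _, secs in track)
print(len(stops), total_time)  # 3 255
```

Stop counts and total dwell time like these underlie common timing-and-tracking summaries, such as the share of elements where visitors stopped.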

Finally, in terms of use: here is where we most look like the rest of the field of evaluation. Visitor studies is utilization focused, and our points of pride are when programs, exhibits, and yes, even visitors change because of the work we do. Welcome to the visitor studies aea365 Tip-a-Day posts.


