AEA365 | A Tip-a-Day by and for Evaluators

TAG | web analytics

Hello! We are Xin Wang, Neeley Current, and Gary Westergren. We work at the Information Experience Laboratory (IE Lab) of the School of Information Science & Learning Technologies at the University of Missouri. The IE Lab is a usability laboratory that conducts research and evaluates technology. What is usability? According to Jakob Nielsen’s definition, usability assesses how easy user interfaces are to use. With the advancement of Web technology over the past eight years, our lab has successfully applied a dozen usability methods to the evaluation of educational and commercial Web applications. The evaluation methods we use most frequently include heuristic evaluation, think-aloud interviews, focus-group interviews, task analysis, and Web analytics. Selecting appropriate usability methods is vital and should be based on the development life cycle of a project; otherwise, the evaluation results may not be useful or informative for the Web development team. In this post, we focus on some fundamental concepts regarding one of the most commonly adopted usability evaluation methods: the think-aloud protocol.

Hot Tip: Use think-aloud interviewing! Think-aloud interviewing engages participants in activities and asks them to verbalize their thoughts as they perform the tasks. This method is usually applied during the middle or final stage of Website or system design.

Hot Tips: Employing the following procedures is ideal:

  1. Recruit real or representative users in order to comply with user-centered design principles
  2. Select tasks based on frequency of use, criticality, new features, user complaints, etc.
  3. Schedule users for a specific time and location
  4. Have users operate a computer accompanied by the interviewer
  5. Ask users to give a running commentary (e.g., what they are clicking on, what kind of difficulty they encounter to complete the task)
  6. Have the interviewer probe the user about the task he or she is asked to perform.

Pros:

  1. When users verbalize their thoughts, evaluators may identify many important design issues that cause user difficulties, such as poor navigation design, ambiguous terminology, and unfriendly visual presentation.
  2. Evaluators can obtain users’ concurrent thoughts rather than just retrospective ones, avoiding situations where users cannot recall their experiences.
  3. The think-aloud protocol allows evaluators to glimpse the affective side (e.g., excitement, frustration, disappointment) of the users’ information-seeking process.

Cons:

  1. Some users may not be used to verbalizing their thoughts when they perform a task.
  2. If the information is non-verbal and difficult to express, the protocol may be interrupted.
  3. Some users may not be able to verbalize their thoughts completely, often because verbalization cannot keep pace with their cognitive processes, making it difficult for evaluators to understand what the users really meant.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello, my name is Juan Paulo Ramírez, independent consultant and sole owner of “GIS and Human Dimensions, L.L.C.” As many of you may know, Google Analytics (GA) allows you to track the number of visitors that a website receives during a certain period of time. But GA does a lot more than that. If you have installed the GA tracking code on a website, GA offers a number of visualization tools that will let you analyze what is and is not working on your website, and how to improve it. The following are two of my favorite visualization tools offered by Google Analytics:

Rad Resource: Google Analytics – Map overlay

Map overlay allows you to identify where your visitors are coming from. This is a great tool because it identifies your audience by geographic location, and you can then potentially customize your website to the characteristics of that audience based on their demographics, culture, or interests. A choropleth world map, divided by country, is displayed with the ability to zoom in for a more detailed look at which particular regions your visitors come from. If you click on the U.S., you can hover the mouse cursor over any state and a textbox will pop up with the number of visitors. Using the Map Overlay tool, you may be able to identify whether you need to translate the contents of your website into a specific language, for instance if you are receiving many visitors from non-English-speaking countries or communities.

To learn more about map overlay, view Google Analytics in 60 Seconds: Location Targeting on YouTube
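
If you export the Map Overlay data, you can also summarize it outside of GA. Below is a minimal Python sketch, assuming a hypothetical export file named visitors_by_country.csv with “Country” and “Visits” columns; the file name and column names are illustrative, not GA’s exact export format:

    import csv
    from collections import Counter

    # Hypothetical GA Map Overlay export with "Country" and "Visits" columns;
    # adjust the column names to match your actual export.
    visits_by_country = Counter()
    with open("visitors_by_country.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            visits_by_country[row["Country"]] += int(row["Visits"])

    total = sum(visits_by_country.values())
    print("Top visitor countries (share of all visits):")
    for country, visits in visits_by_country.most_common(10):
        print(f"  {country:<25} {visits:>6}  ({visits / total:.1%})")

A quick summary like this can make it obvious whether, for example, a large share of your audience comes from non-English-speaking countries and a translated version of key pages might be worthwhile.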

Rad Resource: Google Motion Charts

Motion charts allow you, for instance, to identify keywords that people have used to find your website. Keywords can be displayed as dynamic charts using bubbles or bars. A bubble chart may describe the average number of pages per visit for a specific keyword. What is nice about the motion chart is that it lets you see changes in the use of keywords over time, which may indicate trends influenced by professional forum discussions, participation in events, or particular interests raised by your followers. As people’s interests and ideas change, this is a great source of information for adjusting the contents of your website to the needs of your visitors.

To learn more about motion charts, view Motion Charts in Google Analytics on YouTube
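
The same kind of trend question can also be explored outside of GA by tallying keyword frequency by month from an exported keyword report. Here is a minimal Python sketch, assuming a hypothetical file named keywords.csv with “Date” (YYYY-MM-DD), “Keyword”, and “Visits” columns; again, these are illustrative names rather than GA’s exact export format:

    import csv
    from collections import defaultdict

    # Hypothetical keyword export with "Date", "Keyword", and "Visits" columns;
    # adjust the column names to match your actual report.
    monthly_keyword_visits = defaultdict(lambda: defaultdict(int))
    with open("keywords.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            month = row["Date"][:7]  # e.g. "2011-08"
            monthly_keyword_visits[month][row["Keyword"]] += int(row["Visits"])

    # Print the top five keywords per month to spot rising or fading interests.
    for month in sorted(monthly_keyword_visits):
        top = sorted(monthly_keyword_visits[month].items(),
                     key=lambda kv: kv[1], reverse=True)[:5]
        print(month, ", ".join(f"{kw} ({n})" for kw, n in top))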

The American Evaluation Association is celebrating Data Visualization and Reporting Week with our colleagues in the new DVR AEA Topical Interest Group. The contributions all this week to aea365 come from our DVR members and you may wish to consider subscribing to our weekly headlines and resources list where we’ll be highlighting DVR resources. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice.

My name is Randahl Kirkendall. I work part-time as an Evaluator with Ellen Iverson, Director of Evaluation, for the Science Education Resource Center (SERC) at Carleton College, which works to improve education through projects that support educators.  Our work is funded primarily through NSF grants.  SERC has expertise in geoscience education, workshop leadership, website development and program and website evaluation.

A primary aim of SERC is to help faculty adopt evidence-based teaching behaviors that will enhance student learning. In evaluating the websites at SERC, our interest is in the role of website use in faculty professional development. We use a variety of web analytic tools such as Google Analytics, server-based website statistics, and web page visit logs in combination with data from surveys, interviews, focus groups, and observations to get as complete a picture as possible of how faculty use websites and the impact that their use has on teaching behavior.

Lesson Learned: One of the things we have learned from user interviews is that people generally have poor recall of how they found and used a website. While they can explain why they go to a website (motivation), they have difficulty recalling where on the website they started, which pages they viewed, and the search strategy they used. Website use analytics and web server logs of individual visits provide a richer picture of user behavior and interests via records of the actual pages that they visited.
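
For readers curious what reconstructing a visit from server logs can look like, here is a minimal Python sketch that groups requested pages by client IP address from a log in Common Log Format. Treating an IP address as a visitor is a simplification made only for illustration (real analyses usually rely on cookies or analytics visitor IDs), and the file name access.log is an assumption:

    import re
    from collections import defaultdict

    # Match the client IP, timestamp, and requested path from a Common Log
    # Format line, e.g.:
    # 127.0.0.1 - - [10/Oct/2011:13:55:36 -0700] "GET /page.html HTTP/1.0" 200 2326
    LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(?:GET|POST) (\S+)')

    pages_by_visitor = defaultdict(list)
    with open("access.log", encoding="utf-8") as f:
        for line in f:
            m = LOG_LINE.match(line)
            if m:
                ip, timestamp, path = m.groups()
                pages_by_visitor[ip].append(path)

    # Each value is the ordered sequence of pages one address requested.
    for ip, pages in list(pages_by_visitor.items())[:5]:
        print(ip, "->", " > ".join(pages[:10]))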

Lesson Learned: The SERC websites often don’t work in isolation. Our survey of 2,000+ faculty found that a significant number of users were using the websites to complement other professional development activities such as attending workshops, exchanging ideas with colleagues, or reviewing literature. Thus, it has been prudent for us to collect data on these other possible influences on their teaching behavior.

Cool Trick: We sequence or build evaluations incrementally, partially basing data collection and/or analyses on findings from other data collection methods. For example, we use the findings from user interviews to describe predominant motivations for using a website and any changes in behavior (such as teaching practice changes) that users attribute (at least partially) to website use. Those descriptions become a guide for using website analytic data to map particular patterns of use and to identify web use logs that can provide insight into how users may navigate the website.

Cool Trick: We use pop-up surveys to identify users that we might not otherwise reach. The pop-up asks for an email address that we can use to follow up with them for future surveys and interviews.

Want to learn more about Randahl and Ellen’s work? Join over 2500 colleagues at the AEA Annual Conference this November in San Antonio and check out their session in the conference program.

My name is Randahl Kirkendall. I am a public health manager turned evaluator. Platometrics is the name of my consulting business, which for the past three years has been focused on program research, planning, and evaluation. I am also a part-time evaluator for the Science Education Resource Center at Carleton College, which provides faculty professional development programs using a combination of workshops and web-based resources.

Four years ago while overseeing the development of two websites I learned how to use Google Analytics to track and measure website use. My first contract to evaluate website content was two years ago. Since then, I have learned much about evaluating program websites, but still consider myself to be on a steep learning curve in this area. Here is a little bit of what I have learned.

Lesson Learned: Using multiple and mixed evaluation methods that include both quantitative and qualitative metrics is the best way to fully understand the processes by which a website is being used as well as the outcomes that result. Web analytics can reveal much about how users navigate a website, which is something users have difficulty recalling. Surveys and interviews can capture the motivations behind their website use, the impacts and outcomes of using a website, and descriptive information about the users themselves. Combining the two helps provide a more complete picture that may also include the interplay between the website and other aspects of a program, such as a workshop or printed material.
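
As one concrete (and purely illustrative) way to combine the two, here is a minimal Python sketch that joins per-user page-view counts with survey responses on a shared email address, such as one collected through a pop-up survey. The file names and column names (pageviews_by_user.csv, survey_responses.csv, "email", "pageviews") are assumptions for the example, not part of any particular tool:

    import csv

    # Hypothetical inputs: per-user page-view counts exported from analytics or
    # server logs, and survey responses keyed by the same email address.
    pageviews = {}
    with open("pageviews_by_user.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            pageviews[row["email"].lower()] = int(row["pageviews"])

    with open("survey_responses.csv", newline="", encoding="utf-8") as f, \
         open("combined.csv", "w", newline="", encoding="utf-8") as out:
        reader = csv.DictReader(f)
        writer = csv.DictWriter(out, fieldnames=reader.fieldnames + ["pageviews"])
        writer.writeheader()
        for row in reader:
            # Attach the usage metric to each survey response; responses with
            # no matching analytics record are kept with a blank value.
            row["pageviews"] = pageviews.get(row["email"].lower(), "")
            writer.writerow(row)

The combined file can then be analyzed like any other survey data set, with website use as one more variable alongside the self-reported measures.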

Rad Resource: Occam’s Razor by Avinash Kaushik (www.kaushik.net/avinash). This website is built around a blog by an expert in web analytics who presents information in an easy to understand and good humored way. You might also want to check out his book, Web Analytics 2.0.

Hot Tip: I am currently developing a short Guide to Evaluating Program Websites, which I will post on www.platometrics.com later this month. If you would be interested in reviewing a draft or would like to be notified when it is posted, send me a note at rk@platometrics.com.

This is a relatively new and rapidly evolving area of evaluation, so if you know of any other good resources or ideas, please share them.

This contribution is from the aea365 Tip-a-Day alerts, by and for evaluators, from the American Evaluation Association. If you’d like to learn more from Randahl, consider attending his session at the AEA Annual Conference this November in San Antonio. Search the conference program to find Randahl’s session or any of over 600 to be presented.
