AEA365 | A Tip-a-Day by and for Evaluators

Category: Distance Education and Other Educational Technologies

Hello, my name is Kim Manturuk and I’m the Program Evaluator at Duke University’s Center for Instructional Technology. I get to evaluate a lot of interesting projects related to teaching and technology, but one of my biggest jobs is to evaluate Duke’s Massive Open Online Courses (MOOCs).

MOOCs are free, non-credit classes offered by universities and other institutions to anyone in the world. Since 2012, Duke has developed over 30 classes that have amassed over 2 million registrations. It’s my job to evaluate how well students in these classes are learning what the instructors want them to learn.

To accomplish this, I send out a lot of surveys – over 2 million surveys (and counting) in less than 3 years! When I started this project, I would be lucky to get 5% of people who registered for a class to fill out the pre-course survey, and the post-course survey response rates often hovered around 1%. It’s practically impossible to say anything evaluative with a 1% response rate, so I tried a lot of different things to get more people to fill out surveys. Some worked and some didn’t, but I learned several good lessons along the way.

Lesson Learned: When sending a survey to an online class, sign the email invitation with the class instructor’s name (with permission, of course). People are more likely to respond when the invitation comes from someone they know and respect.

Lesson Learned: Avoid sending surveys on Mondays, when they are more likely to be ignored or accidentally deleted during start-of-week inbox cleaning. It is better to send surveys from mid-morning Tuesday through Friday.

Cool Trick: Tell people in the survey invitation how many questions they will be asked and how long it will take. I set it up so that students are asked just ten questions and then they are automatically brought back to their class.

Hot Tip: Be thoughtful about what demographic questions you ask in online classes. In some cultures, questions about race or gender are considered confusing, intrusive, or even offensive.

Rad Resource: Would you like to take a free class from Duke University? Visit our list of classes at https://www.coursera.org/duke. And if you do register for a class, be sure to fill out the course surveys!

The American Evaluation Association is celebrating DEOET TIG Week with our colleagues in the Distance Education and Other Educational Technologies Topical Interest Group. All of this week’s contributions to aea365 come from our DEOET TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi, I’m Braddlee and I serve as the Dean for Learning and Technology Resources for the Annandale Campus of Northern Virginia Community College. Over the past several years, I have helped facilitate the adoption of Open Educational Resources, or OER, to increase student success and lower educational costs. I recommend that evaluators in education look at OER when considering issues of access and expense, or when conducting cost-benefit analyses. Often, administrators, faculty, and teachers may not be aware of free resources available to their students that can reduce textbook costs and provide useful learning materials.

OER include teaching, learning, and research materials that reside in the public domain or carry intellectual property licenses allowing their free use in education. In the past, OER were often isolated learning objects or modules. Now, many resources are available online, including complete courses, course materials, modules, textbooks, streaming videos, tests, software, and other materials for both K-12 and higher education. OER have been around for more than a decade, but now represent a growing alternative in online learning materials.

Lesson Learned: The National Association of College Stores reports that a new textbook costs, on average, $62, and the U.S. PIRG Education Fund estimates that higher-education students pay approximately $100 per course for texts. Expensive texts present obstacles for students, who may forgo purchasing course materials due to cost or incur larger debt to pay for school. Costs have also been shown to slow time to degree, because dollars spent on texts cannot be spent on tuition.

Hot Tip: Creative Commons licensing is a “some rights reserved” system that fosters sharing of a given piece of content. Details can be found at Creative Commons. The “5 R’s” of the Creative Commons (CC) license framework are the rights to “Retain, Reuse, Revise, Remix, and Redistribute” resources. The most popular and most flexible license simply requires that credit be attributed to the author(s).

Hot Tip: The UKOER Evaluation Toolkit provides a broad-ranging evaluation framework for OER use, from initial awareness and adoption to institutional policies and sustainability.

Rad Resource: In higher education, innovative models are emerging, including OpenStax College, a nonprofit designed to assist with the creation of open content.

Rad Resource: Projects like Open Washington and the Zx23 Project in Virginia demonstrate state- and national-level OER adoptions.

Rad Resource: The OER Research Hub presents research including distribution and impact analyses on OER.


Hello! I’m Jessica Hearn, Director of the Evaluation Center and graduate faculty at the University of Kentucky’s College of Education. While there are many approaches to evaluation, we at the Center like Patton’s Utilization-Focused Evaluation because it emphasizes the utility, or usefulness, of evaluation for stakeholder decision-making and program improvement.

Working with diverse stakeholders over the years, I discovered that it matters not only what information is collected, but also how that information is presented to clients. Recently, it became clear to me that clients were often not reading the Center’s reports. These evaluations were not providing the service they should, in part because the more traditional, academic formats for presenting findings were not engaging or understandable.

Now, whether working with clients or teaching graduate students, I ask that evaluators consider how information is presented to meet the needs of varied stakeholders. This is accomplished by considering how information can be more strikingly portrayed using infographics and visualization techniques.

Lessons Learned: It benefits stakeholders to move away from traditional reports and include graphics, callouts, colors, and more. Data visualization and infographics go a long way toward telling the evaluation story, and free and inexpensive tools are available online.

Cool Trick: Although not new, using Wordle to form word clouds helps quantify and visualize qualitative data. More frequently occurring words appear larger, reflecting the more salient themes. It is like a visual frequency table of key words. It works very effectively for reporting on open-ended questionnaire responses or synthesizing concepts from a stakeholder webpage.
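
Under the hood, a word cloud is just a word-frequency count: a tool like Wordle scales each word’s display size by its count. Here is a minimal sketch of that counting step using only Python’s standard library; the survey responses and stopword list are invented for illustration.

```python
import re
from collections import Counter

def word_frequencies(responses, stopwords=frozenset({"the", "a", "and", "of", "to"})):
    """Count content words across open-ended survey responses."""
    words = []
    for text in responses:
        # Lowercase, split on non-letters, and drop common function words.
        words.extend(w for w in re.findall(r"[a-z']+", text.lower()) if w not in stopwords)
    return Counter(words)

# Hypothetical open-ended responses from a course survey.
responses = [
    "The course videos were clear and engaging",
    "Engaging videos, but the quizzes were hard",
    "Clear explanations; the videos helped a lot",
]
freqs = word_frequencies(responses)
print(freqs.most_common(3))  # "videos" appears 3 times, so it would render largest
```

The resulting counts are exactly the “visual frequency table” the word cloud draws; feeding them to any cloud-rendering tool reproduces the effect.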

Hot Tip: For a dynamic presentation of complex data, you can’t beat the work of Hans Rosling. His videos on the graphical Gapminder website can help evaluators at all levels learn to develop motion charts. This dynamic approach to presenting is fresh, entertaining, intuitive, and educational.

Rad Resource: For print and web graphics, the online design program Canva helps to present information in interesting and professional ways. It takes some practice to create, but there are tutorials to help.

Rad Resource: For a more academic approach to visualizing data, New Directions for Evaluation dedicated two issues (139 and 140) to the topic.

Rad Resource: The Better Evaluation website has descriptions and resources for completing evaluations using the Utilization-Focused Evaluation (UFE) approach.


Hello! I’m Michael J. Culbertson, team member on the Technical Evaluation Assistance in Mathematics and Science (TEAMS) project and research associate at RMC Research in Denver. We at TEAMS noticed that most webinar evaluations rely almost exclusively on post-webinar surveys of participant satisfaction. It’s true that satisfaction is an important indicator of webinar quality, but we felt that evaluators also need a more comprehensive picture of the different aspects of quality that come together in a great webinar.

We devised a rubric that covers seven key components:

  • Recruitment: Exciting participants before the event begins.
  • Technology: Wrangling the software and hardware that bring us together.
  • Content: Starting with a solid foundation.
  • Organization: Knitting everything together to make sense.
  • Delivery: Conveying a captivating message.
  • Visual Aids: Stimulating both sides of the brain.
  • Participant Interaction: Bringing the best out of the people in the virtual room.
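
As a sketch of how ratings against such a rubric might be tallied, here is a minimal Python example. Only the component names come from the list above; the 1-4 scale and the scores themselves are hypothetical illustrations, not the TEAMS rubric’s actual levels.

```python
# Component names from the TEAMS webinar rubric; scale and scores are invented.
COMPONENTS = ["Recruitment", "Technology", "Content", "Organization",
              "Delivery", "Visual Aids", "Participant Interaction"]

def summarize(ratings):
    """Average each component's rating (here, a hypothetical 1-4 scale) across reviewers."""
    return {c: sum(r[c] for r in ratings) / len(ratings) for c in COMPONENTS}

# Two hypothetical reviewers rating the same webinar.
reviews = [
    {"Recruitment": 3, "Technology": 4, "Content": 4, "Organization": 3,
     "Delivery": 2, "Visual Aids": 3, "Participant Interaction": 2},
    {"Recruitment": 4, "Technology": 4, "Content": 3, "Organization": 3,
     "Delivery": 3, "Visual Aids": 2, "Participant Interaction": 3},
]
means = summarize(reviews)
print(means["Delivery"])  # 2.5 — a component-level average flags where to improve
```

Component-level averages like these make it easy to see which aspect of a webinar needs work, rather than relying on a single overall satisfaction score.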

We welcome feedback on the rubric and how it was helpful (or not so helpful) in your own project or evaluation! Tweet @teams4msp or visit the project TEAMS website to get in touch!

Lesson Learned: Make sure the technology is working. It is too easy for participants at their desk to get distracted, check an email, and 10 minutes later decide they are hopelessly lost.

Lesson Learned: A bad presentation makes for a bad webinar, but a good in-person presentation doesn’t necessarily translate directly into a good webinar without some modification. The webinar provides unique opportunities to connect at a distance, but also demands attention to flow, timing, imagery, content, and interaction.

Lesson Learned: One key quality of webinars is the potential for participants to interact with one another and with the presenters. The rubric examines whether each component of the webinar (from invitations to software) supports or hinders participant engagement.

Cool Trick: The TEAMS Webinar Rubric includes a user’s guide, the webinar rubric, and a set of considerations and self-reflection questions for webinar developers and evaluators.

Rad Resource: Look at 18 Tips on How to Conduct an Engaging Webinar for quick ways to up your webinar game.

Rad Resource: To put on a great webinar from start to finish, look at Best Practices for Webinars which provides a thorough tour and great examples.


Hi, I’m Braddlee, and I serve as Dean of Learning and Technology Resources at Northern Virginia Community College in Annandale, Virginia. Definitions of blended (also known as hybrid) learning in higher education vary, including those from the Online Learning Consortium and the Clayton Christensen Institute. Essentially, blended courses bring together both face-to-face and online instruction. This raises complications for administrators, instructors, and evaluators who want to know when blended learning should be used and what makes for a well-designed course.

Blended learning has often been described as the “best of both worlds,” allowing students the flexibility of online learning together with the richness of the face-to-face experience. When programs and courses are well designed and implemented, blended learning has the potential to improve learning outcomes, reduce time-to-degree, and expand access for learners who might otherwise not be able to participate. It can increase access in higher education, but it is also used in PK-12, where digital and online sources expand course offerings.

Lesson Learned: Definitions of blended learning vary, but broadly include formal education that happens in part through online learning and in part in brick-and-mortar locations. There is also an expectation that at least some of the learning is student-controlled and takes place remotely.

Lesson Learned: Courses and teaching require redesign when moving from face-to-face to partially online. Blended learning should be evaluated differently from either face-to-face or totally online courses. Key elements include how instructors use face time, and how successfully students can learn when working on their own.

Hot Tip: For those looking to review current scholarly research, a new title from Routledge, Blended Learning: Research Perspectives, Vol. 2, contains an excellent selection of the latest research on blended learning, including a section specifically devoted to the evaluation of blended learning.

Rad Resource: The Blended Learning Toolkit from the University of Central Florida contains a wealth of resources, including professional development resources, course examples, and links to its own Massive Open Online Course (MOOC) on blended learning.

Rad Resource: Planning and Designing an Online Blended Course from the University of New South Wales, Australia includes not only an in-depth set of resources for faculty, but also starting points for evaluation of blended courses.

Rad Resource: The Online Learning Consortium, formerly known as Sloan-C, remains a go-to site for professional development, the state of the field, and research on online and blended learning. In addition to the well-known Quality Matters program, the OLC’s Blended Learning Mastery Series is a well-developed boot-camp program covering the research, teaching, and assessment of blended courses.


Hello! I’m Tara Shepperson, Chair of the Distance Education and Other Educational Technologies (DEOET) TIG of AEA, and Associate Professor in Educational Leadership at Eastern Kentucky University. Over the past two years, several issues have converged leading this TIG in a more focused direction:

The growth of e-learning is focusing our members on the big topics around teaching and learning with technology. It is no longer about the latest tool; we have all learned that technology is in a constant state of change. So we are now exploring the larger elements of managing, teaching, and learning across space, both in real time and at times of the participants’ choosing.

This week of tips will include ideas for how we as evaluators may consider the host of perspectives that affect teaching, design, resources, and learning. These revolve around what I like to call the six S’s: strategies, structures, spaces, students, styles, and sources.

With fewer or no face-to-face interactions, teaching strategies change. Instructor lectures, student discussions, and the give-and-take of traditional classrooms take on new forms and demand new strategies. Structures, from the syllabus to presentations, must be reworked to better fit the interactive and more visual designs of new learning spaces. With these new forms comes increased student control of learning and the need to reconsider how teaching accommodates diverse learning styles. Finally, the availability of sources for course content and student referencing must be considered.

Lessons Learned: Distance learning and educational technology are about much more than the technological tools. It is not an either-or: interactions take place on a broad spectrum from fully remote and student-centered to blended or hybrid (with some face-to-face or other real-time interaction).

Lessons Learned: Distance learning and educational technology also include a growing list of multimedia options for class work, meetings, or teacher-student conferencing that influence learning and training experiences.

Hot Tip: Be thoughtful about the types of information you seek in an evaluation. Often, online course formats and end-of-course surveys are the same throughout a district, college, or university. If you want answers about course development or teacher and student experiences, that information may be more challenging to obtain.

Rad Resource: An ongoing forum about teaching, and especially online learning, the Faculty Focus newsletter offers suggestions to instructors and ideas for evaluators.

Rad Resource: JOLT– the Journal of Online Learning and Teaching provides peer-reviewed articles on web-based instruction.

Rad Resource: The nonprofit organization EDUCAUSE focuses on the role of instructional technology at colleges and universities, covering research, security, and other issues relevant to higher education.


Greetings from the University of Chicago! We are Courtney Heppner and Sarah Rand, Associate Project Directors for Research and Evaluation at the Center for Elementary Mathematics and Science Education (CEMSE) at the University of Chicago. We will be presenting at AEA’s Annual Conference in DC about producing online evaluation reports. Sarah has blogged about online evaluation reports before here. Our team recently completed a new online report on computer science education. You can view that report here.

Lessons Learned:

Online reports allow us to present information in a powerful and dynamic way: Long gone are the days of PDF reports full of tables, charts and lots of text. Online reporting allows us to insert audio and video clips into the report as well as interactive infographics, making the information more engaging for the reader.

Online report development requires the right team of people and is a true test in collaboration: Online reporting requires more than just the evaluation team – it also requires web developers and designers. This team must come together to agree on a process and execute within the set timeline. For more on our work in developing an infographic with a team from Visual.ly, read Sarah’s previous post here.

Online report development requires both time and money: Having more people contribute to the reporting process in turn requires more time and money than a traditional PDF report. Our team now builds this cost into new evaluation budgets.

Rad Resource: One of the best resources we’ve found in our online reporting endeavor has been feedback from our clients. We encourage anyone who decides to develop an online report to follow up with the client to get feedback. Questions we asked the client included:

  • Which part(s) of the report were the most valuable to you?
  • Did you share the online report with anyone? If so, what kind of response did you get about the report?
  • How did you share the report with others? What social media tools did you use?
  • How could we improve our online reporting in the future?
  • Was the online report more useful to your organization than a PDF?

Rad Resource: Us! Join us to learn more about online reporting during our session at the AEA Annual Conference on Saturday, October 19th!

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org. Want to learn more from Courtney and Sarah? They’ll be presenting as part of the Evaluation 2013 Conference Program, October 14-19 in Washington D.C.

Hello colleagues. My name is Susan Lowes and I am currently Director of Research and Evaluation at the Institute for Learning Technologies at Teachers College, Columbia University. One of my challenges is to make the evaluation process interesting to those being evaluated so I am always searching for assessments that go beyond surveys and interviews.

Pile sorts, also called card sorts, are used to elicit participants’ understanding of a domain by observing how they group items in that domain. Pile sorts are engaging for participants while revealing information not easily available through direct questioning.

Hot Tip: Begin an interview with a card sort. All you need is a set of cards with words or pictures that you ask the participants to sort into piles of similar items. As they do this, you often ask them to explain their reasoning. Card sorts get people talking, work well with children or adults, and do not depend on facility with English.

Rad Resource:

An easily accessible article that describes how to get at cultural categories by using pile sorts is John B. Gatewood, “Culture … One Step at a Time,” online at http://www.lehigh.edu/%7Ejbg1/cogmeth.htm. Gatewood’s domain is fish, but you can substitute a domain that fits your own work.

Hot Tip: Pile sorts have been used in many different disciplines: by social psychologists to study racial stereotyping, by anthropologists to study social class, by information architects in usability studies, to distinguish experts from novices in a number of domains, and by ourselves to explore students’ understanding of who is and is not an engineer, their understanding of sensors, and gendered perceptions of video games.

Rad Resource:

Another interesting article is Gun Roos, “Pile Sorting: Kids Like Candy.” She is looking at children’s perceptions of different types of food. It is in Victor D. de Munck and Elisa J. Sobo, eds., Using Methods in the Field: A Practical Introduction and Casebook (Walnut Creek, CA: Altamira Press, 1998).

Hot Tip: Pile sorts are amenable to simple or sophisticated statistical analysis, including various types of multivariate analyses.

Rad Resource: G. Rugg and P. McGeorge’s article, “The sorting techniques: a tutorial paper on card sorts, picture sorts and item sorts,” is a good introduction to the different methods of analysis. It is in Expert Systems, vol. 14, no. 2 (May 1997): 80-93.
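
The starting point for most of these analyses is an item-by-item similarity matrix: the proportion of participants who placed each pair of items in the same pile. Here is a minimal Python sketch of that aggregation step, with invented participants and items (a food domain in the spirit of the Roos example); the resulting proportions can then feed hierarchical clustering or multidimensional scaling.

```python
from itertools import combinations

# Hypothetical pile-sort data: each participant's sort is a list of piles,
# and each pile is a list of item labels. Items here are invented examples.
sorts = [
    [["apple", "banana"], ["carrot", "potato"]],
    [["apple", "banana", "carrot"], ["potato"]],
    [["apple", "banana"], ["carrot"], ["potato"]],
]

def cooccurrence(sorts):
    """Proportion of participants who placed each item pair in the same pile."""
    counts = {}
    for piles in sorts:
        for pile in piles:
            # sorted() gives each pair a canonical (alphabetical) key.
            for a, b in combinations(sorted(pile), 2):
                counts[(a, b)] = counts.get((a, b), 0) + 1
    return {pair: n / len(sorts) for pair, n in counts.items()}

sim = cooccurrence(sorts)
print(sim[("apple", "banana")])  # 1.0 — all three participants grouped them together
```

Pairs with high co-occurrence proportions are perceived as similar across participants; a clustering or scaling routine run over this matrix makes those groupings visible.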

Hot Tip: In our work, we have developed an online version of the pile sort that we have found to be even more engaging than the paper version, while providing more easily accessible data for both pre-post and process analysis.

Rad Resource: Come to our workshop at the AEA meetings to learn how the digital version works.

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org. Want to learn more from Susan? She’ll be presenting as part of the Evaluation 2013 Conference Program, October 14-19 in Washington D.C.


My name is Matt Galen. I’m a PhD student in Program Evaluation and Applied Research Methods at Claremont Graduate University, and I’m going to give you a few practical guidelines for putting on an “Open Conference.” First, a brief bit of background about how I became interested in the idea of open conferences. Over the past couple of years, in collaboration with the Rockefeller Foundation, UNICEF, and other organizations, I have coordinated webinars and developed content for the free international evaluation e-learning program that has been discussed elsewhere in AEA365. More information about the e-learning program can be found here.

E-learning programs are typically designed for an entirely online or virtual audience. In contrast, Open Conferences are designed to broadcast a live conference event to a virtual audience. The energy of a live conference event, combined with the mixture of in-person and virtual participants, creates a unique dynamic that can either go extremely well or terribly wrong. Having facilitated open conferences for the African Evaluation Association (AfrEA) conference, Claremont Graduate University’s Professional Development Workshops, and several other events, I can proudly say that it is possible to bring the chances of a “terribly wrong” scenario down to about zero. Here are some lessons I have learned along the way – I hope that others find them useful guidelines for facilitating open evaluation conferences in the future.

Lesson Learned: Why have an open conference? There are three primary purposes for webcasting conference sessions and developing evaluation e-learning programs:

(1)   To increase access to the latest thinking in program and policy evaluation for people who are not able to attend (due to lack of funds, travel time, disability, etc.)

(2)   To expand the “brand visibility” of a conference

(3)   To expand and enhance global communities and networks of professional evaluators

Hot Tip: What you will need to put on a successful open conference:

(1)   A carefully developed plan for which conference sessions you will be broadcasting

(2)   Tools

  1. Laptop(s) – preferably powerful
  2. Webcam(s) – preferably high resolution (720p and 1080p are the current standard)
  3. Microphone(s) – preferably USB-input, preferably with voice-tracking capabilities. This is the most important tool, as audio can make or break an online experience
  4. Web conferencing software – many competitors in this arena, and the jury is still out on a clear winner

(3)   Other

  1. Web access – either via a LAN cable or wireless network
  2. Trained webcasters
  3. An eager audience – it is very important to effectively market an open conference via multiple channels (social networks, email listservs, etc.).

Hot Tip: Integrating the online audience:

(1)   Remind presenters ahead of time to ask questions of the online audience

(2)   Ask live audience members to speak loudly and clearly when asking questions or making comments

(3)   For large audiences, you can make use of automated audience question moderation tools like this.

Interested in learning more, or considering putting on an open conference but don’t know where to start? Want to talk about ideas related to open conferences? I love a good chat. Please leave a note in the comments or send me an email at matthew.galen@cgu.edu.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi. My name is A. Rae Clementz and in addition to being the co-chair of the Graduate Student and New Evaluator TIG, I am also a techie. I believe technology is of value when it helps us accomplish our goals in ways that are better, easier, and/or cheaper. I have evaluated several educational technology integration programs. Consistently, one of the biggest barriers to successful implementation is teachers’ perceptions of a tool’s cost-benefit ratio. If the cost is too high, it’s a non-starter; the program or cool new toy will never fit in their school’s stretched budget. Even if the tool is free, if it’s too hard to use or doesn’t add some new or improved dimension to student learning, it’s not worth the effort.

I often feel similar time and budget constraints in my evaluations. Below are some cheap, efficient, and effective tools for two common evaluation tasks.

Rad Resource for conducting & recording interviews:

  • Google Voice | I’m one of those people who only has a cell phone. To avoid burning minutes during the day, I make my calls with Google Voice, which uses the internet connection on either your computer or cell phone to place calls. Bonus feature: incoming calls can be recorded, and Google Voice automatically creates a transcript and .mp3 recording of the call in your Google Voice inbox!
  • Skype + Evaer or Pretty May | Skype is one of the most common video and voice conferencing tools, and its basic levels are free. Evaer and Pretty May are programs that record Skype’s voice and video feeds and save them as either .mp3 or .wav files. Pretty May is free, as is the basic version of Evaer; the full version of Evaer is $20 with lifetime support and upgrades.

Lesson Learned:

It is critical when recording anything that you inform everyone that you’re recording the call, for what purposes, and ask them if they agree to be recorded. Many states have laws prohibiting unauthorized recording of phone conversations.

Rad Resource for disseminating evaluation findings:

  • Weebly + Scribd | Weebly is a simple, free, drag-and-drop, web-based, website design program. If you can use e-mail and PowerPoint, you can create a website using Weebly. Scribd is a free online publishing site. You can upload documents and either direct people to them on Scribd or embed them in websites or other social media sites.

Lesson Learned: Sadly, just because you built it doesn’t mean they’ll come. But having a website for the evaluation is still a good way to provide transparency, encourage comment from stakeholders, and disseminate findings to broader audiences. The process of building the site also promotes more organized communication about the evaluation.

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org. Want to learn more from Rae? She’ll be presenting as part of the Evaluation 2012 Conference Program, October 24-27 in Minneapolis, MN.
