AEA365 | A Tip-a-Day by and for Evaluators


My name is Sophia Guevara and I am the Special Libraries Association Information Technology Division (SLA IT) Chair and American Evaluation Association Social Network Analysis Topical Interest Group Co-Chair. At SLA IT, I currently lead the Executive and Advisory boards. In an effort to bring the members of these boards together, I asked the board to collaborate on a presentation in Google Slides to showcase our accomplishments as a team.

Rad Resource: Google Slides

I chose Google Slides because I had experience collaborating with others in Google Docs. Creating the slide deck was quite easy: after developing introductory slides, I inserted blank slides for each member of our executive and advisory boards, giving each participant an opportunity to share his or her accomplishments over the past few months. Using Slides' sharing option, I emailed each board member an invitation with edit access.

Rad Resource: Freeconferencepro

Once the presentation was developed, we used Freeconferencepro in conjunction with Google Slides to deliver it. For those unfamiliar with this tool, it provides free conference-call lines, so board members, conference attendees, and others could follow the presentation regardless of where they were located. In addition, for those unable to attend the meeting, Freeconferencepro's recording option allowed me to create a meeting recording that others could review at a later time.

Lessons Learned

The project required several follow-up reminder emails encouraging each board member to complete his or her slide. In these reminders I included a link to view the presentation; however, this confused some members, who let me know that the link gave them no permission to edit. The lesson learned: send reminders with a link that carries edit permissions, so that those being reminded to complete their slides aren't confused.

That said, one board member indicated that while he had no experience with Google Slides before this project, he had previously used Google Docs and found the two very similar. After the experience, his opinion of the tool was that it was an "effective way to communicate main points of a discussion or reports" and that the combination of Google Slides and Freeconferencepro was an effective way to share information among a distributed group.

The American Evaluation Association is celebrating Nonprofit and Foundations TIG Week with our colleagues in the NPF AEA Topical Interest Group. All of this week's aea365 contributions come from our NPF TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Are you an experienced evaluator who thinks technology could help your work, but have no idea where to begin? I'm Aditi Patel, a recent graduate of the Fletcher School of Tufts University, where I worked with Cheyanne Scharbatke-Church to understand how seasoned evaluators (such as you, the AEA blog reader!) can think about where and how technology fits into evaluations in conflict contexts.

There are lots of evaluators out there who are digital natives and completely comfortable with technology, or digital immigrants who have learned how to be so. However, there are also seasoned evaluators who are digital refugees: experts in their field who are being forced to adopt technology, without a clear sense of what it is or what potential it holds for their evaluations. We intend for this paper to help them get started on their technological journey.

Lessons Learned:

  • Technology for evaluations doesn't stop at the evaluation design stage! A lot of "how-to" guides focus on using technology for data collection, but there is great potential for using technology to help achieve the evaluation purpose (such as enhancing accountability or building capacity), drive evaluation use, and assist in evaluation management.
  • The AEA Guiding Principles can provide a handy framework for ensuring that the use of technology in an evaluation is held to the highest professional standards; technology can be a tool to help drive good evaluation practice.
  • Before using technology in your evaluation, a five-step "filter" can help you decide whether it is suitable. The decision filter establishes a process by which evaluators can discern when and how to integrate technology into program evaluations in a manner that increases the effectiveness and efficiency of the process while remaining conflict-sensitive. The filter acknowledges that technology can rarely serve as a stand-alone solution to common evaluation challenges, nor will it replace strong research and evaluation skills; used correctly, though, it has great potential to strengthen evaluations in conflict contexts.


My name is Connie Viamonte and I am a Research and Evaluation Associate at Q-Q Research Consultants based out of Miami, Florida.

Today I want to share some of our experiences utilizing new technologies. We’ve found that when we’re unafraid to try new things, technology is a valuable tool in the evaluation world.

Most recently, our firm was hired to evaluate a teen pregnancy prevention program here in South Florida. The survey asked some very personal questions, and we worried about how to administer it and how comfortable youth would feel responding.

The plan was to administer the survey using Fire HD tablets. Our firm had most recently used the tablets for some work we did in Haiti. There we had to work without an internet connection and found a simple program, droidSurvey, that let us collect survey responses without WiFi and then download them later through a cloud sync.
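We can't see droidSurvey's internals, but the offline-first pattern it follows is straightforward: queue each response in local storage while offline, then push the queue to the cloud once a connection is available. Below is a minimal, hypothetical Python sketch of that pattern; all names are illustrative, and none of this is droidSurvey's actual code or API.

    import json
    import sqlite3

    db = sqlite3.connect("responses.db")
    db.execute("""CREATE TABLE IF NOT EXISTS responses (
        id INTEGER PRIMARY KEY, payload TEXT, uploaded INTEGER DEFAULT 0)""")

    def save_response(answers):
        """Store a completed survey locally; no network needed."""
        db.execute("INSERT INTO responses (payload) VALUES (?)",
                   (json.dumps(answers),))
        db.commit()

    def sync(upload):
        """Once WiFi is back, push queued responses via the given callable."""
        pending = db.execute(
            "SELECT id, payload FROM responses WHERE uploaded = 0").fetchall()
        for row_id, payload in pending:
            upload(json.loads(payload))  # e.g., POST to the vendor's cloud API
            db.execute("UPDATE responses SET uploaded = 1 WHERE id = ?",
                       (row_id,))
        db.commit()
        return len(pending)

A field worker's device would call save_response after each interview and sync at the end of the day, which is exactly the workflow we relied on in Haiti.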

Lesson Learned: The added benefit of using the tablets with the teens was their excitement. Even the adults were impressed with the mode of survey delivery. Our team realized that the psychology of the tablet plays an important role: although the survey was fairly long, students were so engaged with the novelty of the Fire HD that they stuck it out and, from cursory observations, seemed willing to be open and honest in their responses. Additionally, the tablets were easy to hold close to the body with one hand, and the screen angle did not let others easily see their peers' responses.

Hot tip: Don't be afraid to explore what technology can offer you. It's a wonderful way to reach new generations of tech-savvy respondents, and it even strikes a chord with those who are not as "techy". Easy-to-use tablets are one way.

Our firm is now experimenting with a “clicker” system that we would like to introduce for focus groups to solicit responses to multiple-choice questions and project results to the group in order to guide and facilitate discussion.

We are trying to expand our services by trying out fun and friendly ideas that make our job easier and the experience of our respondents much more pleasant!

We're celebrating 2-for-1 Week here at aea365. With tremendous interest in the blog lately, we've had many authors eager to share their evaluation wisdom, so for one special week, readers will be treated to two blog posts per day!

My name is Theresa Murphrey and I am on the faculty at Texas A&M University. I teach both traditional classroom-based students and distance-based learners from around the world. I have been working with a new tool that I have found extremely useful in facilitating teaching and learning.

2014 Update: I have now used this tool for the past several years and still find it to be extremely useful in facilitating teaching and learning.

Rad Resource: Jing is a technology that captures what is seen on the computer screen, along with audio from a microphone, to create a video that can be easily shared online. Jing can enhance, extend, and support the delivery of course content by having students apply what is presented in the course. Technologies such as Jing give students ways to express themselves and increase sensory input, raising the chances that we engage the student and enhance learning. Used in an experiential process, such technologies allow students to take ownership of their ideas and communicate them more clearly.

Approaches to Using Jing for Assignments:

  • Ask students to find an answer to a question using the Internet and share that answer using Jing in a recording of the Internet site and their explanation.
  • Ask students to review specific material and relate a particular finding to themselves personally in a recording using Jing.
  • Ask students to create a presentation and record that presentation using Jing.
  • Verbally annotate electronically submitted student work products, providing detailed and contextually clear feedback and guidance.

Jing may be found online at http://jingproject.com

2014 Update: I demonstrated Jing as part of the AEA Coffee Break Demonstration Series on February 18, 2010; the webinar recording is free for AEA members, and a handout on Jing is free for everyone!

The American Evaluation Association is celebrating Best of aea365 week. The contributions all this week are reposts of great aea365 blogs from our earlier years.

Hello, I am Kerry Bruce, Director of Results and Measurement at Pact. I'm currently based in Madagascar and support Pact programs in Africa, Asia, and Eastern Europe. In 2012 we started to roll out mobile technology in our programs, including evaluation. This third post in the series on getting started with mobile phone technology in evaluation focuses on training and on taking your data collection to the field.

Hot Tips:

  • When you are training people to use mobile phone technology for evaluation, here are some things that you’ll need to consider:
    • Are the people you are training familiar with mobile phones or will they need significant training and mentoring?  What type of phone will be easiest for them to use?
    • Make sure that people are VERY familiar with the survey instrument and VERY familiar with the phone operation before they get to the field.  Otherwise they spend the entire interview looking at the phone and never look at the person they are interviewing.
    • Make sure respondents understand that data are being entered on the phone, but that data are encrypted and will be stored confidentially.
  • When you go to the field to collect data using mobile technology:
    • Take along a few paper copies of the survey instrument, both as a back-up in case you run out of power or lose a phone and as a reference during questionnaire administration.
    • If a question is not working or needs to be changed, this can be done even while the survey is being conducted: a change is made on the central server, and data collectors are notified via SMS or phone to upload the newest version of the survey. This is best done at the end of the day, and it is very handy for dealing with troublesome questions that made it through pre-testing and piloting. (A rough sketch of this update flow follows below.)
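How this update flow works varies by platform and usually hides behind a sync button, but the moving parts are simple. Here is a hedged, hypothetical Python sketch, assuming a central server with a simple web API; the endpoint, field names, and form ID are all invented for illustration.

    import json
    import requests

    SERVER = "https://example.org/api"  # placeholder central server

    def refresh_form(form_id, local_version):
        """Download the latest form definition if the server has a newer one."""
        meta = requests.get(f"{SERVER}/forms/{form_id}/meta").json()
        if meta["version"] > local_version:
            form = requests.get(f"{SERVER}/forms/{form_id}").json()
            with open(f"{form_id}.json", "w") as f:
                json.dump(form, f)
            return meta["version"]  # device now carries the corrected question
        return local_version

    # Run at the end of the day, after the SMS heads-up:
    # current_version = refresh_form("household_survey", current_version)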

Lessons Learned:

  • Mobile technology will not improve your survey design or instrument.  It will improve the timeliness and accuracy of your data collection, but it won’t magically insert a missing variable.
  • The structure of field supervision changes with mobile technology and happens best from a central location with a strong internet connection.  Supervisors check the data as it comes in and can call data collectors directly with feedback while they are still in the field if there is an issue.

Rad Resources:

Online mobile technology training for a variety of uses is available from TechChange.

A great training resource is available from the CLEAR Initiative.

More information about using mobile phones for data collection is available from the BetterEvaluation page on Mobile Data Collection.


Hello, I am Kerry Bruce, the Director of Results and Measurement at Pact. I'm currently based in Madagascar and support Pact programs in Africa, Asia, and Eastern Europe. I am part of Pact's central technical team that provides monitoring and evaluation support to more than 20 country offices and more than 70 projects around the world. In 2012 we started to roll out mobile technology in our programs, including evaluation. This is the second post on mobile technology and focuses on choosing your platform. Check out my first post on this topic: Getting Started with Mobile Phones.

Hot Tips:

  • Look at a wide range of available platforms and ask yourself:
    • What is my budget for phones?  Some platforms work better than others with entry-level phones (as opposed to Android smartphones).
    • What is my budget for data collection?  Will my data collection recur frequently (ongoing evaluation) or is this a one-time event?  Each platform has a different pricing structure, and each lends itself to different types of data collection.
    • Platform operators will promise you the moon, but will their platform deliver?  Before you buy, test basics such as skip logic (illustrated in the sketch after this list), ease of set-up and use, and how data download and dashboards work.  Most platforms have a trial version that you can use, and some allow small data collection projects for free.
    • Will I need help to set up my survey, or do I have the skill set to set it up in house?  Some platforms offer survey set up and technical support (useful for complicated data collection exercises) and some are all do-it-yourself.
    • What language will the survey be in and can the platform support it?  This is especially important for non-Latin alphabets.
    • Get a reference.  All these platforms should be able to provide you with a reference from someone who has used them before and can tell you what is good and what needs work.
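For readers who haven't met "skip logic" before, the toy Python sketch below shows the behavior you are testing: a question is asked only when a condition on earlier answers holds. Real platforms configure this in their form designers rather than in code, and the questions and field names here are invented.

    # Toy two-question survey; field names are invented.
    QUESTIONS = [
        {"name": "has_phone", "text": "Do you own a mobile phone? (y/n)"},
        {"name": "phone_type", "text": "Is it a smartphone? (y/n)",
         "relevant": lambda answers: answers.get("has_phone") == "y"},
    ]

    def run_survey():
        answers = {}
        for q in QUESTIONS:
            relevant = q.get("relevant", lambda answers: True)
            if relevant(answers):  # skip logic: ask only when condition holds
                answers[q["name"]] = input(q["text"] + " ").strip().lower()
        return answers

    if __name__ == "__main__":
        print(run_survey())

A platform that handles a condition like this cleanly in its trial version is a good sign; one that stumbles here will stumble on a real instrument.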

Lesson Learned: Evaluate two or more platforms before you decide which one to use. 

  • Some have recurring or annual costs, others charge only for the data that you collect, and still others are free up to a certain volume of data collection.
  • Each platform has its strengths (and weaknesses) – you’ll need to understand what you need it to do and shop around until you find it.
  • Just because a platform cannot do something today does not mean it won't be able to do it tomorrow; check back and give feedback.  This technology is adapting rapidly.

Rad Resources: Here is a list of some of the mobile technology platforms that are commercially available today.

[Table: commercially available mobile data collection platforms]

*These are platforms I have used.


Hello!  We’re Michelle Landry and Judy Savageau from the University of Massachusetts Medical School’s Center for Health Policy and Research.

As Sean Allen Levin suggested in a recent aea365 post, organizing an evaluation project, or multiple projects, can be daunting. The details of timelines, responsibilities, and deliverables are easily lost if they live only in the project manager's head. As with any project, a structured, quality work environment benefits the entire team and its clients. To expand on Sean's blog, we'd like to share hot tips and lessons learned from our recent review of project management tools, which brought to light not only the vast sea of options but also what we actually need and want from such a tool.

Lesson Learned: Work backwards to determine your specific needs. What outputs are most useful? What information is needed to report regularly? How large is the project or how many projects need to be captured in the database?

Lesson Learned: Determine if you have a budget or spending limit to support a new tool. There are affordable options that aren’t readily obvious. However, if you have the budget, there are options to satisfy your every whim.

Hot Tip: For no- or low-cost options, look to Microsoft Office. It's affordable because most users already have the software: MS Excel, MS Access, and MS Project offer locally installed options that range in user-friendliness and capability. A number of software vendors sell robust project management tools, but they come with a price tag. We reviewed tools used by our university colleagues, e.g., Quickbase (which we are now piloting) and Journyx. Many others are available, and websites offer comparisons among the applications.

Lesson Learned: Review your needs against the software’s options. Many websites allow testing project management tools through a virtual tour. Take advantage of this; it’s best to see how user-friendly the software is before purchasing.

Lesson Learned: Each application has budgetary implications, so if you must be budget conscious, check into the vendor's software offerings. Is it a one-time cost, or a license needing annual renewal? Does it require a monthly user fee?

Hot Tip: Customize, customize, customize… Most software packages are customizable. Do not take it “out of the box” and assume that’s all you get. Many vendors offer customization options to meet your needs. Customizing can take a few rounds as test-driving one change often uncovers additional changes. Customizing the software may reduce frustration and better meet your needs. See if there’s someone “in-house” who can customize your software before paying the vendor – a great budget saver!


Hello, we are Linda Cabral and Laura Sefton from the Center for Health Policy and Research at UMass Medical School. We often collect qualitative data from interviews and focus groups. One challenge we frequently face is how to quickly and efficiently transcribe audio data. We have experimented using voice recognition software (VRS), and we’d like to share our approach.

You will need headphones, a microphone (stand-alone or attached to a headset), and a computer with audio playback software and VRS installed. We use Dragon Naturally Speaking Premium Version 11.5; however, other VRS products are available. Audio playback software lets you control the playback speed, so you can slow down, pause, fast-forward, and rewind as needed.

Open the audio file in the playback software and open a new document in the VRS. While listening to the audio via the headphones, repeat what you hear into the microphone. During this step, you can format the document to indicate who is speaking and to add punctuation. Because VRS works best when trained to understand a single voice, a designated team member should repeat all spoken content, regardless of how many voices are in the audio file.

This process will generate a document in the VRS that can be saved to your computer as a Word file. As a final review, read through the Word file while listening to the audio file and make needed corrections. This could be done by another member of the project team as a double check of the document’s accuracy.

Hot Tips:

  • Spend time training the VRS to recognize your voice. A few practice sessions with the software may be needed where you can read dummy data into the software in order for it to learn your voice. This will improve the transcription quality, minimizing the time spent editing.
  • Train the VRS to recognize project-specific acronyms or terminology prior to starting transcription.

Lessons Learned:

  • Often, financial resources for evaluation projects are limited. In an effort to keep the transcription process in-house, our administrative staff transcribed the audio files. By using the VRS with a project team member familiar with the data as the designated recorder, we have saved time and gained efficiency.
  • No transcription has yet captured 100% of the content accurately on the first pass. Therefore, build in time to listen to the recording and make manual edits.

Rad Resources:

These resources may be helpful as you explore whether VRS is right for you.

  • A VRS product review from ConsumerSearch: "In reviews, it's generally Dragon vs. Dragon"


Hi, I am Kerry Bruce, Director of Results and Measurement at Pact. I am part of Pact's central technical team that provides monitoring and evaluation support to more than 20 country offices and more than 70 projects around the world. In 2012 we started to roll out mobile technology in our programs, beginning with mobile phones for baseline and endline data collection.

Hot tips:

  • Mobile technology has advanced significantly since you likely last considered using it, and now is the time to invest in learning about it.  Many of the early bugs have been worked out, and the commercially available platforms make collecting data via mobile phone or tablet quite easy.
  • New platforms are easy to use, there are many to choose from, and most include built-in dashboards that help you review and visualize your data.
  • A careful assessment of network coverage, power, and power back-up should be done before you decide on a type of phone and platform.  While you don't necessarily need a signal to collect data on mobile phones (you can collect data offline), you will need a phone with long battery life! Many phones are now GPS-enabled; consider these if you would like to collect GPS waypoints and conduct geospatial analyses (see the sketch after this list).
  • Understand the skills and competencies of your data collectors.  Will they be people who are familiar with mobile phones or will they need significant training and mentoring?  What type of phone will be easiest for them to use?
  • If you are using mobile phones to collect data for a baseline survey, for example, will you have a follow-on use for the phones? Consider what type of phone will be most useful for future activities so that your initial purchase yields a higher return on investment.
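Once GPS waypoints ride along with each record, even simple geospatial checks take only a few lines of code. As a minimal illustration, here is the haversine formula for the great-circle distance between two waypoints, which could, for example, flag interviews recorded suspiciously close together; the coordinates below are just an example.

    from math import asin, cos, radians, sin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2):
        """Distance between two (latitude, longitude) points, in kilometers."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = (sin((lat2 - lat1) / 2) ** 2
             + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

    # Antananarivo to Toamasina, Madagascar: about 216 km as the crow flies.
    print(f"{haversine_km(-18.8792, 47.5079, -18.1492, 49.4023):.0f} km")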

Lessons Learned:

  • A careful assessment of your data collection needs, logistical issues, and possible future projects is necessary before you start utilizing mobile technology.
  • Because not everyone sees the benefits of mobile technology, a basic overview of the advantages of this innovation is helpful to get your co-workers on board.

Rad Resources:

  • Online mobile technology training for a variety of uses is available for a fee from TechChange.
  • There is a free online mobile data collection selection assistant at NOMAD.

*Thank you to Mobenzi Researcher and DataWinners (DataWinners free data collection App for Android devices built using Open Data Kit tools) for the use of their images in this post.


I am Tarek Azzam, assistant professor at Claremont Graduate University and associate director of the Claremont Evaluation Center.

Today I want to talk about crowdsourcing and how it can potentially be used in evaluation practice. Generally speaking, crowdsourcing is the process of using the power of many individuals (i.e., the crowd) to accomplish specific tasks. The idea has been around for a long time (e.g., the creation of the Oxford English Dictionary), but recent developments in technology have made the power of the crowd much easier to access.

I will focus on just one crowdsourcing website, because Amazon's Mechanical Turk (MTurk) is the most widely known, used, and studied crowdsourcing site. The site facilitates interactions between "requesters" and "workers" (see figures below). A requester describes a task (e.g., please complete a survey), sets the payment and allotted time for completing it, and determines the qualifications needed to finish it. This information is then posted on the MTurk website, and interested individuals who qualify can complete the task for the promised payment.

[Screenshot clipped from https://www.mturk.com/mturk/welcome]

This facilitated marketplace has some really interesting implications for evaluation practice. For example, evaluators can use MTurk to establish the validity and reliability of survey instruments before giving them to intended participants. By posting a survey on MTurk and collecting responses from individuals with background characteristics similar to those of your intended participants, an evaluator can establish the reliability of a measure, get feedback on the items, and, if needed, translate the items into another language. All of this can be accomplished in a matter of days. Personally, I've been able to collect 500 responses to a 15-minute survey, at a cost of 55 cents per response, in less than three days.
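As one concrete example of the reliability work this enables, here is Cronbach's alpha (a standard internal-consistency statistic for multi-item scales) computed over a made-up pilot data set; the numbers are invented for illustration.

    import numpy as np

    def cronbach_alpha(items):
        """items: a respondents-by-items matrix of scale scores."""
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_variances / total_variance)

    # Toy pilot data: 6 respondents answering a 4-item Likert (1-5) scale.
    scores = np.array([[4, 5, 4, 5], [2, 2, 3, 2], [3, 3, 3, 4],
                       [5, 4, 5, 5], [1, 2, 1, 2], [3, 4, 3, 3]])
    print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")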

Hot Tip: When selecting eligibility criteria for MTurk participants, choose those with approval ratings of 95% or higher.
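For the programmatically inclined: posting a task like the one described, with the approval-rating filter from the hot tip, looks roughly like the sketch below using the AWS boto3 MTurk client (which postdates this post; the original API worked similarly). The survey URL, title, and pricing are placeholders.

    import boto3

    mturk = boto3.client("mturk", region_name="us-east-1")

    # The hot tip in code: admit only workers with a 95%+ approval rating.
    # "000000000000000000L0" is MTurk's built-in "percent assignments
    # approved" qualification type.
    approval_95 = [{
        "QualificationTypeId": "000000000000000000L0",
        "Comparator": "GreaterThanOrEqualTo",
        "IntegerValues": [95],
    }]

    # An ExternalQuestion points workers to a survey hosted elsewhere;
    # the URL is a placeholder.
    question_xml = """<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
      <ExternalURL>https://example.org/pilot-survey</ExternalURL>
      <FrameHeight>600</FrameHeight>
    </ExternalQuestion>"""

    hit = mturk.create_hit(
        Title="15-minute research survey",
        Description="Pilot test of an evaluation instrument",
        Reward="0.55",                    # in dollars, passed as a string
        MaxAssignments=500,               # number of responses sought
        LifetimeInSeconds=3 * 24 * 3600,  # keep the task posted for 3 days
        AssignmentDurationInSeconds=30 * 60,
        Question=question_xml,
        QualificationRequirements=approval_95,
    )
    print("HIT posted:", hit["HIT"]["HITId"])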

There are other uses that I am currently experimenting with. For example:

  • Can MTurk respondents be used to create a matched comparison group in evaluation studies?
  • Is it possible to use MTurk respondents in a matched group pre-post design?
  • Is it possible to use MTurk to help with the analysis and coding of qualitative data?

These questions are still open, but I will keep you updated as we progress in exploring the limits of crowdsourcing in evaluation practice.

[Screenshot clipped from https://requester.mturk.com/create/projects/new]

Hot Tip: I will be presenting a Coffee Break Demonstration (free for American Evaluation Association (AEA) members) on Crowdsourcing on Thursday April 18, 2013 from 2:00-2:20pm EDT. Hope to see you there.
