AEA365 | A Tip-a-Day by and for Evaluators

TAG | Data Collection

Hi, my name is Rhodri Dierst-Davies, and I am an evaluation specialist with Deloitte Consulting LLP, working out of our federal practice. Federal programs are often, by design, implemented differently across states and municipalities. These programs rely on input from local stakeholders and policy makers to ensure they are tailored to the needs of the communities they serve. While this can help maximize benefits to beneficiaries, it creates challenges for federal evaluators trying to demonstrate generalizable benefits across an entire system. With an increased emphasis on evaluations that can provide both national and local benefits, I will explore potential solutions to common national evaluation challenges.

Lesson Learned: Generate common goals and objectives that span all programmatic aspects. That way, individual jurisdictions can create tailored evaluation frameworks that focus on what is relevant to them.

Hot Tip: Consider offering capacity-building grants focused directly on evaluation. Such grants are effective at helping individual jurisdictions build their evaluation infrastructure, since some requirements may otherwise be difficult to implement.

Lesson Learned: Offer a data warehouse that contains a set of easily accessible, common data collection measures. There will always be a set of core variables that must be collected (e.g., socio-demographic characteristics). But offering a set of validated measures for other factors important to local jurisdictions (e.g., needs assessments, benefits utilization, mental health, stigma), from which a jurisdiction may pick to measure local benefits, can help facilitate analysis of individual programmatic elements that are not uniformly implemented.

Hot Tip: Consider creating a secure, web-based data collection portal that providers can easily use to collect and store evaluation data. This may reduce the burden on local jurisdictions, which will not have to build their own systems. It may also reduce jurisdictions' reporting burdens, since data collection will be semi-automated.

These ideas may help streamline national program evaluations in multiple ways. For an individual jurisdiction, they can reduce data collection burdens while preserving implementation flexibility. At the national level, they can streamline data collection and reporting methods. Taken together, these suggestions can support both timely reporting and data integrity, both essential to successful federal evaluation.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings from Toyin Owolabi at the Women’s Health Action Research Center (WHARC) in Nigeria and Susan Igras at Georgetown University’s Institute for Reproductive Health (IRH). Last year, we joined together on a cross-country project to build capacity in designing and evaluating programs for younger adolescents.

Programming for younger adolescents, and related program evaluation, is nascent in the international arena. Nigeria is a leader in Africa in adolescent health programming and research but, like many countries, has not yet focused much on the developmental needs and concerns of 10- to 14-year-olds, who are often lumped into all-adolescent program efforts. Because younger adolescents' cognitive skills are still developing, traditional focus group discussions and interviews do not work well. Games and activity-based data collection techniques work much better for eliciting attitudes, ideas, and opinions.

To go beyond knowledge and assess more intangible program outcomes, such as shifts in gender roles, IRH has been drawing participatory methodologies from rapid rural appraisal, advertising, and other disciplines and adapting them for evaluation.

Staff from WHARC, a well-respected research and advocacy organization, were oriented to and used many of these methodologies in a first-ever needs assessment with younger adolescents in Imo State. The assessment provided data to advocate for age-segmented program approaches for adolescents and to inform program design. Some of the things we learned:

HOT TIPS:

Make data collection periods brief to suit short attention spans. Build in recess periods (and snacks!) if data collection takes longer than 20-30 minutes.

Challenge your comfort level in survey development. Standard adolescent questions may not apply. Younger adolescents’ sexual and reproductive health issues generally revolve around puberty, self-efficacy, emerging fertility, gender formation, and body image, and NOT pregnancy and HIV prevention.

Youth engagement is important, and older adolescents may be especially strong contributors to evaluation design. With recent recall of the puberty years, they also bring more abstract reasoning skills than younger adolescents.

COOL TRICK:

“Smile like you did when you were 13 years old!” This opened one of our meeting sessions and startled quite a few participants. It is really important to help adults get into the ‘younger adolescent zone’ before beginning to think about evaluation.

RAD RESOURCES:

This 2013 article by Rebecka Lundgren and colleagues provides a nicely described mixed-method evaluation of a gender equity program: Whose turn to do the dishes? Transforming gender attitudes and behaviours among very young adolescents in Nepal.

The Population Council is revising its seminal 2006 publication, Investing when it counts: Generating the evidence base for policies and programmes for very young adolescents. A guide and toolkit. Available in late 2015, the revision includes evaluation and research toolkit references from various disciplines.

The American Evaluation Association is celebrating Youth Focused Evaluation (YFE) TIG Week with our colleagues in the YFE AEA Topical Interest Group. The contributions all this week to aea365 come from our YFE TIG members.

 


Our names are Jessica Manta-Meyer, Jocelyn Atkins and Saili Willis and we are evaluators at Public Profit, an evaluation firm with a special focus on out-of-school time programs for youth.

We usually evaluate networks of after-school programs as a whole (some of which serve more than 20,000 youth; at that scale, a survey is indeed one of the best approaches). However, we particularly enjoy opportunities to build youth programs' capacity to solicit feedback in creative ways that align with best youth development practices.

Here are some of the methods that have been most popular with these programs:

Cool Trick – Journals: At the start of a program, provide journals for all youth and ask them to write something related to the program goals. Is one of the program's goals to develop leadership skills? Ask the youth to respond to this question: "In what ways are you a leader?" Is one of the goals to increase enjoyment of reading? "What do you like about reading?" Then, at the end of the program, youth can read what they wrote on the first day and respond to "How would you answer that question differently now?" or some other question that gets them to reflect on how they have changed in the program.

Cool Trick – Candy Surveys: Ask students to answer survey questions by putting certain colors of candy in a cup, then tally the candy colors to get your responses. Have the youth tally the results themselves. They can even make a bar chart on chart paper by taping the actual candy to the paper. The youth can then eat the candy after they've tallied the results.

Hot Tip – use wrapped candy! Starburst works well and is what one summer program used.

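If a program wants to turn the cup of candy into numbers, the tally is simple to script. Here is a minimal sketch in Python; the colors, answer choices, and counts are made-up illustrations, not data from any program:

```python
from collections import Counter

# One list entry per candy dropped in the cup for "Do you enjoy reading?"
cup = ["red", "red", "yellow", "pink", "red", "yellow"]

# Made-up mapping of candy colors to answer choices.
answer_key = {"red": "A lot", "yellow": "A little", "pink": "Not really"}

# Translate colors to answers and count them up.
tally = Counter(answer_key[color] for color in cup)
for answer, count in tally.most_common():
    print(f"{answer}: {count}")  # e.g., "A lot: 3"
```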

Cool Trick – 4 Corners Activity: Youth leadership programs do this all the time. They ask youth to “take a stand” next to signs that are marked Strongly Agree, Agree, Disagree or Strongly Disagree in response to a statement like “youth should be able to vote at age 16.” Once the youth stand next to one of the signs, the group can talk out their different perspectives. Programs can also use this to collect both quantitative (how many stand where) and qualitative (what they say about why they are standing where they are) data.

Hot Tip: For more Creative Ways, come to our Skill-Building Workshop Saturday at 8am. Yes, it’s early, but we promise to have you moving, interacting and creating. Plus, there will be candy.


 


Hi, I'm Audrey Rorrer, and I'm an evaluator for the Center for Education Innovation in the College of Computing and Informatics at the University of North Carolina at Charlotte, where several projects I evaluate operate at multiple locations across the country. Multisite evaluations are loaded with challenges, such as data collection integrity, evaluation training for local project leaders, and the cost of resources. My go-to resource has become Google, because it's cost-effective in terms of both efficiency and budget (it's free). I've used it as both a data collection tool and a resource dissemination tool.

Lessons Learned:

Data Collection and Storage:

  • Google Forms works like a survey reporting tool with a spreadsheet of data behind it, for ease in collecting and analyzing project information (see the sketch after this list).
  • Google Forms can be sent as an email so that the recipients can respond to items directly within the email.
  • Google documents, spreadsheets, and forms can be shared with any collaborators, whether or not they have a Gmail account.
  • Google Drive is a convenient storage source in ‘the cloud.’
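Because a form's responses land in that behind-the-scenes spreadsheet, they can be pulled straight into an analysis script through the sheet's CSV export link. Here is a minimal sketch in Python with pandas, assuming the response sheet has been shared so the link is accessible; the spreadsheet ID and column below are placeholders, not a real project's:

```python
import pandas as pd

SHEET_ID = "YOUR_SPREADSHEET_ID"  # placeholder; copy from the sheet's URL
CSV_URL = f"https://docs.google.com/spreadsheets/d/{SHEET_ID}/export?format=csv"

# Each row is one form submission; Google Forms adds a Timestamp column.
responses = pd.read_csv(CSV_URL)
print(responses["Timestamp"].min())       # first response received
print(responses.describe(include="all"))  # quick profile of every item
```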

Resource Dissemination:

  • Google Sites provides easy-to-use website templates that enable fast website building for people without web development skills.
  • Google Groups is a way to create membership wikis, for project management and online collaboration.

Rad Resource: Go to www.google.com and search for Google's products page, then check out the business & office options and the social options.

For a demonstration of how I've used Google tools in multisite evaluation, join the AEA Coffee Break Webinar on February 27, "Doing it virtually: Online tools for managing multisite evaluation." You can register here.


 


My name is Caren Oberg, and I am the Principal and Owner of Oberg Research. I am a proud late adopter. Proof? I still use a paper calendar and have Moleskine notebooks dating back years. But I have joyfully embraced tablet applications for data collection. The applications below, among many others, have made the process cheaper, greener, less prone to human error, and more innovative.

Rad Resources: All resources below work on iPads and Android tablets, except StoryKit, which is iPad only.

TrackNTime is designed for tracking participant interactions or behaviors in a learning environment.

QuickTap Survey is a survey platform designed specifically for tablets. It is easy to read, pretty to look at, and you can collect data without an internet connection.

Sticky Notes come pre-installed on most tablets. Participants can move sticky notes around the screen, grouping and regrouping, based on questions you ask.

StoryKit allows your participants to recreate their experiences through images and text by using an electronic storybook.

Hot Tips: Consider the type of data you are trying to collect. Most tablet apps I have come across do one type of data collection extremely well but are not yet built for multi-method data collection. That said, you can easily switch back and forth between two applications and link the data manually by assigning a single ID number to both.
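Here is a minimal sketch of that manual linking step, in Python with pandas; the export file names and the participant_id column are hypothetical stand-ins for whatever your two apps actually produce:

```python
import pandas as pd

# Exports from two single-purpose apps, each carrying the same manually
# assigned participant ID (file and column names are hypothetical).
observations = pd.read_csv("trackntime_export.csv")  # e.g., behavior counts
survey = pd.read_csv("quicktap_export.csv")          # e.g., survey answers

# One merge on the shared ID reunites the two data collections.
linked = observations.merge(survey, on="participant_id", how="inner")
print(linked.head())
```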

Apps eliminate data entry. They do not eliminate data cleaning, nor do they do advanced analyses. Yet.

Lessons Learned: The number of applications developed specifically for evaluators is small. Learning to manipulate applications to fit my needs has been very important, as has letting go of an app when it is just not going to work for me. Knowledge sharing is also important: I learned about QuickTap Survey and StoryKit from my colleague Jennifer Borland of Rockman et al, who in turn learned about StoryKit at Evaluation 2013.

In that vein, I will be talking about all four resources in an AEA Coffee Break webinar on February 20, 2014. Hope you can join.



 


We are Viengsavanh Sanaphane and Katie Moore from Catholic Relief Services.  We are working on an Inclusive Education (IE) project in Southeast Asia.

IE efforts there are currently expanding. Experienced partners that have worked on IE in the local context for years planned to share qualitative data about their project with stakeholders new to the initiative; qualitative data is defined here as previous project stakeholders' perceptions of the project's successes and challenges, and of how they overcame those challenges during implementation.

Our question: How might experienced IE stakeholders share their project's qualitative data with incoming stakeholders in a dynamic and engaging way? Rather than using PowerPoint for data sharing and traditional surveys for data collection, we conceived of having previous stakeholders share their qualitative data through a play, and having new stakeholders record qualitative data in reflection journals.

At the data sharing event, experienced stakeholders presented their qualitative data in the form of a play that they wrote and rehearsed for the explicit purpose of presenting data dynamically. This was followed by a Q&A session and one-on-one interviews between former and current project implementers. Stakeholders new to IE used their journals to record the qualitative data from the play and discussions that they deemed relevant to implementing IE in their own contexts. New stakeholders asked to keep their journals as an ongoing reference while strategically planning their own IE projects.

While not yet trialed, the qualitative data that new project stakeholders recorded as relevant could be applied in their own projects, revisited by rereading the journals, and analyzed at the project's end to gauge the impact of data sharing events between two similar contexts implementing the same type of project. Further, stakeholders may refer back to their own work, lending itself to an empowerment and utilization approach whereby stakeholders use locally produced data for local decision-making processes and other personal and professional needs.

Rad Resource: Use free Microsoft Office 2013 templates to create booklets/journals that let stakeholders record qualitative data presented in a refreshing way, lending itself to an empowerment M&E approach.

Lesson Learned: For an empowerment approach, have stakeholders design the questions included in the journals so they are the "owners" of their own processes and work. Facilitate a practice run of the booklet with a small group to identify aspects that need to be modified before using it with a large group.

Hot Tip: Given stakeholders' informed consent, data collectors or project implementers may photocopy the journals or photograph them with smartphones, tablets, and/or cameras as a way to record the qualitative data for later analysis.



Hello! We are Stacy Johnson and Cami Connell from the Improve Group. At Evaluation 2013, we had the opportunity to present on our experiences using a unique mixed methods approach to collecting data.

Your data collection strategy has the potential to seriously impact your evaluation. You might ask yourself questions like: How do we make sure we are getting the whole story? What if one method isn't appropriate for gathering all the information we need from a single source? How do we engage people in data collection in a way that helps them understand and want to use the findings? One way to address these questions is to treat each stage of data collection as a layered process, directly connecting quantitative and qualitative methods so that they complement each other and build a more in-depth and accurate story.

How is this different from how we traditionally think about data collection? We still access the same key sources to answer our evaluation questions, but the design includes a feedback loop that allows the evaluator to integrate initial findings into the data collection process as they emerge. This often means intentionally including additional interviews or focus groups after an initial stage of data collection, to present data back to stakeholders and ask for feedback and relevant background about emerging themes.

Lesson Learned: Provide an orientation to data. Not everyone looks at data every day! Walking stakeholders through data increases the chances that they will want to use it to inform decisions.

Hot Tip: Create easy-to-interpret graphics to make data more accessible.

Lesson Learned: Make it a mutually beneficial process. In addition to gathering important information for the evaluation, it is equally important to make sure people feel like they are heard and that sharing their experiences can positively impact their work.

Hot Tip: Facilitate discussion about how data applies in day-to-day work.

Hot Tip: Encourage problem solving and planning for how data can inform changes or improvements.

Lesson Learned: Understand the stakes and relationships. Depending on the nature of the relationships and the potential consequences of the evaluation, there is a risk of people painting an overly positive or overly negative picture. In addition, when presenting data from one source to another, pay careful attention to masking the identity of the original source, especially when there are easily identifiable groups or existing adversarial relationships.

Hot Tip: Include people with different perspectives and roles in the data collection process to uncover any underlying dynamics.

Hot Tip: Be aware of any adversarial or contentious relationships; depending on those relationships, this approach is not always appropriate.

Hot Tip: Mask the original source of data as appropriate.



Hello, I am Kerry Bruce, Director of Results and Measurement at Pact. I'm currently based in Madagascar and support Pact programs in Africa, Asia, and Eastern Europe. In 2012 we started to roll out mobile technology in our programs, including for evaluation. This third post in the series on getting started with mobile phone technology in evaluation focuses on training and taking your data collection to the field.

Hot Tips:

  • When you are training people to use mobile phone technology for evaluation, here are some things that you’ll need to consider:
    • Are the people you are training familiar with mobile phones or will they need significant training and mentoring?  What type of phone will be easiest for them to use?
    • Make sure that people are VERY familiar with the survey instrument and VERY familiar with the phone operation before they get to the field.  Otherwise they spend the entire interview looking at the phone and never look at the person they are interviewing.
    • Make sure respondents understand that data are being entered on the phone, but that data are encrypted and will be stored confidentially.
  • When you go to the field to collect data using mobile technology:
    • Take along a few paper copies of the survey instrument, both as a back-up in case you run out of power or lose the phone and to refer to during questionnaire administration.
    • If a question is not working or needs to be changed, this can be done even while the survey is being conducted. A change is made on the central server, and data collectors can be notified via SMS or phone to upload the newest version of the survey. This is best done at the end of the day, and it is very handy for dealing with troublesome questions that made it through pre-testing and piloting.

Lessons Learned:

  • Mobile technology will not improve your survey design or instrument.  It will improve the timeliness and accuracy of your data collection, but it won’t magically insert a missing variable.
  • The structure of field supervision changes with mobile technology, and supervision happens best from a central location with a strong internet connection. Supervisors check the data as it comes in (see the sketch after this list) and can call data collectors directly with feedback while they are still in the field if there is an issue.
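Here is a minimal sketch of the kind of central check a supervisor might run on a day's submissions, in Python with pandas; the file name, columns, and thresholds are hypothetical, not any particular platform's export format:

```python
import pandas as pd

# Hypothetical daily export of submissions from the mobile platform.
incoming = pd.read_csv("todays_submissions.csv")

# Flag interviews that look rushed or incomplete so the supervisor can
# phone the data collector while the team is still in the field.
too_short = incoming[incoming["duration_minutes"] < 10]
missing_consent = incoming[incoming["consent"].isna()]

print(f"{len(too_short)} interviews under 10 minutes")
print(f"{len(missing_consent)} records missing a consent entry")
```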

Rad Resources:

Online mobile technology training for a variety of uses is available from TechChange.

A great training resource is available from the CLEAR Initiative.

More information about using mobile phones for data collection is available from the BetterEvaluation page on Mobile Data Collection.



Hi, I am Kerry Bruce, Director of Results and Measurement at Pact. I am part of Pact's central technical team, which provides monitoring and evaluation support to more than 20 country offices and more than 70 projects around the world. In 2012 we started to roll out mobile technology in our programs, beginning with mobile phones for baseline and endline data collection.

Hot Tips:

  • Mobile technology has advanced significantly since the last time you likely considered using it, and now is the time to invest in learning about it. Many of the early bugs have been worked out, and commercially available platforms make collecting data via mobile phone or tablet quite easy.
  • New platforms are easy to use, there are many to choose from, and most include built-in dashboards that help you review and visualize your data.
  • Carefully assess network coverage, power, and power back-up before you decide on a type of phone and platform. While you don't necessarily need a signal to collect data on mobile phones (you can collect data offline), you will need a phone with a long battery life! Many phones are now GPS-enabled; consider these if you would like to collect GPS waypoints and conduct geospatial analyses.
  • Understand the skills and competencies of your data collectors.  Will they be people who are familiar with mobile phones or will they need significant training and mentoring?  What type of phone will be easiest for them to use?
  • If you are using mobile phones to collect data for a baseline survey, for example, will you have a follow-on use for the phones? Consider what type of phone will be most useful for future activities so that you get a higher return on your initial investment.

Lessons Learned:

  • A careful assessment of your data collection needs, logistical issues, and possible future projects is necessary before you start utilizing mobile technology.
  • Because not everyone sees the benefits of mobile technology, a basic overview of the advantages of this innovation is helpful to get your co-workers on board.

Rad Resources:

  • Online mobile technology training for a variety of uses is available for a fee from TechChange.
  • There is a free online mobile data collection selection assistant at NOMAD.

*Thank you to Mobenzi Researcher and DataWinners (a free data collection app for Android devices built using Open Data Kit tools) for the use of their images in this post.



Hello, I am Edith Gozali-Lee, a research scientist at Wilder Research. I work primarily on research and evaluation projects related to education. I am currently working on a multi-site, longitudinal study of an early childhood initiative. The study includes three cohorts of children in school-based preschool programs in ten schools, five cohorts of children in community-based child care homes and centers, and comparison children with and without prior preschool experience. The study follows children from preschool to third grade. That's a lot to track, making good data collection critical from the start.

Hot Tips:

Here are a few coding tips that help ensure good data collection tracking:

  • Anticipate the different groups ahead of time and create intuitive codes to make data tracking and analyses easier in the following years
  • Use the categories or codes that the schools themselves use, to make analysis easier when you merge the data you collect with other student data collected by schools (demographic data and student outcomes); see the sketch after this list
  • Label all instruments (survey and assessment forms) with these codes prior to data collection to reduce post-collection coding work and data entry errors
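To make the first two tips concrete, here is a minimal sketch in Python with pandas of what intuitive codes can look like and how they pay off at merge time; the coding scheme and column names are hypothetical illustrations, not the study's actual design:

```python
import pandas as pd

def make_id(cohort: int, site: str, seq: int) -> str:
    """Encode cohort, site type (S = school, C = community), and child number."""
    return f"C{cohort}-{site}-{seq:04d}"

# Evaluation data keyed by the intuitive ID (values are made up).
tracking = pd.DataFrame({
    "student_id": [make_id(1, "S", 1), make_id(1, "C", 2)],
    "assessment_score": [87, 92],
})

# School records using the same ID make the yearly merge trivial.
school = pd.DataFrame({
    "student_id": [make_id(1, "S", 1), make_id(1, "C", 2)],
    "grade": ["K", "K"],
})

merged = tracking.merge(school, on="student_id", how="left")
print(merged)
```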

Lesson Learned:

It is helpful to hold regular project debriefs to reflect on what works well and what does not. This will make the evaluation process go more smoothly and quickly the next time around.

Rad Resources:

For practical research-based information, visit CYFERnet, the Children, Youth and Families Education and Research Network.

Resources for research in early childhood:

We are looking forward to seeing you in Minnesota at the AEA conference this October. Results of this study (along with other Wilder Research projects and studies) will be presented during a poster session: Academic Outcomes of Children Participating in Project Early Kindergarten Longitudinal Study.

The American Evaluation Association is celebrating with our colleagues from Wilder Research this week. Wilder is a leading research and evaluation firm based in St. Paul, MN, a twin city for AEA's Annual Conference, Evaluation 2012.

