AEA365 | A Tip-a-Day by and for Evaluators

Category: Integrating Technology into Evaluation

Hello aea365ers! I’m Susan Kistler, Executive Director Emeritus of the American Evaluation Association, professional trainer and editor, and all-around gregarious gal. Email me at susan@thesmarterone.com if you wish to get in touch.

Rad Resource – Padlet: The last time I wrote about Padlet for aea365, exactly two years ago on September 12, 2012, it was still called Wallwisher. One name change, two years, and a number of upgrades later, this web-based virtual bulletin board application is worth a fresh look.

Padlet is extremely easy to set up – it takes under 10 seconds and can be done with or without an account; however, I highly recommend that you sign up for a free account to manage multiple bulletin boards and manipulate contributions.

Padlet is even easier to use: just click on a bulletin board and add a note. You can add to your own boards, or to other boards for which you have a link. I’ve set up two boards for you to try.

Hot Tip – Brainstorming: Use Padlet to brainstorm ideas and get input from multiple sources, all anonymously. Anonymity is the keyword here – the extreme ease of use (no sign-in!) is balanced by the fact that contributions only have names attached if the contributors choose to add them.

Hot Tip – Backchannel: Increasingly, facilitators are leveraging backchannels during courses and workshops as avenues for attendees to discuss and raise questions. Because Padlet is a platform- and device-independent application accessed through the browser, and does not require a login to contribute, it can make an excellent backchannel tool.

The uses are almost endless – anywhere you might use sticky notes, Padlet can serve as a virtual alternative.

***IF YOU ARE READING THIS POST IN EMAIL, PLEASE CLICK BACK TO THE AEA365 WEBSITE TO TRY IT OUT!***

This board illustrates the linen background (there are 15+ backgrounds from which to choose) with contributions added wherever the contributor placed them (the owner may then move them). Just click to give it a try. Please.

Created with Padlet

This board illustrates the wood background with contributions organized as tiles (a new option).

Created with Padlet

The board appears small when embedded on aea365; go here to see the same board in full-page view.

Hot Tip – Multimedia: Padlet can accommodate pictures, links, text, files, and video (when hosted elsewhere).

Hot Tip – Export: A major improvement to Padlet’s functionality has been the addition of the capacity to export the contributions to Excel for analysis, sharing, etc.
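If you want to go a step further with an exported board, the contributions can be read straight into Python for a quick tally. Below is a minimal sketch, assuming the export was saved as a spreadsheet; the file name, the "Body" column, and the keywords are placeholders, so inspect your own export and adjust.

# Minimal sketch: tallying brainstorming contributions from a Padlet export.
# "padlet_export.xlsx" and the "Body" column are assumed names, not Padlet's
# guaranteed output -- print df.columns first and adjust.
import pandas as pd
df = pd.read_excel("padlet_export.xlsx")
print(df.columns.tolist())                 # confirm the real column names
notes = df["Body"].dropna().str.strip()    # assumed column holding note text
print(f"{len(notes)} contributions collected")
# Rough theme count: how many notes mention each keyword of interest
for keyword in ["survey", "interview", "report"]:
    hits = notes.str.contains(keyword, case=False).sum()
    print(f"  mentions of '{keyword}': {hits}")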

Rad Resource – Training: I’ll be offering an eStudy online workshop in October on collaborative and participatory instrument development. We’ll use Padlet as an avenue for stakeholder input, so you can see it in action. Learn more here.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

I am Susan Kistler, contributing editor at TheSmarterOne.com (we finally launched this month – yay!) and former Executive Director of the American Evaluation Association. aea365 contributors have written before about producing online reports (for example here and here), wherein we learned, among other things, that evaluators are still feeling their way in terms of online reporting. The cost of entry (monetary, time, learning curve, etc.) and the dearth of examples can make the possibility of reporting online seem daunting. Today’s tool can help.

Rad Resource – ThingLink: ThingLink is an image interaction tool that allows you to add small icons with rich media tags to images, which may then be viewed on the ThingLink website or embedded in your own website or blog. I wrote a complete review of ThingLink here and I encourage you to take a look to understand the platform more fully.

Lessons Learned: Using ThingLink, it takes 3 to 10 minutes to add a set of rich media tags to an image, assuming that you have the image and the content for the tags already identified.


STOP HERE! If you are reading this via email PLEASE click back to the aea365 site (just click on the post title in your email). ThingLink is worth seeing in action, but it is a web-based application and works only online. In email, you likely can see the images below, but you won’t be able to interact with them.


Lessons Learned – ThingLink for Graphs and Charts: ThingLink can be used to annotate graphs, such as the one below that I made from data from a US Department of Education report. In this example, the “picture” being annotated is an image of a graph. The ThingLink rollover annotations that I added provide detail on the graph’s categories, scale options, and sourcing information.
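If you build the underlying graph yourself, the image you upload to ThingLink can come straight out of your analysis script. Here is a minimal sketch of saving a chart as a PNG for annotation; the categories, values, and axis label are placeholders, not the actual Department of Education figures.

# Minimal sketch: saving a chart as a PNG that can be uploaded to ThingLink
# and annotated. All labels and values below are placeholders.
import matplotlib.pyplot as plt
categories = ["Category A", "Category B", "Category C"]   # placeholder labels
values = [42, 57, 31]                                      # placeholder values
fig, ax = plt.subplots(figsize=(8, 5))
ax.bar(categories, values)
ax.set_ylabel("Percent of respondents")                    # assumed unit
ax.set_title("Example chart for ThingLink annotation")
fig.savefig("chart_for_thinglink.png", dpi=150, bbox_inches="tight")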

Hot Tip: The positioning of ThingLink rollover icons changes slightly among different browsers and platforms. Try to place icons where there is open space around them, and then check how they look.

Lessons Learned – ThingLink for Qualitative Data Gathering and Reporting: ThingLink can be used to annotate photographs. The settings may be configured so that anyone can edit a photo – evaluators could share an image and ask stakeholders to provide written or oral annotations. Alternatively, evaluators may create an annotated photo as part of a report, integrating the authentic voice of stakeholders through oral or written annotations. The ThingLink below is annotated with narrative (and a Yelp review – I wanted to demo that as well!).

Hot Tip – Try it! As long as you are on the aea365 website (but not in your email), you can add a tag to the image of the bus below. You don’t need to log in. Just click on the pencil icon in the top left of the picture and select edit. Once in edit mode, click anywhere on the picture to add an icon and rollover annotation.


I am Bob Kahle, a veteran evaluator and frequent AEA workshop presenter. As owner of Kahle Research Solutions, a research and evaluation firm with a qualitative methods focus, I am writing today to offer practical tips to leverage new tools enabled by advancing technology.

Lesson Learned: Ease into it. Most evaluators are aware of tools like Bulletin Board Focus Groups (BBFGs), web-enabled telephone focus groups or in-depth interviews, and mobile devices for collecting data in text, audio, or visual forms. Knowing about these techniques is a good start, but many of us get stuck on how to actually implement some of these new digital methods, as they can seem risky and may be outside our personal comfort zones. Consider using some of the new methods in combination with existing approaches to gain experience, confidence, and rich insights.

Hot Tip: Using new online tools is not an all-or-nothing situation. Consider hybrid designs where you couple tried-and-true techniques with new digital methods. For example, if you just completed focus groups with your target population in traditional face-to-face settings, consider inviting back the most articulate “rock star” respondents to participate in an online BBFG. In this way, you still gain the benefit of in-person discussions, but can leverage technology by bringing together especially insightful participants in a virtual, convenient, asynchronous data collection mode. Ease the burden on respondents by letting them work in their own environment and on their own schedule. Finally, if you are like me, you always have the nagging feeling after the last focus group of “I wish I had followed up on….” Instead of beating yourself up, organize and implement a BBFG to ask those follow-up items you did not have time for (or think of) in the face-to-face setting. BBFG results are often so rich and detailed that your new problem becomes synthesizing and organizing the wealth of information so clients can digest it.

Rad Resource: If you want to learn more about new digital methods and how to apply them, consider attending “Digital Qualitative: Leveraging Technology for Deeper Insights,” an AEA-hosted eStudy that I will conduct on May 20 and 22.

Rad Resource: Attend the AEA Summer Evaluation Institute June 1-4 in Atlanta, GA for an array of great sessions including one with the same title as above.


Greetings AEA365! My name is Miki Tsukamoto and I am a Senior Monitoring and Evaluation Officer at the Planning and Evaluation Department in the International Federation of Red Cross and Red Crescent Societies (IFRC).

What if you had an opportunity to add a human face to baseline surveys and reflect numbers in a more appealing way?

In a joint initiative with the Uganda Red Cross Society (URCS) and the Swedish Red Cross, I recently had such an opportunity. We piloted video as a tool to complement a baseline survey that had been carried out for URCS’s community resilience programme. The video aimed to capture stories from communities according to selected objectives and indicators of the programme, with the idea that in three years’ time this tool could be used again to measure and demonstrate change or highlight gaps in programming.

Lessons Learned: Baseline data is important for planning, monitoring, and evaluating a project’s performance. In many organizations, the end product of such a survey is a report filled with numbers, which, although useful for some purposes, is not always understood by all stakeholders, including some of the communities we aim to assist. Taking this into consideration, video seemed to be an ideal medium for what the IFRC needed since it:

  • Offers visual imagery and can transcend language barriers if needed;
  • Gives communities an opportunity to participate and directly express their views during the interviews; and
  • Provides a more appealing way to capture and report on the baseline.

Here are three lessons that I took away from this experience:

Gatekeepers: It is important to identify your gatekeeper(s), since they will be essential for meeting community members on the ground, obtaining their permission to film, and securing acceptance of the film crew’s presence in the communities and in the randomly selected households.


Independent Interpreter: If interpretation is necessary, an independent interpreter is key, since s/he serves as the voice of both the interviewee and the interviewer. S/he has an important role in reducing bias and providing a comfortable environment for an honest dialogue during the interview process.

Community buy-in: The filming process, and the community’s improved understanding of the aims of the video project, can help build stronger buy-in from the communities for your programme overall.

Rad Resources: We have two versions of the baseline video (if you are reading this via email that does not support embedded video, please click back through to the online post):

Short Version: 

Long Version: 

Hot Tip: For those interested in innovations in the field of humanitarian technology and their practical impact, the Humanitarian Technology: Science, Systems and Global Impact 2014 conference is coming up soon in Boston, MA, from 13 to 15 May 2014.



I’m David Fetterman, evaluator, author, entrepreneur, and Google Glass user. Yesterday, we talked about what Google Glass is and how it can revolutionize communications. Today, let’s turn to thinking about how Glass could be used as an evaluation tool.

[Photo: David Fetterman’s son trying out Google Glass]

Hot Tips – Glass for Empowerment Evaluation: Youth (with parental permission) can wear Glass to produce photovoice productions, sharing pictures of their neighborhoods and videos of their activities. It’s easy (and fun) – that’s my son over on the right trying out Glass. Their stories can be used as part of their self-assessment, gaining insight into their lives and potentially transforming their worlds.

Community and staff members can post their digital photographs (and videos) on a common server or blog while conducting their self-assessment with the blink of an eye. This ensures community access, a sense of immediacy, and transparency.

Community and staff members can use Google Hangout on Glass to communicate with each other about their ratings, preliminary findings, and plans for the future.

Hot Tips – Glass for Traditional Evaluation: Evaluators can use it to communicate with colleagues on the fly, share data (including pictures and video) with team members, and conduct spontaneous videoconference team meetings. Note that not everyone needs to have Glass, as Glass users can leverage its capabilities while connecting with others who are using smartphones or computers.

Glass date-stamps photos, videos, and correspondence, ensuring historical accuracy.
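If you later need to organize those files as evaluation records, the embedded date stamps can be read out programmatically. Here is a minimal sketch using the Pillow library; the folder name is hypothetical, and not every file will carry EXIF data.

# Minimal sketch: listing the embedded date stamp of each photo in a folder.
# "glass_photos" is a hypothetical folder name.
from pathlib import Path
from PIL import Image, ExifTags
# Look up the numeric EXIF tag ID for "DateTime"
DATETIME_TAG = next(k for k, v in ExifTags.TAGS.items() if v == "DateTime")
for photo in sorted(Path("glass_photos").glob("*.jpg")):
    exif = Image.open(photo).getexif()
    taken = exif.get(DATETIME_TAG, "no date stamp found")
    print(f"{photo.name}: {taken}")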

Glass can be used as an effective “ice breaker” to gain access to a new group.

Evaluators can also solicit feedback from colleagues about their performance, with brief videos of their data collection and reporting behavior. There is a precedent for this type of critique – assessments of student teaching videos.

Glass can be used to provide “on the fly” professional development with streaming video of onsite demonstrations for colleagues working remotely.

In addition, Glass can help maximize evaluators’ multitasking (when appropriate).

Lessons Learned – Caveats:

Take time to get to know people before disrupting their norm with this innovation.

Plan to use it over time to allow people to become accustomed to it and drop their company manners.

Respect people’s privacy. Ask for permission to record any behavior.

Do not use it in bathrooms, while driving, or in areas requiring additional sensitivity, e.g. bars, gang gatherings, and funerals.

In the short term, expect the shock factor, concerns about invasion of privacy, and a lot of attention. Over time, as the novelty wears off and the devices become more commonplace, Glass will be less obtrusive than a bag of digital cameras, laptops, and smartphones.



“Ok, glass.” That’s how you activate Google Glass. I’m David Fetterman and that’s me to the right wearing Google Glass. I’m an empowerment evaluation synergist and consultant, busy father and spouse, and owner of Fetterman & Associates.

Rad Resource – Google Glass: Google Glass is a voice and gesture activated pair of glasses that lets you connect with the world through the internet. You can take a picture, record a video, send a message, listen to music, or make a telephone or video call – all hands free.

Hot Tips – Redefining Communications: Google Glass is not just another expensive (currently about $1500) gadget. It can free us up to do what we do best – think, communicate, facilitate, and, in our case, assess. Here is a brief example.

I said “Ok, Glass,” then “make a call to Kimberly James.” She is a Planning and Evaluation Research Officer I am working with at the W.K. Kellogg Foundation.

Kimberly asked how the evaluation capacity building webinar was coming along. Via Glass, I took a screenshot and mailed it to her so we could discuss it. When a colleague is mentioned, a few swipes of my finger on the frame pull up a picture from the web, and I miraculously remember whom we are talking about.

Mid-conversation, Kimberly needed to step away briefly. While on hold, I sent a note to colleagues in Arkansas to ask them to check on the data collection for our tobacco prevention empowerment evaluation.

Kimberly returned to the call and we discussed a recent survey. With a simple request, the display of our results appeared, reminding me what the patterns look like.

Did I mention that I did all of these things while making lunch, picking up my son’s clothes off the floor, letting the dogs out, and emptying the dishwasher?

Later in the day, with a tap on the frame, I confirmed our scope of work with Linh Nguyen, the Vice President of Learning and Impact at the Foundation, while dropping my son off for piano lessons.

Later in the week I plan to use Google Hangout to videoconference with another colleague using Glass. When she connects during a project site visit, she will be able to take pictures and stream video of her walk around the facilities, bringing me closer to the “hum and buzz” of site activities.

Lessons Learned:

Respect people’s privacy – do not wear Google Glass where it is not wanted, will put people off, or will disrupt activities. Do not take pictures without permission. Remove it when you enter a bathroom.


Hot Tip: Stay tuned for Part II tomorrow when I will cover using Google Glass as an evaluation tool.


My name is Susan Kistler and I am on a crusade to expand our reporting horizons. Earlier this month, we looked at little chocolate reports. Today, let’s consider adding videos to your evaluation reporting toolbox.

[Image: cover of “How to Shoot Video That Doesn’t Suck”]

Get Involved: But first, a little incentive for you to share your best alternative reporting ideas – and possibly get a reward for doing it. In the comments on this post, or via Twitter using the hashtag #altreporting, share either (a) your best unique evaluation reporting idea, or (b) a link to a great alternative evaluation report, and in either case note why you love it. I’ll randomly draw one winner from among the commenters/tweeters and send you a copy of “How to Shoot Video That Doesn’t Suck,” a book that can help anyone create video that isn’t embarrassing. Contribute as often as you like, but you will be entered only once in the random drawing on May 1.

Back to our programming. If you are reading this via a medium that does not allow you to view the embedded videos, such as most email, please click back through to the blog now by clicking on the title of the post.

Rad Resource – Unique Reporting Videos: Kate Tinworth, via a post on her always thought-provoking ExposeYourMuseum blog, recently shared three wonderful short video reports made by her audience insights team when she was working at the Denver Museum of Nature and Science. Each uses everyday objects to help visualize evaluation findings in an engaging way.

This video is my favorite of the three. It introduces the evaluators, reports demographics via a stacked bar chart built from jellybeans, and is at once professional and accessible.

Cool Trick: Kate’s team met museum volunteers and staff at the door with small bags of jellybeans that included a cryptic link to the report in order to get people to view the video.

Rad Resource – Unique Reporting Videos: This video from a team in Melbourne, Australia, shares findings from an evaluation of a primary school kitchen gardening program. It introduces the key stakeholders and deepens our understanding of the program without listing its components.

Rad Resource – Unique Reporting Videos: I wrote before on aea365 about getting this mock reporting video made for $5. I can still envision it embedded on an animal shelter’s website, noting how the shelter is using its evaluation findings. My favorite part is that it talks about evaluation use – how things are changing at a small business because of the evaluation.

Rad Resource: Visit the Alternative Reporting – Videos Pinterest Page I’m curating for TheSmarterOne.com for more reporting video examples and commentary.


Hi, I’m Audrey Rorrer and I’m an evaluator for the Center for Education Innovation in the College of Computing and Informatics at the University of North Carolina at Charlotte, where several projects I evaluate operate at multiple locations across the country.  Multisite evaluations are loaded with challenges, such as data collection integrity, evaluation training for local project leaders, and the cost of resources. My go-to resource has become Google because it’s cost-effective both in terms of efficiency and budget (it’s free). I’ve used it as a data collection tool and resource dissemination tool.

Lessons Learned:

Data Collection and Storage:

  • Google Forms works like a survey reporting tool with a spreadsheet of data behind it, for ease in collecting and analyzing project information (a minimal sketch of pulling those responses into Python for analysis follows this list).
  • Google Forms can be sent as an email so that recipients can respond to items directly within the email.
  • Google documents, spreadsheets, and forms can be shared with any collaborators, whether or not they have a Gmail account.
  • Google Drive is a convenient storage source in ‘the cloud.’
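Because a Google Form writes its responses to a linked spreadsheet, those responses can also be pulled into an analysis script. Here is a minimal sketch using the third-party gspread library with a service-account credential; the credential path, sheet title, and “Site” column are assumptions that would need to match your own form.

# Minimal sketch: pulling Google Form responses (via the linked Google Sheet)
# into pandas for analysis. Credential path, sheet title, and column names are
# hypothetical -- adjust them to your own setup.
import gspread
import pandas as pd
gc = gspread.service_account(filename="service_account.json")
sheet = gc.open("Multisite Evaluation Intake (Responses)")   # assumed sheet title
records = sheet.sheet1.get_all_records()                     # one dict per response row
df = pd.DataFrame(records)
print(df.head())
print(df.groupby("Site").size())   # "Site" is an assumed form question/column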

Resource Dissemination:

  • Google Sites provides easy-to-use website templates that enable fast website building for people without web development skills.
  • Google Groups is a way to create membership wikis for project management and online collaboration.

Rad Resource: Go to www.google.com and search for products. Then scroll down the page to check out the business & office options, and the social options.

For a demonstration of how I’ve used Google tools in multisite evaluation, join the AEA Coffee Break Webinar on February 27, “Doing it virtually: Online tools for managing multisite evaluation.” You can register here.


 


My name is Caren Oberg and I am the Principal and Owner of Oberg Research. I am a proud late adopter. Proof? I still use a paper calendar and have Moleskine notebooks dating back years. But I have joyfully embraced tablet applications for data collection. The applications below, not to mention many others, have made the process cheaper, greener, less prone to human error, and more innovative.

Rad Resources: All resources below work on iPads and Android tablets, except StoryKit, which is iPad only.

TrackNTime is designed for tracking participant interactions or behaviors in a learning environment.

QuickTapSurvey is a survey platform designed specifically for tablets. It is easy to read, pretty to look at, and lets you collect data without an internet connection.

Sticky Notes come pre-installed on most tablets. Participants can move sticky notes around the screen, grouping and regrouping, based on questions you ask.

StoryKit allows your participants to recreate their experiences through images and text using an electronic storybook.

Hot Tips: Consider the type of data you are trying to collect. The majority of tablet apps I have come across can do one type of data collection extremely well, but are not yet built for multi-method data collection. That said, you can easily switch back and forth between two applications and link the data manually by assigning a single ID number to both (see the sketch below).
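To illustrate that manual linking step, the sketch below merges two hypothetical exports on a shared participant ID using pandas; the file and column names are assumptions, not what any particular app actually produces.

# Minimal sketch: linking exports from two single-purpose apps on a shared ID.
# File and column names are assumptions -- adjust to match your real exports.
import pandas as pd
surveys = pd.read_csv("survey_export.csv")       # hypothetical survey export
stories = pd.read_csv("storybook_export.csv")    # hypothetical storybook export
# Both files are assumed to carry the "participant_id" you assigned at collection.
linked = surveys.merge(stories, on="participant_id", how="outer",
                       suffixes=("_survey", "_story"))
linked.to_csv("linked_dataset.csv", index=False)
print(f"Linked dataset has {len(linked)} participant records")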

Apps eliminate data entry. They do not eliminate data cleaning, nor do they do advanced analyses. Yet.

Lessons Learned: The number of applications developed specifically for evaluators is small. Learning to manipulate applications to fit my needs has been very important, as has letting go of an app when it is just not going to work for me. Knowledge sharing is also important: I was made aware of QuickTapSurvey and StoryKit by my colleague Jennifer Borland of Rockman et al, who in turn learned about StoryKit at Evaluation 2013.

In that vein, I will be talking about all four resources in an AEA Coffee Break webinar on February 20, 2014. I hope you can join.



 


Greetings from the University of Chicago! We are Sarah Rand, Amy Cassata, Maurice Samuels, and Sandra Holt from Outlier Research and Evaluation at the Center for Elementary Mathematics and Science Education. Our group recently evaluated two Chicago-based elementary education programs: Purple Asparagus, a nutrition education program, and Science, Engineering, and Technology for Students, Educators, and Parents (SETSEP), a science and engineering program for students and their parents.

[Screenshots: Purple Asparagus App]

Purple Asparagus was interested in the program’s impact on children’s willingness to try new foods, attitudes about eating fruits and vegetables, and actual eating habits.  A major challenge we faced was how to measure changes in the eating habits and attitudes of 6- and 7-year-olds, an age that is considered too young to complete a traditional survey. In the SETSEP evaluation, our challenge was finding an efficient way to collect student outcome data from 1st-3rd grade students that minimized the amount of time taken away from program activities. We also needed a method of data collection that would capture students’ attention and accurately represent their perception of an engineer.

Lesson Learned: It was because of these dilemmas that we explored the potential to create iPad apps to measure student outcomes for each program evaluation. We worked iteratively with an app developer to create engaging, child-friendly platforms for students to share their experiences and feelings.

 

Click here to see a video of the Purple Asparagus App in action.

Click here to see a video of the SETSEP app in action.

[Screenshots: SETSEP App]

The administration of the iPad app was a great success. We collected data from almost 200 participating students. Students remained focused on the survey for the five to ten minutes it took to complete. Most students had previous experience using an iPad and were familiar with the touch screen. Student response to the app was very positive and many students commented that taking the survey was a fun experience!

Rad Resource: We worked with Matt Jankowiak from Region Apps to program these apps.

Cool Tricks: We bought these iPad cases and these headphones, which were great for students.

Lessons Learned: App development takes time. We were new to app development, so the learning curve was steep at first. There are many pieces to consider including ease of use for young children, audio recordings, and collecting images for the app. It’s also a good idea to test the app with students before administration.

Note: Survey questions for the engineering app were developed by Engineering is Elementary at the Museum of Science, Boston and were used with permission.

The American Evaluation Association is celebrating Chicagoland Evaluation Association (CEA) Affiliate Week. The contributions all this week to aea365 come from CEA members.

 
