AEA365 | A Tip-a-Day by and for Evaluators

I’m David Fetterman, evaluator, author, entrepreneur, and Google Glass user. Yesterday, we talked about what Google Glass is and how it can revolutionize communications. Today, let’s turn to how Glass could be used as an evaluation tool.

David Fetterman's Son

Hot Tips – Glass for Empowerment Evaluation: Youth (with parental permission) can wear Glass to produce photovoice productions, sharing pictures of their neighborhoods and videos of their activities. It’s easy (and fun) – that’s my son over on the right trying out Glass. Their stories can be used as part of their self-assessment, gaining insight into their lives and potentially transforming their worlds.

Community and staff members can post their digital photographs (and videos) on a common server or blog while conducting their self-assessment with the blink of an eye. This ensures community access, a sense of immediacy, and transparency.

Community and staff members can use Google Hangout on Glass to communicate with each other about their ratings, preliminary findings, and plans for the future.

Hot Tips – Glass for Traditional Evaluation: Evaluators can use it to communicate with colleagues on the fly, share data (including pictures and video) with team members, and conduct spontaneous videoconference team meetings. Note that not everyone needs to have Glass; Glass users can leverage its capabilities while connecting with others who are using smartphones or computers.

Glass date-stamps photos, videos, and correspondence, ensuring historical accuracy.

Glass can be used as an effective “ice breaker” to gain access to a new group.

Evaluators can also solicit feedback from colleagues about their performance, with brief videos of their data collection and reporting behavior. There is a precedent for this type of critique – assessments of student teaching videos.

Glass can be used to provide “on the fly” professional development with streaming video of onsite demonstrations for colleagues working remotely.

In addition, Glass can help maximize an evaluator’s multitasking (when appropriate).

Lessons Learned – Caveats:

Take time to get to know people before disrupting their norm with this innovation.

Plan to use it over time to allow people to become accustomed to it and drop their company manners.

Respect people’s privacy. Ask for permission to record any behavior.

Do not use it in bathrooms, while driving, or in areas requiring additional sensitivity, e.g., bars, gang gatherings, and funerals.

In the short term, expect the shock factor, concerns about invasion of privacy, and a lot of attention. Over time, as the novelty wears off and they become more commonplace, Glass will be less obtrusive than a bag of digital cameras, laptops, and smartphones.

Rad Resources:

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


“Ok, glass.” That’s how you activate Google Glass. I’m David Fetterman, and that’s me to the right wearing Google Glass. I’m an empowerment evaluation synergist and consultant, busy father and spouse, and owner of Fetterman & Associates.

Rad Resource – Google Glass: Google Glass is a voice- and gesture-activated pair of glasses that lets you connect with the world through the internet. You can take a picture, record a video, send a message, listen to music, or make a telephone or video call – all hands free.

Hot Tips – Redefining Communications: Google Glass is not just another expensive (currently about $1500) gadget. It can free us up to do what we do best – think, communicate, facilitate, and, in our case, assess. Here is a brief example.

I said “Ok, Glass,” then “make a call to Kimberly James.” She is a Planning and Evaluation Research Officer I am working with at the W.K. Kellogg Foundation.

Kimberly asked how the evaluation capacity building webinar was coming along. Via Glass, I took a screenshot and emailed it to her so we could discuss it. When a colleague is mentioned, with a few swipes of my finger on the frame, I find a picture on the web, and miraculously remember who we are talking about.

Mid-conversation, Kimberly needed to step away briefly. While on hold, I sent a note to colleagues in Arkansas to ask them to check on the data collection for our tobacco prevention empowerment evaluation.

Kimberly returned to the call and we discussed a recent survey. With a simple request, the display of our results appeared, reminding me what the patterns look like.

Did I mention that I did all of these things while making lunch, picking up my son’s clothes off the floor, letting the dogs out, and emptying the dishwasher?

Later in the day, with a tap on the frame, I confirmed our scope of work with Linh Nguyen, the Vice President of Learning and Impact at the Foundation, while dropping my son off for piano lessons.

Later in the week I plan to use Google Hangout to videoconference with another colleague using Glass. When she connects during a project site visit, she will be able to take pictures and stream video of her walk around the facilities, bringing me closer to the “hum and buzz” of site activities.

Lessons Learned:

Respect people’s privacy – do not wear Google Glass where it is not wanted, will put people off, or will disrupt activities. Do not take pictures without permission. Remove it when you enter a bathroom.

Rad Resources

Hot Tip: Stay tuned for Part II tomorrow when I will cover using Google Glass as an evaluation tool.




My name is Susan Kistler and I am on a crusade to expand our reporting horizons. Earlier this month, we looked at little chocolate reports. Today, let’s consider adding videos to your evaluation reporting toolbox.


Get Involved: But first, a little incentive for you to share your best alternative reporting ideas. And possibly get a reward for doing it. In the notes to this blog, or via twitter using the hashtag #altreporting, share either (a) your best unique evaluation reporting idea, or (b) a link to a great alternative evaluation report, and in either case note why you love it. I’ll randomly draw one winner from among the commenters/tweeters and send you a copy of “How to Shoot Video That Doesn’t Suck,” a book that can help anyone create video that isn’t embarrassing. Contribute as often as you like, but you will be entered only once in the random drawing on May 1.

Back to our programming. If you are reading this via a medium that does not allow you to view the embedded videos, such as most email, please click back through to the blog now by clicking on the title to the post.

Rad Resource – Unique Reporting Videos: Kate Tinworth, via a post on her always thought-provoking ExposeYourMuseum blog, recently shared three wonderful short video reports made by her audience insights team when she was working at the Denver Museum of Nature and Science. Each uses everyday objects to help visualize evaluation findings in an engaging way.

This video is my favorite of the three. It introduces the evaluators, reports demographics via a stacked bar chart built from jellybeans, and is at once professional and accessible.

Cool Trick: Kate’s team met museum volunteers and staff at the door with small bags of jellybeans that included a cryptic link to the report in order to get people to view the video.

Rad Resource – Unique Reporting Videos: This video from a team in Melbourne, Australia, shares findings from an evaluation of a primary school kitchen gardening program. It introduces the key stakeholders and deepens our understanding of the program without listing its components.

Rad Resource – Unique Reporting Videos: I wrote before on aea365 about getting this mock reporting video made for $5. I can still envision it embedded on an animal shelter’s website, noting how the shelter is using its evaluation findings. My favorite part is that it talks about evaluation use – how things are changing because of the evaluation at a small business.

Rad Resource: Visit the Alternative Reporting – Videos Pinterest Page I’m curating for TheSmarterOne.com for more reporting video examples and commentary.



Hello fellow evaluators! My name is Ann Price and I am President of Community Evaluation Solutions, near Atlanta, Georgia. A few weeks ago a friend and I spent the weekend in the Georgia Mountains at the Hike Inn, an inn in a state park accessible only via a 5-mile “moderate” hike. There is no cell phone, no TV, no internet. It was nice to disconnect and reflect on life and work. This blog is about my reflections over the weekend as an external evaluation consultant.

My friend and I have found over the years that even though we work in different areas, our processes and our relationships with clients are quite similar. We both have a penchant for metaphor so we had fun over the weekend applying metaphors to our clients and our work.

The first thing we did was spend ½ hour just trying to find the trailhead. I told my friend this was similar to programs not doing the groundwork for an evaluation (i.e., failing to design a program logic model or a strategic plan, or in our case, having the map but not following it). When all else fails, read the directions….

The hike was a lovely, albeit up-and-down, trek. So the second thing we learned was something my son’s scout leader once said: “Everyone is on their own hike.” We reminded ourselves of that as folks of all ages passed us by (that was a bit discouraging). But the main point is to start on the path. Similarly, you may not have the biggest, most well-funded program. But it is important to start the evaluation journey or you will never “get there.” You do this by building your program’s organizational and evaluation capacity.

Tips and Tricks:
The hike was pretty steep at times, so we had to stop every once in a while and catch our breath. We kept ourselves motivated by setting goals (Let’s just make it to the next tree! Think benchmarks and indicators). Evaluation work is the same way. It’s important to take a break and look at your data. If you don’t, you might miss some pretty awesome sights (or findings). So stop every once in a while and see where you are. Is your program where it needs to be? If it is not, make an adjustment. And if you need help, here are a few great resources to guide you on your way.

Rad Resources:
Start with baby steps if you must. There are plenty of free resources out there to help you on your journey:



Greetings to my fellow #DataNerds! My name is Jordan Slice. I am a Research Specialist at Richland One, an urban school district in Columbia, South Carolina. In addition to being a full-time evaluator, I create handmade pieces for my Etsy shop, resliced.

As a handmade business owner, many of the sales I make are custom orders. People really appreciate when something is tailored to meet their needs. The same is true for evaluation stakeholders: your results are much more likely to be appreciated (and used!) if they answer the questions your stakeholders need answered.

Lesson Learned: Whether I’m making a custom purse (that’s one of my bags to the right) or designing a program evaluation, clear communication is key. For example, if a customer sends me her grandfather’s favorite shirt and requests that I make her a purse using the fabric, it is imperative that we come to a clear agreement about the design of the purse before I start constructing. Similarly, when evaluating a program, it is imperative that you consult with the stakeholders before developing your evaluation if you expect the results to be utilized.

Hot Tip: Keep it simple. While you and I may love geek speak, flooding your stakeholders with evaluation jargon may impair their ability to understand your results. Whether you are talking with stakeholders, constructing a presentation, or writing a report, commit to the mantra that less is more. Once I have my summary in writing, I use a two-step revision process. First, I focus on organizing the content for better flow. Second, I put on my minimalist cap and cut out all the excess fluff (usually repetitive statements or unnecessary detail). Before finalizing any report, always ask a colleague (or stakeholder when appropriate) to proof and provide feedback. I employ the same technique when I am building newsletters (Rad Resource: Mail Chimp – free & user-friendly!) or item listings on Etsy.

Rad Resource: Stephanie Evergreen has some really great posts (like this one!) on her blog with tips for creating better visualizations with your data.

Another Hot Tip: Allow yourself time to focus on something creative (even just a daydream) several times a week. This can give your mind the break it needs to process information and improve your focus. Pursue a new hobby or build on an existing interest. You may be surprised at how this new skill can help you grow as an evaluator.



My name is Kylie Hutchinson.  I am an independent evaluation consultant with Community Solutions Planning & Evaluation.  In addition to evaluation consulting and capacity building, I tweet at @EvaluationMaven and co-host the monthly evaluation podcast, Adventures in Evaluation along with my colleague @JamesWCoyle.

When I started out in evaluation 26 years ago, I was focused on being a good methodologist and statistician.  After deciding to work primarily with NGOs I learned the importance of being a good program planner.  Employing a participatory approach required me to become a competent facilitator and consensus-builder.  These days, the increased emphasis on utilization and data visualization is forcing me to upgrade my skills in communications and graphic design.  New developments in mobile data collection are making me improve my technical skills.  A recent foray into development evaluation has taught me the important role that a knowledge manager plays in evaluation. Finally, we are starting to understand evaluation capacity development as a process rather than a product, so now I need expertise in organizational development, change management, and the behavioral sciences.  Whoa.

Don’t get me wrong, I’m not complaining. Every day I wake up and think how lucky I am to have picked such a diverse career as evaluation. But with all these responsibilities on my plate, my toolbox is starting to get full and sometimes keeps me awake at night. How can I manage to be effective at all of these things? Should I worry about being a Jack of all trades, master of none?

Hot Tip:  You don’t have to do it all.  Determine your strengths and outsource your weaknesses. Pick several areas of specialization and ask for assistance with the others.  This help may come in the form of other colleagues or departments.  For example, if you think you need help with change management, sub-contract an organizational development consultant to your team.  If you work in an organization with a communications or graphic design department, don’t forget to call on their expertise when you need it.

Hot Tip:  Take baby steps.  If you want to practice more innovative reporting, don’t assume you have to become an expert in communication strategies overnight. Select one or two new skills you want to develop annually and pick away at those.

Hot Tip: If you can, strategically select those evaluations that will expose you to a new desired area, e.g., mobile data collection or the use of new software.

Rad Resource:  Even if you’re not Canadian, the Canadian Evaluation Society’s Competencies for Canadian Evaluation Practice provide a great basis from which to reflect on your skills.



Hello! I’m Sheila B. Robinson, aea365’s Lead Curator and sometimes Saturday contributor, with a new cool tool to spice up your evaluation presentations and reports!

Do you know the feeling you get when you stumble upon something so good you want to share it, but then again, part of you wants to keep it all to yourself? It will be apparent from this post which side won out for me.

Lesson Learned: Based on advice from respected presentation and information designers, I now shy away from canned, cliche, or clip art images, including charts and diagrams. I’m no designer though, and often find it challenging to start with a blank page when I have something to share that calls for a good visual representation of a relationship.

I’ve enjoyed Microsoft’s SmartArt graphics that come with Office, and they are quite customizable. But with only about 185 choices, I find I start recognizing them in other people’s presentations, especially when they are not customized by the user, and they begin to remind me of the overused, 20th-century clip art we’ve all come to loathe.

Rad Resource: Turns out, one of my favorite presentation designers, Nancy Duarte, has offered her expertise in a fabulous resource she has made available to all of us, and it’s FREE! Diagrammer™ is “a visualization system” featuring over 4,000 downloadable, customizable diagrams. Duarte, Inc. makes it easy to search for exactly what you need by allowing you to search all diagrams, or filter by relationship (flow, join, network, segment, or stack), style (2D or 3D), or number of nodes (1-8) needed.

Once you choose a diagram (and “shopping” for one is half the fun!), you simply download it as a PowerPoint slide, and fill in your text, or customize the various components. You can change shapes, colors, sizes and more. Diagrams range from the very simplest to somewhat complex. Here are just a few examples:

Diagram 1 | Diagram 2 | Diagram 3 | Diagram 4

Most diagrams you see come in a variety of configurations. Each of the above examples is also available with different numbers of nodes.

Hot Tip: Duarte’s diagrams come in a gorgeous color palette if you ask me, but often it’s the colors you want to customize to match your report style or your organization’s colors. Here’s a before and after with the original diagram and my redesign.

Network hub original

Custom diagram 2

Cool Trick: Take some time searching diagrams as you’re thinking about the relationship you want to communicate. This added reflection time will give you the opportunity to dig a little deeper into your data and you may be rewarded with new insights.




My name is Cheryl Peters and I am the Evaluation Specialist for Michigan State University Extension, working across all program areas.

Measuring collective impact of agricultural programs in a state with diverse commodities is challenging. Many states have an abundance of natural resources like fresh water sources, minerals, and woodlands. Air, water and soil quality must be sustained while fruit, vegetable, crop, livestock and ornamental industries remain efficient in yields, quality and input costs.

Extension’s outreach and educational programs operate on different scales in each state of the nation: individual efforts, issue-focused work teams, and work groups based on commodity types. Program evaluation efforts contribute to statewide assessment reports demonstrating the value of Extension Agricultural programs, including public value. Having different program scales allows applied researchers to align to the same outcome indicators as program staff.

Hot Tip: Just as Extension education has multiple pieces (e.g., visits, meetings, factsheets, articles, demonstrations), program evaluation has multiple pieces (e.g., individual program evaluation about participant adoption practices, changes in a benchmark documented from a secondary source, and impact assessment from modeling or extrapolating estimates based on data collected from clientele).

Hot Tip:  All programs should generate evaluation data related to identified, standardized outcomes. What differs in the evaluation of agriculture programs is the evaluation design, including sample and calculation of values. Impact reports may be directed at commodity groups, legislature, farming groups, and constituents. State Extension agriculture outcomes can use the USDA impact metrics. Additionally, 2014 federal requirements for competitive funds now state that projects must demonstrate impact within a project period. Writing meaningful outcomes and impact statements continues to be a focus of USDA National Institute of Food and Agriculture (NIFA).

Hot Tip: Standardizing indicators into measurable units has made aggregation of statewide outcomes possible. Examples include pounds or tons of an agricultural commodity, dollars, acres, number of farms, and number of animal units. Units are then reported by the practice adopted. Dollars estimated by growers/farmers are extrapolated from research values or secondary data sources.
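The aggregation this tip describes can be sketched in a few lines of code. This is only an illustration of the idea, not MSU Extension’s actual system; the practices, units, and dollar-per-unit extrapolation values below are invented for the example.

```python
from collections import defaultdict

# Hypothetical program reports: (practice adopted, standardized unit, quantity).
reports = [
    ("cover crops", "acres", 120),
    ("cover crops", "acres", 80),
    ("rotational grazing", "animal units", 45),
]

# Illustrative research-based dollar value per unit, used to extrapolate impact.
dollars_per_unit = {"acres": 35.0, "animal units": 12.0}

# Aggregate quantities by practice and unit so outcomes roll up statewide.
totals = defaultdict(float)
for practice, unit, quantity in reports:
    totals[(practice, unit)] += quantity

# Report units by practice adopted, with an extrapolated dollar estimate.
for (practice, unit), quantity in sorted(totals.items()):
    impact = quantity * dollars_per_unit[unit]
    print(f"{practice}: {quantity:g} {unit} adopted, est. impact ${impact:,.2f}")
```

Because every program reports in the same standardized units, new programs only add rows of data; the aggregation and extrapolation formulas stay the same from year to year.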

Hot Tip: Peer-learning with panels to demonstrate scales and types of evaluation with examples has been very successful. There are common issues and evaluation decisions across programming areas. Setting up formulas and spreadsheets for future data collection and sharing extrapolation values has been helpful to keep program evaluation efforts going. Surveying similar audiences with both outcomes and program needs assessment has also been valuable.

Rad Resource: NIFA provides answers to frequently asked questions such as when to use program logic models, how to report outcomes, and how logic models are part of evaluability assessments.

The American Evaluation Association is celebrating Extension Education Evaluation (EEE) TIG Week with our colleagues in the EEE Topical Interest Group. The contributions all this week to aea365 come from our EEE TIG members.


Hi! This is Laura Downey with Mississippi State University Extension Service. In my job as an evaluation specialist, I commonly receive requests to help colleagues develop a program logic model. I am always thankful when I receive such a request early in the program development process. So, I was delighted a few weeks ago when academic and community colleagues asked me to facilitate the development of a logic model for a grant proposing to use a community-based participatory research (CBPR) approach to evaluate a statewide health policy. For those of you who are not familiar with CBPR, it is a collaborative research approach designed to ensure participation by communities throughout the research process.

As I began to assemble resources to inform this group’s CBPR logic model, I discovered a Conceptual Logic Model for CBPR available on the University of New Mexico’s School of Medicine, Center for Participatory Research, website.


Clipped from http://fcm.unm.edu/cpr/cbpr_model.html

Rad Resource:

What looked like a simple conceptual logic model at first glance was actually a web-based tool complete with metrics and measures (instruments) to assess CBPR processes and outcomes. Over 50 instruments related to the most common concepts in CBPR, such as organizational capacity, group relational dynamics, empowerment, and community capacity, are profiled and available through this tool. Each profile includes the instrument name, a link to the original source, the number of items in the instrument, the concept(s) originally assessed, reliability, validity, and the population the instrument was developed with.

With great ease, I was able to download surveys to measure those CBPR concepts in the logic model that were relevant to the group I was assisting. Given the policy focus of that specific project, I explored the measures related to policy impact.

Hot Tip:

Even if you do not typically take a CBPR approach to program development, implementation, and/or evaluation, the CBPR Conceptual Logic Model website might have a resource relevant to your current or future evaluation work.



Hi, I’m Siri Scott, and I work as a Graduate Assistant with the University of Minnesota Extension.

Conducting interviews with youth is one way to gather information for program improvement, especially when you want to bring youth voice into your improvement process. While more time-intensive than a survey, interviews will provide you with rich contextual information. Below is a brief overview of the planning process as well as tips for conducting interviews.

Hot Tips: Planning Phase

One main decision is whether or not you will need IRB approval for conducting interviews. Even when done for program improvement purposes, it is a good idea to comply with IRB regulations for data practices and protection for youth. Other major decision points in the planning process include how many individuals you will interview, how you will choose your participants, and how you will collect and analyze your data. In addition, you must decide what type of interview you want to conduct, the purpose of the interview, and then create an interview protocol (if appropriate).

Hot Tips: Conducting interviews

Here are some tips for conducting interviews:

  • Practice: Test the use of the protocol with a colleague (or a young person who you know well) and ask for feedback about the questions themselves and how they fit together.
  • Space: Find a quiet, secluded space for the interview in a public setting (or in a private home if the young person’s guardian can be present). You don’t want other people overhearing your conversation or your participant being distracted by anything.
  • Warm up: Start the interview with some informal chit chat. This will build your rapport with the participant and ease the participant’s (and your) nerves.
  • Probe: If you are not doing a structured interview, make sure to ask participants to clarify or elaborate on their responses. This will provide you with much better data.
  • Notes: If you are using an audio recorder, don’t trust it (fully). Jot down some quick notes during the interview. You can elaborate on these later if the audio recorder malfunctions.
  • Relax! If you mess up, that’s okay. Also, if you’re nervous and tense, the participant will sense that. Do whatever you can to put the participant (and yourself) at ease.
  • Learn More: A good resource for learning how to conduct interviews is this newly released, comprehensive overview of the interviewing process: InterViews: Learning the Craft of Qualitative Research Interviewing (3rd Edition) by Brinkmann and Kvale (2014).




