AEA365 | A Tip-a-Day by and for Evaluators

Category: Collaborative, Participatory and Empowerment Evaluation

Hello aea365ers! I’m Susan Kistler, Executive Director Emeritus of the American Evaluation Association, professional trainer and editor, and all-around gregarious gal. Email me at susan@thesmarterone.com if you wish to get in touch.

Rad Resource – Padlet: The last time I wrote about Padlet for aea365, exactly two years ago on September 12, 2012, it was still called Wallwisher. One name change, two years, and a number of upgrades later, this web-based virtual bulletin board application is worth a fresh look.

Padlet is extremely easy to set up – it takes under 10 seconds and can be done with or without an account; however, I highly recommend that you sign up for a free account to manage multiple bulletin boards and manipulate contributions.

Padlet is even easier to use: just click on a bulletin board and add a note. You can add to your own boards, or to other boards for which you have a link. I’ve set up two boards for you to try.

Hot Tip – Brainstorming: Use Padlet to brainstorm ideas and get input from multiple sources, all anonymously. Anonymity is the keyword here – the extreme ease of use (no sign-in!) is balanced by the fact that contributions have names attached only if the contributors wish to add them.

Hot Tip – Backchannel: Increasingly, facilitators are leveraging backchannels during courses and workshops as avenues for attendees to discuss and raise questions. Because Padlet is a platform- and device-independent application accessed through the browser, and does not require a login to contribute, it can make an excellent backchannel tool.

The uses are almost endless – any time you might try sticky notes, Padlet may be a virtual alternative.

***IF YOU ARE READING THIS POST IN EMAIL, PLEASE CLICK BACK TO THE AEA365 WEBSITE TO TRY IT OUT!***

This board illustrates the linen background (there are 15+ backgrounds from which to choose) with contributions added wherever the contributor placed them (the owner may then move them). Just click to give it a try. Please.

[Embedded Padlet board, created with Padlet]

This board illustrates the wood background with contributions organized as tiles (a new option).

[Embedded Padlet board, created with Padlet]

The board appears small when embedded on aea365; go here to see the same board in full page view.

Hot Tip – Multimedia: Padlet can accommodate pictures, links, text, files, and video (when hosted elsewhere).

Hot Tip – Export: A major improvement to Padlet’s functionality is the ability to export contributions to Excel for analysis, sharing, and more.
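If you want a sense of what that analysis might look like, here is a minimal Python sketch using pandas. The file name padlet_export.xlsx and the Body column are assumptions for illustration; check your own export for the actual layout.

```python
import pandas as pd

# Hypothetical file and column names; inspect your own Padlet export
# for the actual layout before running this.
notes = pd.read_excel("padlet_export.xlsx")

print(f"Total contributions: {len(notes)}")

# How long is a typical note?
notes["word_count"] = notes["Body"].fillna("").str.split().str.len()
print(notes["word_count"].describe())

# A crude term-frequency list as a first pass at themes.
words = notes["Body"].fillna("").str.lower().str.findall(r"[a-z']+").explode()
print(words.value_counts().head(20))
```

(Reading .xlsx files with pandas also requires the openpyxl package.)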

Rad Resource – Training: I’ll be offering an eStudy online workshop in October on collaborative and participatory instrument development. We’ll leverage Padlet as an avenue for stakeholder input, so you can see it in action. Learn more here.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

I’m Jeff Sheldon from the School of Social Science, Policy, and Evaluation at Claremont Graduate University and today I’m introducing the Survey of Empowerment Evaluation Practice, Principles, and Outcomes (SEEPPO). I developed SEEPPO for my dissertation but, more importantly, as a tool that researchers on evaluation and evaluation practitioners can modify for their own use.

For practitioners, SEEPPO is an 82-item self-report survey (92 items for researchers) across seven sections (nine for researchers).

  • Section one items (“Your evaluation activities”) ask for a report on behaviors in terms of the specific empowerment evaluation steps implemented.
  • Section two (“Evaluation Participant Activities”) asks for observations on the evaluation-specific behaviors of those engaged in the evaluation as they relate to the empowerment evaluation steps implemented.
  • Section three (“Changes you observed in individual’s values”) asks for a report on changes in evaluation-related values by comparing the values observed at the beginning of the evaluation to those observed at the end.
  • Section four items (“Changes you observed in individual’s behaviors”) ask for a report on changes observed in evaluation-related behavior and whether the sub-constructs characterizing the psychological well-being outcomes of empowerment (i.e., knowledge, skills/capacities, self-efficacy) and self-determination (competence, autonomy, and relatedness) were present, by comparing observed behaviors at the beginning of the evaluation to those at evaluation’s end.
  • Section five (“Changes you observed within the organization”) items ask for a report on the changes observed within the organization as a result of the evaluation by comparing various organizational capacities at the beginning of the evaluation to those observed at evaluation’s end.
  • Section six (“Inclusiveness”) asks about the extent to which everyone who wanted to fully engage in the evaluation was included.
  • Section seven (“Accountability”) items ask about who the evaluator was accountable to during the evaluation.
  • Lastly, the items in sections eight and nine, for researchers, ask about the evaluation model used and demographics.

This is a brief “snapshot” of SEEPPO. Item development was based on: 1) constructs found in the literature on the three known empowerment evaluation models and their respective implementation steps; 2) the ten principles (i.e., six process and four outcome) of empowerment evaluation; 3) the purported empowerment and self-determination outcomes for individuals and organizations engaged in the process of an empowerment evaluation; and 4) constructs found in the humanistic psychology literature on empowerment theory and self-determination theory.


Hot Tip: The results of SEEPPO can be used to determine whether you or your subjects are adhering with fidelity to the empowerment evaluation model being implemented, which principles of empowerment evaluation are in evidence, and the likelihood of empowerment and self-determination outcomes.
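To make the first of those uses concrete, here is a minimal Python sketch of a fidelity check: the share of a model’s implementation steps a respondent reports carrying out. The step names and responses are invented for illustration; they are not SEEPPO’s actual items or scoring rules.

```python
# Illustrative only: invented step names and responses; SEEPPO's actual
# items and scoring rules may differ.
model_steps = ["mission", "taking_stock", "planning_for_the_future"]

# One practitioner's section-one answers: was each step implemented?
responses = {
    "mission": True,
    "taking_stock": True,
    "planning_for_the_future": False,
}

implemented = sum(responses[step] for step in model_steps)
fidelity = implemented / len(model_steps)
print(f"Fidelity to the model: {fidelity:.0%}")  # Fidelity to the model: 67%
```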

Rad Resource: Coming soon! SEEPPO will soon be widely available.

The American Evaluation Association is celebrating CP TIG Week with our colleagues in the Community Psychology Topical Interest Group. The contributions all week come from CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Rachel Becker-Klein and I am an evaluator and a Community Psychologist with almost a decade of experience evaluating programs. Since 2005, I have worked with PEER Associates, an evaluation firm that provides customized, utilization-focused program evaluation and educational research services for organizations nationwide.

Recently I have been using an interview and analysis methodology called Most Significant Change (MSC). MSC is a strategy that involves collecting and systematically analyzing significant changes that occur in programs and in the lives of program participants. The methodology has been found to be useful for monitoring programmatic changes, as well as for evaluating the impact of programs.


Lessons Learned: Many clients are interested in taking an active role in their evaluations, but may not be sure how to do so. MSC is a fairly intuitive approach to collecting and analyzing data that clients and participants can be trained to use. Having project staff interview their own constituents can help create a high level of comfort for interviewees, allowing them to share more openly. Staff-conducted interviews also give staff a sense of empowerment in collecting data. The MSC approach also includes a participatory approach to analyzing the data. In this way, the methodology can be a capacity-building process in and of itself, supporting project staff in learning new and innovative monitoring and evaluation techniques that they can integrate into their own work once the external evaluators leave.

Cool Trick: In 2012, Oxfam Canada contracted with PEER Associates to conduct a case study of their partner organization in the Engendering Change (EC) program in Zimbabwe – Matabeleland AIDS Council (MAC). The EC program funds capacity-building of Oxfam Canada’s partner organizations. This program is built around a theory of change that suggests partners become more effective change agents for women’s rights when their organizational structures, policies, procedures, and programming are also more democratic and gender just.

The evaluation employed a case study approach, using MSC methodology to collect stories from MAC staff and their constituents. In this case study, PEER Associates trained MAC staff to conduct the MSC interviews, while the external evaluators documented the interviews with video and/or audio and facilitated discussions on the themes that emerged from those interviews.

Rad Resources:

The American Evaluation Association is celebrating CP TIG Week with our colleagues in the Community Psychology Topical Interest Group. The contributions all week come from CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi, I’m Abe Wandersman and I have been working since the last century to help programs achieve outcomes by building capacity for program personnel to use evaluation proactively.  The words “evaluation” and “accountability” scare many people involved in health and human services programs and in education.   They are afraid that evaluation of their program will prove embarrassing or worse and/or they may think the evaluation didn’t really evaluate their program.   Empowerment evaluation (EE) has been devoted to demystifying evaluation and putting the logic and tools of evaluation into the hands of practitioners so that they can proactively plan, implement, self-evaluate, continuously improve the quality of their work, and thereby increase the probability of achieving outcomes.

Lesson Learned: Accountability does not have to be relegated solely to “who is to blame” after a failure occurs (e.g., problems in the U.S. government’s initial rollout of the health insurance website, and Secretary of Health and Human Services Kathleen Sebelius’ resignation, or the Veterans Administration scandal, and Secretary Shinseki’s resignation). It actually makes sense to think that individuals and organizations should be proactive and strategic about their plans, implement the plans with quality, and evaluate whether or not the time and resources spent led to outcomes. It is logical to want to know why certain things are being done and others are not, what goals an organization is trying to achieve, that the activities are designed to achieve the goals, that a clear plan is put into place and carried out with quality, and that there be an evaluation to see if it worked. EE can provide funders, practitioners, evaluators, and other key stakeholders with a results-based approach to accountability that helps them succeed.

Hot Tip: I am very pleased to let you know that in September 2014, there will be a new EE book: Empowerment Evaluation: Knowledge and Tools for Self-Assessment, Evaluation Capacity Building, and Accountability (Sage, 2nd edition), edited by Fetterman, Kaftarian, & Wandersman. Several chapters are authored by community psychologists, including: Langhout and Fernandez describe EE conducted by fourth and fifth graders; Imm et al. write about the SAMHSA service-to-science program that helps practice-based programs reach evidence-based criteria; Haskell and Iachini describe empowerment evaluation in charter schools to reach educational impacts; Chinman et al. describe a decade of research on the Getting To Outcomes® accountability approach; Suarez-Balcazar, Taylor-Ritzler, & Morales-Curtin describe their work on building evaluation capacity in a community-based organization; and Lamont, Wright, Wandersman, & Hamm describe the use of practical implementation science in building quality implementation in a district school initiative integrating technology into education.

Rad Resources:

The American Evaluation Association is celebrating CP TIG Week with our colleagues in the Community Psychology Topical Interest Group. The contributions all week come from CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi! I’m Wendi Siebold, President of Strategic Prevention Solutions, a consulting firm that works to address and prevent social and health problems through research, evaluation and training. We spend a lot of time in communities working with non-profit organizations to improve staff and organizational evaluation capacity. Currently, we are the “empowerment evaluator” for domestic violence and/or sexual assault organizations in Alaska, Idaho and Florida.

Empowerment evaluators act as coaches, or critical friends, for the people who actually implement evaluation activities. There are a number of tensions in the balancing act of coaching someone’s capacity building. What I’m highlighting today is the tension between organizational capacity for evaluation and realistic expectations of “empowerment.”

Empowerment evaluators aim to improve a person’s or organization’s capacity to a notch above where they start. However, it’s essential for the evaluator and client to be on the same page about the organization’s ability to devote resources to evaluation and to define their desired level of capacity. For example, how much time do staff have to enter survey data? Who can review the data and find the story to report? Does a scale score need to be calculated? Usually people want to evaluate; it’s simply that they don’t have the resources. This is why it’s vital to start capacity building only after knowing where you will end. Let’s practice what we preach and determine capacity-building goals. Even if you build skills, will this get your client to a finished product? Isn’t the merit of program evaluation to improve the program and reach outcomes? If you never get to the stage of finding the “story” in the data or having findings to use, was the evaluation meaningful?
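On that last question, a scale score is typically just the mean (or sum) of a set of related survey items, with any reverse-worded items flipped first. Here is a minimal Python sketch, using invented items on a 1-to-5 scale, of what that step actually asks of staff:

```python
# Invented example: five items on a 1-5 scale, where q3 is reverse-worded
# and must be flipped before averaging.
responses = {"q1": 4, "q2": 5, "q3": 2, "q4": 4, "q5": 3}
reverse_coded = {"q3"}
scale_max = 5

scored = {
    item: (scale_max + 1 - value) if item in reverse_coded else value
    for item, value in responses.items()
}
scale_score = sum(scored.values()) / len(scored)
print(f"Scale score: {scale_score:.2f}")  # Scale score: 4.00
```

Even a calculation this small implies someone with the time and comfort to do it for every respondent, which is exactly the feasibility question to settle up front.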

Hot Tip: Figure out organizational and staff capacity for evaluation immediately, before jumping into building capacity. A fatal flaw of empowerment evaluation is that the true time and resources needed to move from writing outcomes to summarizing findings are greater than most nonprofit staff, and even evaluators, realize. This requires diligence on the part of the evaluator – you’re the person who understands the reality of how many resources each step of an evaluation process will take. It is only after you have this discussion about feasibility that you can effectively coach your clients to complete evaluation work, and not leave them stranded in a pile of unanalyzed data. That just gives evaluation the bad name people have come to expect, and we’re better than that!

Rad Resources:

The American Evaluation Association is celebrating CP TIG Week with our colleagues in the Community Psychology Topical Interest Group. The contributions all week come from CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello. We are Tara Gregory, Director of Research and Evaluation at Wichita State University’s Center for Community Support and Research, and Natalie Wilkins, Behavioral Scientist at the Centers for Disease Control and Prevention. We’re members of the Leadership Council for the Community Psychology TIG and are excited to introduce this week’s blogs highlighting connection among empowerment, evaluation and community psychology.

As community psychologists who are evaluators, we often think of the tenet of meeting people where they are. “Where people are” with respect to evaluation may be overwhelmed, confused, and even resistant. This is not a criticism of those trying to make a difference in our communities, but rather a recognition of the need to approach evaluation from an empowerment perspective – both in helping people learn evaluation themselves and in providing the results of our own evaluations in a way that helps empower people. Either way, the role of the community psychologist in evaluation is to meet people where they are and walk with them as a partner, with the intention of preparing them to go forward independently.

Lessons Learned:

  • Empowerment evaluation – Listening to stakeholders is essential. Often, people are resistant to evaluation because they are overwhelmed by the idea of having to do something outside their area of expertise. Listening to stakeholders’ stories about how their program works, and how they know it works, can reveal strengths and evaluation capacity that people and programs never knew they had. Lots of folks have the building blocks of evaluation in place already – they’re just not calling it “evaluation”!
  • Facilitating reflection – Encouraging reflection on evaluation results and helping people come to their own conclusions is a way to create ownership and empowerment to continue good work or make changes where needed.
  • Qualitative methods – Offering an opportunity for people to share their own stories as part of an evaluation can also be empowering, particularly when they’re encouraged to focus on strengths, successes, resiliency or other positives that sometimes get lost.

Hot Tip:

  • Check out the Empowerment Evaluation TIG! They host their own blog weeks, webinars, and many other educational opportunities. Many of us community psychologists belong to this group and gain valuable knowledge and skills through membership.

Rad Resources:

These teaching materials are designed to introduce individuals to empowerment evaluation and intended to be a resource for facilitating an introductory lecture on the topic.

Dr. David Fetterman’s blog provides a range of resources on empowerment evaluation theory and practice, including links to videos, guides and relevant academic literature.

The American Evaluation Association is celebrating CP TIG Week with our colleagues in the Community Psychology Topical Interest Group. The contributions all week come from CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Greetings from Mary Crave and Kerry Zaleski of the University of Wisconsin-Extension and Tererai Trent of Tinogona Foundation and Drexel University. For the past few years we’ve teamed up to teach participatory methods for engaging vulnerable and historically under-represented persons in monitoring and evaluation. We’ve taught hands-on professional development workshops at AEA conferences, eStudies, and Coffee Breaks. “Visionary Evaluation for a Sustainable, Equitable Future” is not only the theme for Evaluation 2014, it is a succinct description of why we believe so strongly in what we teach.

Lessons Learned: We’ve noticed during our trainings around the world that there is a continuum of what an evaluator might consider to be “participatory.” Being aware of our own position and philosophy of participatory methods is especially critical when working with persons who traditionally may have been excluded from participation due to income, location, gender, ethnicity, or disability. Trent suggests these lenses or levels, from low to high: Spectator Participation > Tokenism Participation > Incentive Participation > Functional Participation > Ownership Participation. The more ownership, or the higher the level of participation, the more impact a program will have on social justice issues and sustainable, equitable futures for people. Those who want their methods lens to focus on “ownership participation” sometimes have trouble reaching that aim because they have a small tool box or get stuck using the wrong tool at a particular point in the program cycle. Rubrics for success often leave out the voices of the vulnerable, though those voices can be included using participatory tools.

Hot Tips:

  • There are M&E tools especially suited to working with vulnerable persons that allow all voices to be heard, that do not depend on literacy skills, that consider cultural practices and power relationships in decision making and discussion, and that engage program beneficiaries in determining rubrics for success. These tools can be used in the planning, monitoring, data collection, analysis, and reporting stages of the program cycle.
  • You can expand your tool box of methods, and widen your lens on participatory methods, at our 2-day workshop at AEA 2014, Reality Counts (Workshop #6). We’ll be joined by Abdul Thoronka, an international community health specialist and manager of a community organization that works with persons with disabilities.

Rad Resources: Robert Chambers’ 2002 book, Participatory Workshops: A Sourcebook of 21 Sets of Ideas and Activities. The Food and Agriculture Organization (FAO) of the UN: click on publications and type PLA in the search menu. AEA Coffee Break Webinar 166: Pocket-Chart Voting: Engaging vulnerable voices in program evaluation, with Kerry Zaleski, December 12, 2013 (recording available free to AEA members). Want to learn more? Register for Reality Counts: Participatory methods for engaging marginalized and under-represented persons in M&E at Evaluation 2014.

This week, we’re featuring posts by people who will be presenting Professional Development workshops at Evaluation 2014 in Denver, CO. Click here for a complete listing of Professional Development workshops offered at Evaluation 2014. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello, my name is Jeanne Hubelbank. I am an independent evaluation consultant. Most of my work is in higher education where, most recently, I help faculty evaluate their classes, develop proposals, and evaluate professional development programs offered to public school teachers. Sometimes, I am asked to make presentations or conduct workshops on evaluation. When doing this, I find it helpful to know something about the audience’s background. Clickers, hand raising, holding up colored cards, standing up, and clapping are all ways to approach this. A recent aea365 post, Innovative Reporting Part I: The Data Diva’s Chocolate Box, which showed how to present results on candy wrappers, served as the impetus for another way to introduce evaluation and to assess people’s understanding of it.

Instead of results, write evaluation terms such as use, user, and methods on stickers and place them on the bottom of Hershey’s Kisses®, one word per kiss. Participants arrange their candy in any format that they think represents how one approaches the process of conducting an evaluation. This gives you a quick view of how participants see evaluation, and most people like to eat the candy afterwards.

Hot Tips:

  • Use three-quarter-inch dots
  • Hand write or print terms you want your clients to display
  • Besides Hershey’s Kisses®, provide Starbursts® for those who are allergic or averse to chocolate
  • Use different colored kisses for key terms, such as use and uses in silver and assessment in red, for a quick view on where people place them in the process
  • Wrap each collection of candy terms into a piece of plastic wrap and tie with a curled ribbon
  • Ask people to arrange candy in any format that they think represents how one approaches the process of doing an evaluation
  • You can do this before and after a presentation, but if you do it again, remind people to wait to eat.

Rad Resources:

Susan Kistler’s chocolate results

Stephanie Evergreen’s cookie results and her book Presenting Data Effectively: Communicating Your Findings for Maximum Impact.

Hallie Preskill and Darlene Russ-Eft’s book Building Evaluation Capacity: 72 Activities for Teaching and Training.

Michael Q. Patton’s book Creative Evaluation.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m David Fetterman, evaluator, author, entrepreneur, and Google Glass user. Yesterday, we talked about what Google Glass is and how it can revolutionize communications. Today, let’s turn to thinking about how Glass could be used as an evaluation tool.

[Photo: David Fetterman’s son trying out Google Glass]

Hot Tips – Glass for Empowerment Evaluation: Youth (with parental permission) can wear Glass to produce photovoice productions, sharing pictures of their neighborhoods and videos of their activities. It’s easy (and fun) – that’s my son over on the right trying out Glass. Their stories can be used as part of their self-assessment, gaining insight into their lives and potentially transforming their worlds.

Community and staff members can post their digital photographs (and videos) on a common server or blog while conducting their self-assessment with the blink of an eye. This ensures community access, a sense of immediacy, and transparency.

Community and staff members can use Google Hangout on Glass to communicate with each other about their ratings, preliminary findings, and plans for the future.

Hot Tips – Glass for Traditional Evaluation: Evaluators can use it to communicate with colleagues on the fly, share data (including pictures and video) with team members, and conduct spontaneous videoconference team meetings. Note that not everyone needs to have Glass: Glass users can leverage its capabilities while connecting with others who are using smartphones or computers.

Glass date-stamps photos, videos, and correspondence, ensuring historical accuracy.
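If you ever need to verify that chronology, the date stamps can be read back out of the image files. Here is a minimal Python sketch using the Pillow library, assuming the photos carry standard EXIF metadata and sit in a hypothetical glass_photos folder:

```python
from pathlib import Path

from PIL import Image  # Pillow

# Hypothetical folder of photos copied off Glass; assumes standard EXIF tags.
for photo in sorted(Path("glass_photos").glob("*.jpg")):
    exif = Image.open(photo).getexif()
    # Tag 306 is the standard EXIF DateTime tag ("YYYY:MM:DD HH:MM:SS").
    print(photo.name, exif.get(306, "no date stamp"))
```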

Glass can be used as an effective “ice breaker” to gain access to a new group.

Evaluators can also solicit feedback from colleagues about their performance, with brief videos of their data collection and reporting behavior. There is a precedent for this type of critique – assessments of student teaching videos.

Glass can be used to provide “on the fly” professional development with streaming video of onsite demonstrations for colleagues working remotely.

In addition, Glass can help maximize evaluators’ multi-tasking (when appropriate).

Lessons Learned – Caveats:

Take time to get to know people before disrupting their norm with this innovation.

Plan to use it over time to allow people to become accustomed to it and drop their company manners.

Respect people’s privacy. Ask for permission to record any behavior.

Do not use it in bathrooms, while driving, or in settings requiring additional sensitivity, e.g., bars, gang gatherings, and funerals.

In the short term, expect the shock factor, concerns about invasion of privacy, and a lot of attention. Over time, as the novelty wears off and Glass becomes more commonplace, it will be less obtrusive than a bag of digital cameras, laptops, and smartphones.

Rad Resources:

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


“Ok, Glass.” That’s how you activate Google Glass. I’m David Fetterman, and that’s me to the right wearing Google Glass. I’m an empowerment evaluation synergist and consultant, busy father and spouse, and owner of Fetterman & Associates.

Rad Resource – Google Glass: Google Glass is a voice and gesture activated pair of glasses that lets you connect with the world through the internet. You can take a picture, record a video, send a message, listen to music, or make a telephone or video call – all hands free.

Hot Tips – Redefining Communications: Google Glass is not just another expensive (currently about $1500) gadget. It can free us up to do what we do best – think, communicate, facilitate, and, in our case, assess. Here is a brief example.

I said “Ok, Glass,” then “make a call to Kimberly James.” She is a Planning and Evaluation Research Officer I am working with at the W.K. Kellogg Foundation.

Kimberly asked how the evaluation capacity building webinar was coming along. Via Glass, I took a screenshot and mailed it to her so we could discuss it. When a colleague was mentioned, a few swipes of my finger on the frame found a picture on the web, and I miraculously remembered who we were talking about.

Mid-conversation, Kimberly needed to step away briefly. While on hold, I sent a note to colleagues in Arkansas to ask them to check on the data collection for our tobacco prevention empowerment evaluation.

Kimberly returned to the call and we discussed a recent survey. With a simple request, the display of our results appeared, reminding me what the patterns look like.

Did I mention that I did all of these things while making lunch, picking up my son’s clothes off the floor, letting the dogs out, and emptying the dishwasher?

Later in the day, with a tap on the frame, I confirmed our scope of work with Linh Nguyen, the Vice President of Learning and Impact at the Foundation, while dropping my son off for piano lessons.

Later in the week I plan to use Google Hangout to videoconference with another colleague using Glass. When she connects during a project site visit, she will be able to take pictures and stream video of her walk around the facilities, bringing me closer to the “hum and buzz” of site activities.

Lessons Learned:

Respect people’s privacy – do not wear Google Glass where it is not wanted, will put people off, or will disrupt activities. Do not take pictures without permission. Remove it when you enter a bathroom.

Rad Resources:

Hot Tip: Stay tuned for Part II tomorrow when I will cover using Google Glass as an evaluation tool.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

