AEA365 | A Tip-a-Day by and for Evaluators

CAT | Extension Education Evaluation

Howdy! I am Kevin Andrews, a program specialist at Texas A&M AgriLife Extension Service. In addition to my Extension duties, I co-teach a graduate evaluation course at Texas A&M University.

I came across a post from March about students partnering with community agencies to apply their evaluation skills. I’d like to build upon Dr. Brun’s idea for evaluators who have ties to a university, especially those in Extension.

Many of our students have no idea what Extension (or any other agency) is. An engaged university seeks to tie together the scholarships of teaching, research, and service, and hands-on evaluations are a perfect way to accomplish this.

Lessons Learned: When students partner with us on evaluations, they not only receive practical experience and make an impact but also get to learn who we are. This can aid in recruiting talented students to work for the agency; we’ve had several ask about careers in Extension.

Hot Tip: Students are going to ask a lot of questions. We can get pretty set in our ways and think we know our agency well, but when you have to pause and explain in basic terms why we do what we do, you are forced to reflect on exactly why we have been doing things a certain way all these years!

Hot Tip: Our employees just want their voices heard. With students conducting interviews, we get far more coverage than a single evaluator using a sample, and employees feel that their opinions matter. Our staff is also much more likely to be open with a student than with a peer.

Lessons Learned: I like to be in total control of my projects, but part of delegating work is letting others do their own thing. By developing goals together early in the project, I can ensure the outcome is what I intended while allowing students to experiment and develop their own processes.

Hot Tip: Often, when a class is over, the student-teacher relationship ends. Keep contact information and follow up with students a year later to let them know the impact of their work. No matter where life takes them, they are your stakeholders and you want them to hold you in high esteem.

Lessons Learned: I’m lucky to get to straddle teaching and Extension. For those who don’t, simply reach out and ask! I’ve been approached by others with projects for students, and I’ve approached others with projects of my own. Everyone has something they need done!

Two years ago, I was the student participating in a class evaluation. Three from my class, including me, now work for Extension, and our report generated $200,000 in funding – the model works!

The American Evaluation Association is celebrating Extension Education Evaluation (EEE) TIG Week with our colleagues in the EEE Topical Interest Group. The contributions all this week to aea365 come from our EEE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings! My name is Suzanne Le Menestrel and I am a National Program Leader for Youth Development Research at the 4-H National Headquarters, National Institute of Food and Agriculture, U.S. Department of Agriculture. 4-H is a national youth development organization serving 6 million youth throughout the country. We partner with the nation’s Cooperative Extension system, operated by more than 100 land-grant universities and colleges, and with National 4-H Council, our private, non-profit partner. Recent trends in funding have elevated the importance of illustrating impact and accountability for nonformal educational programs. We were also interested in building capacity for evaluation through the creation of easy-to-use and accessible tools. We partnered with National 4-H Council, state 4-H program leaders, 4-H specialists, and Extension evaluators from around the country to create a national 4-H common measures system that will also enable us to aggregate data across very diverse 4-H programs.

I have learned a number of lessons through the implementation of this new system.

Lessons Learned:

    • Common measures must be developmentally appropriate. Children and youth who participate in 4-H range in age from 5 to 19. Because of concerns about reading levels and developmental appropriateness, we focused the common measures on ages 9 to 18. We also divided the measures into two levels: one for children and youth in grades 4 through 7 and one for youth in grades 8 through 12.
    • Common measures must have strong psychometric properties. As much as possible, we drew from existing measures, but we have been conducting analyses with both pilot and preliminary data (see the reliability sketch after this list).
    • Measures must be applicable to a broad variety of programs. 4-H looks very different from county to county and state to state. We started with the creation of a national 4-H logic model that represents desired program outcomes (http://www.4-h.org/about/youth-development-research/).
    • Common measures must be available through a flexible, easy-to-use, and robust online platform. This includes the ability to add custom items.
    • Training and technical assistance are key to the implementation of common measures in a complex, multi-faceted organization such as 4-H.
    • Buy-in and support from stakeholders are critical, as is creating an ongoing system for soliciting stakeholder feedback.
    • Such a system cannot be developed without sufficient funding to support the online platform, technical assistance, and ongoing formative evaluation.
    • Common measures are a flexible product that needs to grow and change with the outcomes of the organization.
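
To make the psychometric point concrete, here is a minimal sketch, not the 4-H team’s actual analysis, of one standard reliability check: Cronbach’s alpha computed on pilot data for a multi-item scale. It assumes Python with pandas, and the data and column names are hypothetical.

import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    # alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total score)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot data: five respondents rating three scale items from 1 to 5.
pilot = pd.DataFrame({
    "item1": [4, 5, 3, 4, 2],
    "item2": [4, 4, 3, 5, 2],
    "item3": [5, 5, 2, 4, 3],
})
print(round(cronbach_alpha(pilot), 2))  # 0.89 here; values near 0.9 suggest strong internal consistency

An alpha this high would support keeping a scale as-is; a low value would send the team back to item wording or to the existing measures it drew from.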

Rad Resource:

Check out this article written by Pam Payne and Dan McDonald on using common evaluation instruments.

The American Evaluation Association is celebrating Extension Education Evaluation (EEE) TIG Week with our colleagues in the EEE Topical Interest Group. The contributions all this week to aea365 come from our EEE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Melissa Cater, and I am an assistant professor and evaluation specialist at Louisiana State University AgCenter. I am also serving as Chair of the AEA Extension Education Evaluation Topical Interest Group (EEE-TIG) this year. The EEE-TIG provides a professional development home for Extension professionals who are interested in program evaluation; we also welcome other individuals who are evaluating non-formal education outreach programs in a community setting. The EEE-TIG goals provide a guiding framework for the membership.

Hot Tip: Our TIG has provided a place for Extension professionals to become more collaborative. If you are searching for a way to become more involved in evaluation, join a TIG. The networking opportunities are endless.

This week’s aea365 blog posts are sponsored by the EEE-TIG. I invite you to learn more about who we are through this week’s series of posts. You’ll see that we have a range of interests within our membership, from evaluating agricultural programs, to teaching evaluation, to supporting participatory community research, to building evaluation capacity.

Hot Tip: You can learn even more about the EEE-TIG and the varied interests of our members by viewing our archived blog posts.


Hot Tip: Want to learn about the diversity of programs that are being evaluated in Extension? Check out the Journal of Extension (http://www.joe.org) to see the breadth of topics.

The American Evaluation Association is celebrating Extension Education Evaluation (EEE) TIG Week with our colleagues in the EEE Topical Interest Group. The contributions all this week to aea365 come from our EEE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Greetings from the Last Frontier. I’m Alda Norris, webmaster for the Alaska Evaluation Network (AKEN) and evaluation specialist for the University of Alaska Fairbanks Cooperative Extension Service (CES).

The faculty and staff I work with at CES are experts in a variety of fields, from horticulture, entomology and forestry to economics, nutrition and child development. That adds up to quite an interdisciplinary organization! Our diversity makes for fantastic collaborations, as well as complicated syntheses. Lucky for me, my PhD is in interpersonal communication, which applies across the board.

Lessons Learned: Ask people to tell you the inspiration behind their projects. Every group has a story to tell. What common goals bring these people together? Inquiring about the “why” and not just the “what” of a program really benefits capacity-building efforts. I got to know CES better while writing a Wikipedia entry. Hearing and reading about the contributions Extension has made in Alaska since the 1930s deepened my understanding of what led up to each of our programs’ current priorities and logic models.

  • Help yourself with history. Too often we are mired in a static view of where an organization is now, rather than having an appreciation for how it has changed, and continues to change, over time. Even in a “young” state like Alaska, there is rich historical data we can learn from.
  • Boost your evaluation planning by gathering information on your or the client organization’s “story” from a variety of sources. Talk to emeritus professors, compare today’s org chart with those of past decades, and comb through newspaper archives. Becoming familiar with past waves of change is very helpful in understanding the meaning behind current missions, goals, and structures (and people’s attachments to them).

Hot Tip: Communicate about communication! Add a question about communication preferences to your next needs assessment. Don’t assume you know what level of technology and form(s) of interaction your colleagues and clients are comfortable with. Before you do a survey, figure out what modes of communication the target population values. For example, if oral history is a large part of a sample group’s culture, how well will a paper-and-pencil form be received?
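
A needs assessment item along these lines (hypothetical wording, not a validated question) can surface those preferences: “How would you prefer to receive information from us? Select all that apply: (a) email, (b) printed newsletter, (c) phone call, (d) in-person visit or workshop, (e) social media, (f) other (please describe).” The “other” option matters; it leaves room for modes, such as oral formats, that you did not anticipate.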

Rad Resources:

  1. The National Communication Association (NCA) can help you step up your message design game. Take advantage of free advice from experts on verbal and nonverbal communication by reading NCA’s newsletter, Communication Currents.
  2. AnyMeeting is a free tool that you can use to reach a wider audience. With it, you can host online meetings and make instructional videos, both of which are really handy when working in a geographically diverse setting. AnyMeeting’s recordings also offer a screen-share clarity that Google Hangouts lacks.

The American Evaluation Association is celebrating Alaska Evaluation Network (AKEN) Affiliate Week. The contributions all this week to aea365 come from AKEN members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Melissa Cater, and I am an assistant professor and evaluation specialist at Louisiana State University AgCenter. I also coordinate and contribute to the eXtension Evaluation Community of Practice (Eval CoP) blog.

Rad Resource – eXtension Evaluation Community Blog: The Eval CoP blog is a resource for evaluators of all levels, from novice to expert. Content covers the basics of designing and conducting evaluations. It also centers on practical ideas for evaluating programs in nonformal educational settings like Cooperative Extension, libraries, and museums, just to name a few. Our blog is relatively new, with content posted one to four times each month on Fridays.

Hot Tips – favorite posts: Two recent series of posts focused on survey design and social media evaluation. Here is a sample of posts from those series:

Lessons Learned – why I blog: As professional development needs increased and travel dollars decreased, I found myself turning to more informal types of professional development. Blogs quickly became my favored means of learning. I appreciated the anytime, anyplace access to learning. The more I learned, the more I felt the need to give back to our field through blogging. As our Cooperative Extension evaluation community of practice evolved, our group has embraced the flexibility the blog provides for connecting with colleagues through shared work and a shared product.

Lessons Learned – what I’ve learned: Blogging is hard work. Early lessons I learned were to generate content ideas for several months at a time, to set aside a specific time and day of the week to work on the blog, and to create a system for engaging and following up with contributors.

This winter, we’re continuing our series highlighting evaluators who blog. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Melissa Cater, and I am an assistant professor and evaluation specialist at Louisiana State University AgCenter.

The purpose of the Evaluation CoP is to support people within the Cooperative Extension system who are interested in evaluating Extension programs. While the focus of our group is supporting Extension evaluation efforts, the information is applicable to many program evaluators. Evaluators working in other nonformal educational settings may find useful resources as well as a place for engagement with like-minded professionals.  AEA Topical Interest Groups (TIGs) or new CoPs may consider a similar approach of engaging core members to build synergy with a larger group.

Rad Resource:

One of the resources provided by the Evaluation CoP is a list of 97 frequently asked questions (http://www.extension.org/program_evaluation/faqs?page=1). It provides answers on topics like evaluation design and data analysis, along with links to archived webinars on evaluation topics ranging from impact reporting to the use of photolanguage.

Rad Resource:
The core mission of a CoP is to provide a place to share information and experiences among group members. Providing structures to advance group communication is key to group success. The Evaluation CoP offers members an opportunity to connect via social media.

Our CoP Facebook page provides a platform for members to connect, to pose questions to the group, and to share success stories. Twitter offers another means of connection; interested individuals may follow us at @EvalCoP. Our group also sponsors a blog, the eXtension Evaluation Community Blog. While our blog is in its infancy, our vision is for contributors to supply more in-depth evaluation information.

Lessons Learned:

Our Evaluation Community of Practice story illustrates Wenger’s (1998) principles of CoP structure. These principles of mutual engagement, joint enterprise, and shared repertoire (pp. 72-73) are easily applicable to AEA TIGs or to any other group of evaluators who share a common evaluation interest.

  • Mutual Engagement. Shared work has helped our members build relationships that span the nation. This social network forms the foundation of our work as a CoP. We have discovered both our individual and shared strengths and use these to benefit the CoP’s work.
  • Joint Enterprise. By striving together to improve our evaluation practice within the Cooperative Extension system, we have built a shared understanding of our work. We recognize the commonalities we share with other evaluators as well as the unique aspects of our craft as Extension evaluators.
  • Shared Repertoire.  Our ongoing work as a collaborative community, creating resources like the frequently asked questions and webinars, has resulted in shared work and shared products.

In the words of Henry Ford, “Coming together is a beginning. Keeping together is progress. Working together is success.”

The American Evaluation Association is celebrating Extension Education Evaluation (EEE) TIG Week with our colleagues in the EEE Topical Interest Group. The contributions all this week to aea365 come from our EEE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

 


Hi! I’m Sarah Baughman, Evaluation and Research Leader for eXtension. eXtension is the virtual presence of the Cooperative Extension System. We work with faculty from land-grant universities to provide unbiased, research-based educational resources across a wide range of topics, including feral hogs, fire ants, and families, through our 70 communities of practice. A major aspect of my daily work is assisting communities of practice with evaluation efforts. Although most faculty are familiar with evaluation basics, the virtual work environment tends to confound them.

Hot Tip – Back to Basics – When working with faculty on evaluations for programs that involve social media and/or web-based resources, I take them back to the basics. I help them situate their social media and virtual “tools” in the context of their programs by asking lots of questions that point back to evaluation basics such as programmatic mission, purpose, and goals. Why are they tweeting? What do they hope to achieve by integrating social media into their programs?

Lesson Learned – Capacity building is an ongoing process. The landscape of our work changes rapidly, with new faculty coming on board, new technologies being developed, and new communities of practice forming. As one faculty member embraces evaluation as a critical component of their work, another community of practice changes leadership, necessitating renewed capacity-building efforts.

Lesson Learned – Another key to working with faculty immersed in their disciplines is to show them how evaluation methodologies are similar to their research methods. The purpose of evaluation is different from that of research, but the methodologies are the same.

Rad Resource – Google+ Hangouts have proven to be an invaluable resource for one-on-one or group meetings. Hangouts are free video conferences that allow screen sharing and are mobile-device friendly, so busy faculty can meet from almost anywhere. Screen sharing allows me to walk through tools with them or troubleshoot issues that are difficult to describe in other contexts.

Rad Resource – There is a lot of information on social media marketing and measurement, but it is primarily aimed at for-profit businesses. In the world of education and nonprofits, the goals and outcomes can be fuzzy. Measuring the Networked Nonprofit by Beth Kanter and Katie Delahaye Paine does an excellent job of describing the importance of measuring social media and, more importantly, how measurement can help change practice.

Rad Resource – One of the eXtension communities of practice is devoted to helping Extension professionals evaluate programs and demonstrate programmatic impacts. Their work is helpful to non-formal or informal educators, non-profits and anyone working on evaluation in a complex, decentralized environment. Connect with us at @evalcop or blogs.extension.org/evalcop/.


The American Evaluation Association is celebrating Extension Education Evaluation (EEE) TIG Week with our colleagues in the EEE Topical Interest Group. The contributions all this week to aea365 come from our EEE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi, I’m Nancy Franz, Associate Dean for Extension and Outreach and Director of Iowa State University Extension and Outreach to Families.

Organizational sustainability for many agencies, nonprofit organizations, and groups now relies on translating the value of what they do for a wider audience than in the past. Principles of public-sector economics can help program evaluators determine and share the public value of programs. Public value has been defined by Laura Kalambokidis, an economist at the University of Minnesota, as the value of a program to those who do not directly benefit from it. This contrasts with the private or personal value that participants gain directly from programs, such as new knowledge or behavior change. Here are lessons I’ve learned as a program evaluator helping others measure and articulate the public value of their programs.

Lessons Learned:

- Be proactive with public value stories rather than waiting until decision-makers cut funding or programs.

- Start with early adopters, and nurture the mid-adopters. Don’t waste time on resisters.

- Build urgency with staff by using real stories about public lack of understanding or misunderstanding of their work (i.e., decision-makers who have cut funding or programs).

- Provide a wide variety of professional development opportunities for staff to enhance their public value thinking, skills, and story development.

- Secure public value champions at all levels of the organization to help catalyze change.

- Use many examples of the public value of the organization’s work to give staff deeply steeped in the private value of their efforts a tangible incentive to change their thinking and practice.

- Don’t underestimate the ability of clientele to determine, measure, and share the public value of programs.

- Encourage researchers to conduct research and share results connecting the private value of the organization’s work with public economic, environmental, and social condition changes.

- Create a strong statistical base for the relevance section of public value stories to make them more convincing and to make it easier to measure actual change due to the organization’s programs.

- Bridge field and administrative visions and actions around public value efforts through middle managers in the organization.

- Determine which programs should be supported solely by public funds and which solely by private funds by determining the public value of each program.

- Develop organization-wide templates with staff to provide a tangible and safe environment for changing thinking about program value, and help staff improve program development, implementation, and evaluation to more fully address public value.

- Involve economists, program evaluators, communications staff, and stakeholders in developing public value stories to tell the story more deeply and authentically.

- Urge administrators to enhance or catalyze public value throughout the organization.

Rad Resource:

Advancing the Public Value Movement: Sustaining Extension During Tough Times

The American Evaluation Association is celebrating Extension Education Evaluation (EEE) TIG Week with our colleagues in the EEE Topical Interest Group. The contributions all this week to aea365 come from our EEE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Kim Norris and I am an Evaluation Coordinator for University of Maryland Extension, specializing in organizational and systems change efforts.

Developing a culture of evaluation for non-formal educational organizations is critical to providing focus and accountability, yet challenging.  Funders and other stakeholders seek clearly communicated, robust, timely information about program impacts for target audiences.  However, if changes occur too quickly or without appropriate employee input, pushback can lead organizations to expend resources regaining employee loyalty.

Hot Tip: Follow these six steps when creating a culture of evaluation:

  1. Work from where your employees are. Communicate with employees about what they perceive to be the greatest benefits and challenges to achieving desired changes. Welcome feedback from employees throughout the process to avoid moving too fast or too slowly.
  2. Balance benefits for the employee with organizational evaluation needs. Every major change may mean additional work for employees, so clearly describe and implement advantages for employees as well. For example, if more time is expected to ensure certain data are collected, give employees kudos and reports that help them enrich their CVs.
  3. Develop a timeline with employees. Employees will be making changes along with you and need to agree to the timeline. Include deadlines for processes affecting employee workloads and those benefiting employees.  When possible, roll these out together.
  4. Provide training opportunities and data tools that match the needs of your audiences. Online data collection systems should be simple and intuitive to use. Employees and collaborators need training regarding new expectations, tools, and protocols. Utilize short, modular online trainings that create a “building blocks” approach, and provide training timelines that allow learning to occur at reasonable rates.
  5. Link expectations to performance review incrementally. If no link exists, efforts may stagnate; if too much is expected, pushback may occur. Expect employees to carry out one to two new processes well each cycle – e.g., Year 1: writing higher-quality program narratives; Year 2: utilizing quantitative outcomes tools, etc. Communicate items included in performance reviews in advance, and provide strong training and support mechanisms for those items. Include employee reviews of the process and of your leadership in it; a culture of evaluation includes evaluating your own efforts.
  6. Share good stories and reward good work. Communicate quality efforts internally and externally. Provide avenues for reward so that individuals see benefits for their efforts and others see what a job well done looks like.

Rad Resource: This table shows a sample timeline for integrating evaluation into multiple levels of programmatic decision-making and reporting.  The timeline shows how an iterative approach was implemented and efforts were made to balance employee gains with increased expectations and accountability.

The American Evaluation Association is celebrating Extension Education Evaluation (EEE) TIG Week with our colleagues in the EEE Topical Interest Group. The contributions all this week to aea365 come from our EEE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello, my name is Michael Lambur.  I am currently the Associate Director for Program Development with Virginia Cooperative Extension.  I have been an Extension evaluator since 1985 and recently spent five years as the Evaluation and Research Leader with the eXtension initiative.

Website usability testing measures the suitability of a website for its users: the effectiveness, efficiency, and satisfaction with which users can achieve specified tasks while using the site. Basically, a user is given a set of tasks to complete while being recorded thinking aloud about their experience. In my time with eXtension, we conducted two usability tests of the initiative’s public website. The first was done face-to-face by a contractor; it was well done, informative, and rather expensive. For the more recent one, we used an online usability testing service. Again, it was well done and informative, but this time amazingly inexpensive and very timely.

Lessons Learned:

Online usability testing services do work. We achieved essentially the same results using the online service as we did face-to-face, at a fraction of the cost. Our face-to-face usability testing cost $12,000 with 12 participants; our online usability testing cost $280 with eight participants. In addition, the online service provided trained testers based on demographics we specified, and we received a video of the results in about an hour.
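
In per-participant terms, that is $12,000 / 12 = $1,000 for the face-to-face test versus $280 / 8 = $35 for the online test, roughly 3.5 percent of the cost.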

The key to usability testing is a set of tasks that reflects the purpose of the site. You need to develop a set of very specific tasks the user can move through that truly reflects how you want them to understand and use the site. For the online service, the tasks needed to be completed in 15 minutes, whereas the face-to-face sessions lasted about 45 minutes. The quality of feedback received from the online testing was excellent; you can achieve a lot in 15 minutes.
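
To illustrate (these are hypothetical tasks, not the ones we used), a task set for a site like eXtension’s might include: “Find a research-based recommendation for managing fire ants in a home lawn,” “Locate a publication on family budgeting and download it,” and “Find out how to submit a question to a local expert.” Each task is specific, observable, and tied to what the site is meant to help users accomplish.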

Usability test results can be brutal.  Be prepared to have your bubble burst when viewing the video of people using your website.  What we intend in developing a site and what people experience in using it can be very different.  While it typically isn’t all bad, the results are often eye-opening.  Keep in mind that the end result is an improved website that best serves your users.

Rad Resources:

The two online services we used were:

The American Evaluation Association is celebrating Extension Education Evaluation (EEE) TIG Week with our colleagues in the EEE Topical Interest Group. The contributions all this week to aea365 come from our EEE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

