AEA365 | A Tip-a-Day by and for Evaluators

EEE TIG Week: Tom Archibald on Whose Extension Counts?

Hello, I’m Tom Archibald, Assistant Professor and Extension Specialist in the Department of Agricultural, Leadership, and Community Education at Virginia Tech.

Debates about what counts as credible evidence in program evaluation and applied social science research have been ongoing for at least 20 years.

Rad Resource: Those debates are summarized well in a helpful book on this very question, What Counts as Credible Evidence in Applied Research and Evaluation Practice?, edited by Stewart Donaldson, Tina Christie, and Mel Mark. In particular, the book provides a balance of viewpoints from both proponents and detractors of the position that experimental approaches are the “gold standard,” the best route to credible evidence.

Even long before that, questions of how to generate valid knowledge of the world around us—and specifically the role of experimentation—animated the scientific and aristocratic classes alike. In Leviathan and the Air-Pump, Steven Shapin and Simon Schaffer examined the debate between Robert Boyle and Thomas Hobbes over Boyle’s air-pump experiments in the 1660s, exploring acceptable methods of knowledge production and societal factors related to different knowledge systems.

The point of this post is this: Seemingly esoteric methodological debates about credible evidence are in fact fundamentally important political questions about life. This point is summed up by Bill Trochim and Michael Scriven, who said, respectively:

“The gold standard debate is one of the most important controversies in contemporary evaluation and applied social sciences. It’s at the heart of how we go about trying to understand the world around us. It is integrally related to what we think science is and how it relates to practice. There is a lot at stake.” (W. Trochim, unpublished speech transcript, September 10, 2007)

“This issue is not a mere academic dispute, and should be treated as one involving the welfare of very many people, not just the egos of a few.” (Scriven, 2008, p. 24)

Hot Tip: Epistemological politics (the ways in which power and privilege position some ways of knowing as ‘better’ and hierarchically ‘above’ other ways of knowing) are inextricably linked with ontological politics (whose reality counts, and how some reals are made to be more or less real, in practice, through various tacit or explicit power plays).

In the context of Cooperative Extension, and more specifically in the search for credible evidence about Extension, this nexus of epistemological and ontological politics raises the question: What is Extension?

For some (according to my research described here), it is a vehicle for dissemination of scientific information. For others, it is a site for grassroots knowledge sharing and deliberative democracy.

And, given that there appears to be (at least) a plurality of metanarratives about what Extension is, or (perhaps) an actual plurality of Extensions, the question then follows (playing on Robert Chambers’ influential title, Whose Reality Counts?): Whose Extension counts?

The American Evaluation Association is celebrating Extension Education Evaluation (EEE) TIG Week with our colleagues in the EEE AEA Topical Interest Group. The contributions all this week to aea365 come from our EEE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

·

I am Teresa McCoy, Assistant Director at the University of Maryland Extension (UME). I was hired 10 years ago with responsibility for evaluation and assessment across all program areas: agriculture, family and consumer sciences, environment and natural resources, and 4-H youth development. There was not a lot of evaluation knowledge and practice in the organization at that time, but there were high expectations for what I could achieve.

In a situation of n=1 (me), I had to get started on evaluation capacity building (ECB). The critical decision was how to go about that work. Fortunately for me, four circumstances came together to point the way. First, the publication of a special winter issue of New Directions for Evaluation (NDE), “Program Evaluation in a Complex Organizational System: Lessons from Cooperative Extension,” provided much guidance to me.

Rad Resource: The NDE special issue mentioned above, “Program Evaluation in a Complex Organizational System: Lessons from Cooperative Extension.”

Second, I am fortunate to be married to a software developer who has deep expertise in human interaction design and agile and lean software development. Over the years, I peripherally absorbed some knowledge and listened to stories from his industry. I realized my strategy to build evaluation capacity in UME had to include one of the principles of agile management: “Build projects around motivated individuals. Give them the environment and support they need and trust them to get the job done” (Agile Alliance, n.d.).

Rad Resources:

  • Read the Agile Manifesto Principles at https://www.agilealliance.org/agile101/
  • Want more in-depth information? Read Learning Agile: Understanding Scrum, XP, Lean, and Kanban by Andrew Stellman & Jennifer Greene

Third, soon after starting in the position, I began working with an agriculture extension educator who had a natural inclination for evaluation work. (In full disclosure, her area of expertise is agricultural economics.) A change in some of her position responsibilities gave us the opportunity to develop an official agreement: she would receive a small administrative stipend for taking on evaluation leadership in agriculture. With that guidance, many of our new Extension Educators now enter the job with enthusiasm for evaluation as part of their work.

Hot Tip:

  • The fourth and final circumstance, and the most critical, was that our Program Leaders/Assistant Directors supported me and the effort to build evaluation capacity. Without their support and leadership, building capacity would have been difficult, if not impossible.

About four years ago, UME added a Coordinator of Program Development and Evaluation (PD&E) to my evaluation department. Now, n=2, plus the rest of the capacity built within the program areas. I like to call my approach “Agile and Lean Evaluation Capacity Building” (AL-ECB).


·

My name is Chris Okafor and I am a Research for Development Coordinator at the International Institute of Tropical Agriculture (IITA) in Bukavu, Democratic Republic of the Congo (DRC). I would like to briefly share a tool we are using to assess farmers’ knowledge gain during training. I currently coordinate an integrated crops/livestock development project in the DRC, a position which brings me into contact with smallholder farmers. In addition to my doctoral specialization in program evaluation, I have a range of professional experience in capacity building, participatory extension, and project management.

During my work with farmers, especially farmer training using the farmer field school (FFS) approach developed by the Food and Agriculture Organization (FAO), we used a tool called the Ballot Box Test to assess and discuss farmers’ knowledge of good agricultural practices before and after training and to determine areas needing priority attention. We then adapted the tool to assess farmers’ ability to use a mobile phone to seek agricultural advisory services and market information from extension workers and clients.

In South Kivu Province of the DRC, we took 107 farmers (including 49 women), all beneficiaries of the Crops Livestock Integration Project (CLiP) invited for training in Information and Communication Technology (ICT), through the ballot process. The test consisted of 10 questions, each with three answer options (boxes). Each participant had a unique number written on 10 voting cards (one per question). The ballot papers could carry participants’ names, but this is discouraged to avoid turning the exercise into an examination; we have generally observed that adults tend to shy away from examinations. Participants worked through the questions in an orderly manner. At the end of “balloting,” the results were tabulated and discussed in a participatory, interactive way that helped identify areas needing priority attention, and some participants were co-opted as facilitators because of their level of knowledge. The exercise was conducted again at the end of the training to determine knowledge gain.

The Ballot Box method helped us document knowledge change. The pre-training exercise revealed that approximately 74% of farmers knew and understood basic features of mobile phones, including saving contacts, placing a call, and sending SMS. The post-training ballot exercise recorded an increase of nine percentage points, to 83%. Less than 10% of the farmers knew they could use mobile phones to seek agricultural advisory services (including veterinary services) and pass on or receive market information; that percentage increased to 85% after training.
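If each participant’s choices are recorded per question, tabulating pre/post results takes only a few lines. Here is a minimal Python sketch; the data structures, answer key, and toy numbers are illustrative, not the actual CLiP records:

```python
# Minimal sketch of tallying Ballot Box Test results before and after
# training (illustrative data, not actual CLiP records).

def percent_correct(ballots, answer_key):
    """ballots: {participant_id: list of chosen options, one per question}
    answer_key: list of correct options. Returns per-question share correct."""
    n = len(ballots)
    counts = [0] * len(answer_key)
    for choices in ballots.values():
        for q, (chosen, correct) in enumerate(zip(choices, answer_key)):
            if chosen == correct:
                counts[q] += 1
    return [c / n for c in counts]

# Toy pre/post ballots for 3 of the 10 questions, 4 participants.
key = ["A", "C", "B"]
pre = {1: ["A", "B", "B"], 2: ["C", "C", "A"], 3: ["A", "C", "B"], 4: ["B", "A", "B"]}
post = {1: ["A", "C", "B"], 2: ["A", "C", "B"], 3: ["A", "C", "B"], 4: ["A", "C", "A"]}

for q, (before, after) in enumerate(zip(percent_correct(pre, key),
                                        percent_correct(post, key)), 1):
    # Report gains in percentage points, as in the 74% -> 83% figure above.
    print(f"Q{q}: {before:.0%} -> {after:.0%} ({(after - before) * 100:+.0f} pts)")
```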

Hot Tip: Besides establishing gaps in farmers’ knowledge, the Ballot Box Test is a farmer-friendly tool that can be used to stimulate farmer interest in learning, increase what they know, and encourage knowledge sharing.

Rad Resource: Here is another example of how the Ballot Box has been used by other groups: http://ffs.ipm-info.org/ballot-box-test/


·

My name is Chelsea Hetherington and I’m an Evaluation Specialist with Michigan State University Extension. One of the many benefits of Extension work is that we reach people in local communities all around the country, but we also have the connections and resources of a big university. Several MSU Extension youth programs specifically focus on equipping youth with the skills and knowledge they need to be successful in college. For some programs, like 4-H Exploration Days, youth come to MSU’s campus for several days and stay in the dorms, attending sessions on different topics of interest and experiencing college life. In other programs, like 4-H Great Lakes and Natural Resources Camp, youth spend a week learning about environmental science, like fisheries and wildlife, ecology, and forestry, while also exploring careers and connecting with experts in these fields.

Post-event evaluations show that these programs equip youth with important college readiness skills, like time management, independence, and collaboration, along with increased knowledge of career fields and college majors. Still, a primary goal of these pre-college programs is for participation to increase youths’ college enrollment and subsequent degree attainment.

Rad Resource: National Student Clearinghouse has been an incredible resource for tracking past pre-college program participants. For a small fee, we submit records of our past program participants to National Student Clearinghouse on an annual basis. National Student Clearinghouse returns reports that tell us how many program alums are enrolled in college, as well as what schools they attend. We also re-submit records six years after youth graduated from high school to get reports on whether they’ve earned a college degree in that time. These data have been instrumental in demonstrating the value of our programs: we can show that alumni of Michigan 4-H pre-college programs are more likely to enroll in college on time and earn college degrees.
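Mechanically, the workflow amounts to maintaining a clean roster and aggregating the match results that come back. Here is a rough Python sketch; the actual StudentTracker submission layout is defined by the Clearinghouse, so the column names and results below are placeholders:

```python
import csv

# Placeholder roster columns; the actual National Student Clearinghouse
# (StudentTracker) file layout is specified by the Clearinghouse.
roster = [
    {"first": "Jordan", "last": "Lee", "dob": "2001-04-12",
     "hs_grad_year": 2019, "county": "Ingham"},
    {"first": "Riley", "last": "Kim", "dob": "2000-11-03",
     "hs_grad_year": 2018, "county": "Kent"},
]

# Write the submission file from the roster records.
with open("nsc_submission.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=roster[0].keys())
    writer.writeheader()
    writer.writerows(roster)

# When match results return, cohort rates are a simple aggregation
# (True = enrolled; hypothetical results).
matches = {("Jordan", "Lee"): True, ("Riley", "Kim"): False}
print(f"Cohort enrollment rate: {sum(matches.values()) / len(matches):.0%}")
```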

Hot Tip: Many states publish statewide education data that reports residents’ rates of college enrollment and degree attainment, and this data can be used to compare those rates for pre-college program alumni. In Michigan, this data is maintained by the Michigan Department of Education and is published online at MISchoolData.org. Data can be parsed in a number of different ways, which has allowed us to report comparison data by high school graduation year, county, school district, and more.

Lesson Learned: Leveraging existing data sources can be a great way to present powerful data without needing to track down individual participants. Good record keeping is key in this process! Make sure you keep detailed, well-organized records of past program participants, as well as important accompanying information (for us, this includes their date of birth and their county of residence).


·

Hello! My name is Mary Arnold, and I am a professor and 4-H Youth Development Specialist at Oregon State University. In my Extension specialist role, I get to work in that liminal space between research and practice. There I spend a great deal of time translating research into effective youth development practice in 4-H programs, which lately has focused on the importance of developing programs that provide a nourishing developmental context for young people. By developmental context I mean the settings, experiences, and activities that 4-H programs provide to help youth grow and thrive. This focus is important for two reasons: first, because high-quality programs lead to better outcomes, and second, because program quality is something youth practitioners have a great deal of control over. If practitioners develop and implement high-quality programs, my job as an evaluator is much easier.

As the field of youth development took hold in the 1990s, there was an immediate corresponding interest in measuring program outcomes. Unfortunately, outcomes were often measured without consideration of what really happened in the program. Fortunately, research about youth development programs is now increasingly focused on what happens in programs and how it leads to outcomes. For an evaluator, improving programs and articulating the theory of action that connects programs to outcomes increases the rigor of evaluation findings. Program quality matters! And in the case of youth programs, the pieces that make up a developmental context are increasingly well understood.

Some of these pieces include:

  • Helping youth find and enhance their “spark”
  • Surrounding youth with developmental relationships that express care, challenge growth, and share power
  • Paying attention to youth program quality principles such as belonging, safety, mattering, efficacy, and skill development
  • Engaging youth beyond mere attendance in programs to provide an experience that has depth, breadth, and duration.

These ingredients are not unique to 4-H programs; any youth program focused on positive youth development can emphasize the same ingredients.

Hot Tip

Consider evaluating your program’s developmental context and using the results to work on continuous program improvement. High-quality programs lead to better outcomes for youth.
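If you want to quantify a program’s developmental context, one common pattern is to average related survey items into construct scores. A minimal Python sketch follows; the item names and groupings are invented for illustration, not an established instrument:

```python
# Sketch of turning youth survey responses into developmental-context
# construct scores (item wording and groupings are illustrative).
from statistics import mean

CONSTRUCT_ITEMS = {
    "belonging": ["feel_welcome", "feel_included"],
    "mattering": ["ideas_heard", "adults_care"],
    "skill_building": ["learned_new_skill", "challenged_to_grow"],
}

def construct_scores(response):
    """response: {item: 1-5 Likert rating} -> {construct: mean item score}"""
    return {c: mean(response[i] for i in items)
            for c, items in CONSTRUCT_ITEMS.items()}

# One youth's (made-up) ratings, scored by construct.
youth = {"feel_welcome": 5, "feel_included": 4, "ideas_heard": 3,
         "adults_care": 4, "learned_new_skill": 5, "challenged_to_grow": 4}
print(construct_scores(youth))
```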


·

Greetings! We are Alda Norris, Evaluation Specialist at University of Alaska Fairbanks School of Natural Resources and Extension and Kendra Lewis, 4-H Evaluation Coordinator at University of California, Agriculture and Natural Resources. We are the Chair and Secretary, respectively, of the Extension Education Evaluation Topical Interest Group (EEE TIG). This TIG offers professional development for Extension professionals interested in and engaged in program evaluation. Our TIG is entering its 37th year!

This week, some of our members will be sharing their work with you, including tips on program quality from Mary Arnold; tracking college enrollment and degree attainment from Chelsea Hetherington; gathering farmer feedback from Chris Okafor; agile and lean evaluation capacity building from Teresa McCoy; and credible evidence from Tom Archibald.

If you’ve never visited a Cooperative Extension System (CES) office, know they exist across the U.S. and its territories. CES works to “extend” the research knowledge gained at land-grant universities to the public through fact sheets, workshops, consultations, maker spaces and beyond. Learn more from our federal partners at the National Institute of Food and Agriculture (NIFA). Our TIG also has members involved in the agricultural extension systems of other countries; you’ll read more about projects in the Democratic Republic of Congo in Chris’ post.

Lesson Learned: When you have been in an organization for a while, it’s easy to forget not everyone knows, or understands, its mission. CES is often called a “best kept secret” of community resources. Practice your elevator speech!

Hot Tip: Consider joining a TIG for networking. Who does similar or complementary work? Consult a list of members whose work has been highlighted in blogs or awards. Read about some of the past EEE TIG award winners and seek those folks out for a conversation at the annual conference and business meeting.

Hot Tip: Search for AEA affiliates and TIG pages on social media. Veteran evaluators often post helpful resources like evaluator Twitter handle lists. Make yourself a calendar invite to start spending time each week reading posts on discussion lists like EVALTALK.

Hot Tip: Look for other clearinghouses for sharing information and professional development in your discipline. Evaluators can browse the AEA library for past conference presentations, Coffee Break presentations, and other resources. In Extension, a great resource we have is eXtension.org.

Rad Resource: A great way to learn about the breadth of topics and current trends in any arena, including Extension, is to look at what folks are publishing about their work. Relevant journals for outreach include the Journal of Extension and the Journal of Human Sciences and Extension.

Rad Resource: Think beyond print. Listen to podcast interviews with authors from the recently published We’ve Tried That Before: 500 Years of Extension Wisdom.


 

·

My name is Samuel Pratsch and I am proud to say I have been working at the University of Wisconsin-Extension for the past 7 years in program development and evaluation. I was fortunate to work with Ellen Taylor-Powell, and I am honored to carry on her legacy working with logic models. Over the years, the University of Wisconsin-Extension has been the go-to source for logic model resources; however, for various reasons, our own internal use and development of logic models has failed to live up to our reputation. In her 2015 article, “Connecting the Dots: Improving Extension Program Planning with Program Umbrella Models,” Mary Arnold provides a well-reasoned explanation of ways the University of Wisconsin-Extension can improve our logic model capacity-building work. I agree with many of her ideas.

Lesson Learned:

In my own experience supporting extension educators and specialists, I have noticed that my colleagues have a good understanding of how to fill in the parts of a logic model, but there is an opportunity to increase their awareness of how those parts connect to one another.

Cool Trick:

In an effort to put the “logic” back into logic models, I have developed an innovative capacity-building approach to guide colleagues in making explicit the “pathways of change” for their programs. This approach focuses on increasing knowledge and use of “program logic” and “outcome chains” through a number of hands-on activities. I begin the session with an icebreaker that helps participants think about if/then relationships. We stand in a circle, and the first person says an “if” statement and then passes a ball of yarn to someone else while holding on to the string. The next person answers the “if” statement with a “then” statement, which encourages thinking about causal relationships.

Next I divide the group into pairs and have them work through a “pathways of change” process. I ask them to write about a change they would like to see happen in the next three years as a result of their program. Then, using “forward casting” and/or “backward casting” through a series of if/then statements, I have them sketch out the causal relationships between their activities and the intended outcomes of their program. They then review those relationships and look for assumptions and biases in their logic. To date, I have used this approach with a number of different groups at the University of Wisconsin-Extension and in a variety of ways. I have conducted two separate workshops where the focus was on individual programming. I have also used this approach with programmatic teams who wanted to learn more about the theory of change of their collective programming and develop shared measures for their work. I see a lot of potential in this new approach and look forward to building upon it in the future.
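Outside a workshop, the same chain can be made concrete by representing a pathway of change as ordered if/then links and checking it for gaps. A small Python sketch follows; the program content is hypothetical:

```python
# Sketch: an outcome chain as ordered if/then links, making the "logic"
# in a logic model explicit and reviewable (program content is hypothetical).
pathway = [
    ("educators attend two rain-garden workshops",
     "they can identify suitable planting sites"),
    ("they can identify suitable planting sites",
     "they install demonstration gardens in their counties"),
    ("they install demonstration gardens in their counties",
     "local residents adopt runoff-reducing practices"),
]

for condition, outcome in pathway:
    print(f"IF {condition}, THEN {outcome}.")

# Backward casting check: each link's "then" should feed the next link's
# "if"; a mismatch flags a hidden assumption or missing step in the logic.
for i in range(len(pathway) - 1):
    assert pathway[i][1] == pathway[i + 1][0], f"Gap after link {i + 1}"
```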

The American Evaluation Association is celebrating The Wisconsin Idea in Action Week, coordinated by the LEAD Center. The LEAD (Learning through Evaluation, Adaptation, and Dissemination) Center is housed within the Wisconsin Center for Education Research (WCER) at the School of Education, University of Wisconsin-Madison, and advances the quality of teaching and learning by evaluating the effectiveness and impact of educational innovations, policies, and practices within higher education. The contributions all this week to aea365 come from student and adult evaluators living in and practicing evaluation from the state of WI. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Pam Larson Nippolt and I am a University of Minnesota Extension Evaluation and Research Specialist working with a team of program evaluators in 4-H youth development programs.

Lesson Learned: Monitoring enrollment data is a data-related activity that often falls under the umbrella of program management. It enables program leaders to pay attention to some aspect of program implementation via inputs or outputs. What is monitored can be quite distinct from what is evaluated, but it can still inform the focus of an evaluation or the measurement of an outcome.

When planning with program teams, I use the example that monitoring is similar to setting a metronome while playing piano: it keeps a steady beat going to help the pianist stay in tempo. Evaluation, on the other hand, is the assessment the pianist and audience make about the music created.

Lesson Learned: Collecting, maintaining, and analyzing data for monitoring purposes are an investment of time and resources that can pay dividends for evaluation in the long run!

Enrollment databases, used in many large youth development programs, are excellent data sources for program monitoring, but they are often overlooked. For example, in 4-H, program data (shown below) revealed that the region containing the largest metropolitan area (Central Region) enrolled more youth from farms and small towns than had been believed to be the case.

[Chart: 4-H enrollment data by region and place of residence]

This finding seemed to be counter-intuitive and led to further investigation of the data. We discovered that many youth living in the city and participating in the program were not in the enrollment database because of a particular enrollment practice.

Monitoring the enrollment data led to an awareness of the need to make the process more accessible for all youth and families. Program staff may not have identified the scale of this discrepancy without this type of monitoring.

Hot Tip: Get started by “whetting the appetite” of your program partners for data use with available data about the program and participants. Build appealing and visually engaging graphics to make using the data rewarding for staff who don’t typically attend to data. Ask questions and listen to how they make sense of the data. This practice will reveal what can be monitored “right now” for team learning.

Rad Resource: Consider investing in making your enrollment database more usable and accessible to staff by adding trend and comparison features. Interfaces can be designed for your enrollment software that provide a dashboard with menus to track changes over program years and make geographic comparisons. Think like an interface designer to create tools and reports that will help program staff love their data!
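Even before investing in interface work, a few lines of code against an enrollment export can produce the kind of trend-and-comparison view described above. A minimal sketch using pandas; the column names and figures are placeholders for whatever your enrollment system produces:

```python
import pandas as pd

# Sketch of a monitoring summary from an enrollment export (column names
# and numbers are placeholders, not an actual 4-H extract).
df = pd.DataFrame({
    "year":      [2013, 2013, 2013, 2014, 2014, 2014],
    "region":    ["Central", "Central", "North", "Central", "Central", "North"],
    "residence": ["farm", "city", "farm", "farm", "city", "farm"],
    "youth":     [420, 180, 510, 455, 175, 495],
})

# Trend and comparison view: enrollment by region and residence, per year.
summary = df.pivot_table(index=["region", "residence"], columns="year",
                         values="youth", aggfunc="sum")
print(summary)
```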


·

My name is Brigitte Scott and I am the Evaluation and Research Specialist for the Military Families Learning Network (MFLN), which engages military family service professionals in high-quality, research-based professional development. The MFLN is part of the Department of Defense (DoD) and U.S. Department of Agriculture/National Institute of Food and Agriculture (USDA/NIFA) Partnership for Military Families, and is also part of eXtension, the online branch of America’s Cooperative Extension System (CES). Evaluation for the MFLN comes with a few challenges: leadership, PIs, and staff are spread out across the country; our cooperative funding agreement requires nimble and flexible programming (Hello, developmental evaluation!); and constituents in multiple institutions have different ways of communicating and varied reporting needs.

Lesson Learned: When I first began working with MFLN, I drew heavily on my background in qualitative methods, and all of my mixed methods reports took on a narrative form. However, the reports weren’t getting read. With competitive funding forever at stake in an era of sequestration, this had to change.

Enter data visualization. At AEA 2014, I took a two-day data viz workshop with Stephanie Evergreen. It was invaluable! My reports are still works in progress, but I know now they are being read. How? Folks are actually contacting me with questions! My reports are getting circulated at DoD, which has meant increased awareness of MFLN and a lot of kudos for our work. (It doesn’t hurt come budget time, either.) PIs and staff are utilizing the reports to discuss their progress against dynamic plans of work while focusing on the moving target of program innovation.

Hot tip: CES just celebrated its 100th birthday last year, but make sure your reports aren’t dinosaurs! Your reports—your efforts!—need to be seen and heard to be actionable. I like to think of CES as power to the people. If you agree with me, then give data viz a try to get your points across and support CES in making a difference in counties across the nation.

Hot tip: Data visualization isn’t all about Excel. Arrange key verbal points on a page with clean, clear data. Pull out a thread from a data story and expand it in a text box, or pick up qualitatively where your quantitative story said its piece.

Hot tip: Font and color matter. Use your organization’s visual identities in your reports to let readers know that your report concerns them and their work.
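In matplotlib, for instance, brand fonts and colors can be set once and reused across every chart in a report. A short sketch follows; the hex codes and figures are made up and stand in for your organization’s actual palette:

```python
import matplotlib.pyplot as plt

# Illustrative brand settings -- substitute your organization's actual
# visual-identity colors and fonts.
BRAND_BLUE, BRAND_GRAY = "#1B365D", "#63666A"
plt.rcParams.update({"font.family": "serif", "axes.edgecolor": BRAND_GRAY,
                     "axes.labelcolor": BRAND_GRAY, "text.color": BRAND_GRAY})

fig, ax = plt.subplots(figsize=(5, 3))
ax.bar(["2013", "2014", "2015"], [62, 71, 78], color=BRAND_BLUE)  # made-up data
ax.set_title("Webinar attendance by year", loc="left")
# Declutter, per common data-viz guidance: drop the box around the plot.
for side in ("top", "right"):
    ax.spines[side].set_visible(False)
fig.savefig("attendance.png", dpi=200, bbox_inches="tight")
```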

Rad resource: Check out AEA’s offerings on data visualization, including workshops, coffee breaks, and of course, the annual meeting data viz sessions. They really are amazing!

Rad resource: Stephanie’s workshops are a must, but so is her book. Check them both out!


Salutations from the Land of the Midnight Sun. My name is Alda Norris. I am an evaluation specialist for the University of Alaska Fairbanks Cooperative Extension Service and webmaster for the Alaska Evaluation Network.

There is a lot of activity packed into a single word when you say “evaluation” or “extension.” Have you ever had someone stare at you blankly when you tell them your job title? My background is in the study of interpersonal communication, and I believe developing skills in providing effective comparisons will boost our ability to explain “what we do” to others.

Hot Tip: A three-step pattern I learned from speech class can be very helpful.

  1. Define the term.
  2. Give examples of what it is.
  3. Give examples of what it is not.

Also, your audience will gain a deeper understanding if the examples you use are surprising. Here’s one from our state sport: Many people hear the term “sled dog” and think of a big fluffy Siberian Husky. However, many purebred Siberians are show dogs not used for mushing. Sled dogs are more commonly of a mixed heritage known as Alaskan Husky, and some are crossed with other breeds like Greyhound or Pointer!

Lesson Learned: Clients may make demands that seem unreasonable because they misunderstand the scope of your expertise or duties. Even worse, they may not seek you out at all because they don’t see a link between your title and what they need. If you’ve ever had someone think evaluation is “just handing out a survey” or extension is “just agriculture stuff” then you know what I mean! Take the time to do some awareness-raising with your target audience.

Hot Tip: Strip away the professional jargon and think about what words the public would use to describe you. Make sure those terms are included on your web page so that search engines will associate you with them. If you haven’t already, add an “About” or “FAQs” page that addresses what you do (and don’t) have to offer.

Rad Resources: Books like Eva the Evaluator are great for providing examples and comparisons of what jobs like “evaluator” entail. Maybe someone will write an Ali the Extension Agent book someday! Also, search the AEA365 archives for related discussions on the difference between evaluation and research, and how to use metaphors to extend understanding.


 
