AEA365 | A Tip-a-Day by and for Evaluators

TAG | Monitoring

My name is Pam Larson Nippolt and I am a University of Minnesota Extension Evaluation and Research Specialist working with a team of program evaluators in 4-H youth development programs.

Lesson Learned: Monitoring enrollment data is a data activity that often falls under the umbrella of program management. It enables program leaders to pay attention to aspects of program implementation via inputs or outputs. What is monitored can be quite distinct from what is evaluated, yet it can still inform the focus of an evaluation or the measurement of an outcome.

When planning with program teams, I use the example that monitoring is similar to setting a metronome while playing piano: it keeps a steady beat going to help the pianist stay in tempo. Evaluation, on the other hand, is the assessment the pianist and audience make about the music created.

Lesson Learned: Collecting, maintaining, and analyzing data for monitoring purposes are an investment of time and resources that can pay dividends for evaluation in the long run!

Enrollment databases, used in many large youth development programs, are excellent data sources for program monitoring, but they are often overlooked. For example, in 4-H, program data (shown below) revealed that the region containing the largest metropolitan area (the Central Region) enrolled more youth from farms and small towns than had been believed.

[Figure: 4-H enrollment data by region and residence type]
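
For teams that can export their enrollment data, a table like the one above can be produced with a few lines of code. The sketch below is a minimal illustration in Python with pandas; the file name and the "region" and "residence" columns are hypothetical stand-ins for whatever your enrollment system actually exports.

```python
import pandas as pd

# Hypothetical export from the enrollment database; adapt the file name
# and column names ("region", "residence") to your own system.
enrollment = pd.read_csv("enrollment_export.csv")

# Cross-tabulate enrolled youth by region and residence type.
by_residence = pd.crosstab(enrollment["region"], enrollment["residence"])
print(by_residence)

# Each region's mix expressed as percentages makes comparison easier.
shares = (by_residence.div(by_residence.sum(axis=1), axis=0) * 100).round(1)
print(shares)
```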

This finding seemed counterintuitive and led to further investigation of the data. We discovered that many youth living in the city and participating in the program were not in the enrollment database because of a particular enrollment practice.

Monitoring the enrollment data raised awareness of the need to make the enrollment process more accessible to all youth and families. Program staff might not have identified the scale of this discrepancy without this type of monitoring.

Hot Tip: Get started by “whetting the appetite” of your program partners for data use with available data about the program and participants. Build appealing and visually engaging graphics to make using the data rewarding for staff who don’t typically attend to data. Ask questions and listen to how they make sense of the data. This practice will reveal what can be monitored “right now” for team learning.

Rad Resource: Consider investing in making your enrollment database more usable and accessible to staff by adding trend and comparison features. Interfaces can be designed for your enrollment software that provide a dashboard with menus for tracking changes across program years and making geographic comparisons. Think like an interface designer to create tools and reports that will help program staff love their data!
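
Short of a full dashboard, a trend view can be prototyped from the same kind of export. The sketch below again assumes a hypothetical CSV with "program_year" and "region" columns, and plots enrollment across program years by region with pandas and matplotlib.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Same hypothetical export as before, now with a "program_year" column.
enrollment = pd.read_csv("enrollment_export.csv")

# Count enrollments per program year within each region.
trend = enrollment.groupby(["program_year", "region"]).size().unstack("region")

# One line per region; a dashboard would put views like this behind menus.
trend.plot(marker="o", title="Enrollment by region across program years")
plt.ylabel("Enrolled youth")
plt.tight_layout()
plt.show()
```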

The American Evaluation Association is celebrating Extension Education Evaluation (EEE) TIG Week with our colleagues in the EEE AEA Topical Interest Group. The contributions all this week to aea365 come from our EEE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

·

My name is Rhonda Schlangen, and I’m an independent evaluation consultant specializing in advocacy and campaign evaluation.

Non-governmental service delivery organizations use advocacy as a lever for greater impact, but they often have a tough time finding useful internal evaluation systems to assess their combined service delivery and advocacy work. Many service providers fall back on established evaluation processes for service delivery M&E (logic models, quantitative client counts) with disappointing results. Advocacy evaluation is a dynamic and innovative corner of the evaluation world, and it offers strategies that can improve M&E of both service delivery and advocacy.

Lesson Learned:

Strategies that work well for advocacy monitoring and evaluation position evaluation as a driver of effectiveness, capitalizing on the critical thinking skills of advocates and evaluators. Repositioning evaluation as a tool for knowledge generation is particularly critical for advocacy. Making evaluation more accessible and relevant to those doing the work will likely benefit service delivery as well.

Hot Tips:

  • PLANNING:  Link planning for advocacy and service delivery, but plan for different timeframes of change.  Advocacy will likely be cyclical and ongoing, with service delivery on a more linear path.
  • IMPLEMENTING:  Monitoring implementation with reflective processes focusing on progress, rather than outputs, provides a rich opportunity for joint advocacy/services review that is mutually informing.  Advocacy M&E requires consistent and regular review of, and response to, information.  Service delivery M&E could benefit from those processes as well.
  • EVALUATING:  Add innovative methods to the evaluation mix.  Newer approaches like Outcome Mapping or Outcome Harvesting place social change as a broad outcome to which services and advocacy are contributing strategies.


The American Evaluation Association is celebrating Advocacy and Policy Change (APC) TIG Week with our colleagues in the APC Topical Interest Group. The contributions all this week to aea365 come from our APC TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

·

Hello! My name is Sudharshan Seshadri and I am currently pursuing my master’s degree in Professional Studies, specializing in Humanitarian Services Administration.

I have come to realize that data is one of the most promising entry points for understanding the activity of evaluation. To meet our data needs, I believe that as evaluators we should be deliberate about exploring the resources available in the ubiquitous information around us.

I would like to share a few resources that are promising for beginners in the conduct of evaluation. For ease of use, I have classified them under three headings:

Rad Resources for Program Planning

1. The Ohio State University Extension Evaluation Bulletin – a systematic approach to designing and planning program evaluations. (http://ohioline.osu.edu/b868/)

2. Program Planning – Program Development and Evaluation (PD&E), UWEX. (http://www.uwex.edu/ces/pdande/planning/index.html)

3. Planning a Program Evaluation: Worksheet, Cooperative Extension. (http://learningstore.uwex.edu/assets/pdfs/G3658-1W.PDF)

4. Evaluation Design Checklist, Daniel L. Stufflebeam, The Evaluation Center, Western Michigan University. (http://www.wmich.edu/evalctr/checklists/)

5. Key Evaluation Checklist (KEC), Michael Scriven. (https://communities.usaidallnet.gov/fa/system/files/Key+Evaluation+Checklist.pdf)

Rad Resources for Program Implementation, Monitoring, and Delivery

1. W.K. Kellogg Foundation Evaluation Handbook. (http://www.wkkf.org/knowledge-center/resources/2010/W-K-Kellogg-Foundation-Evaluation-Handbook.aspx)

2. Program Manager’s Planning, Monitoring and Evaluation Toolkit, Division for Oversight Services, Tool Number 5. (http://www.unfpa.org/monitoring/toolkit.htm)

3. Evaluation Models: Viewpoints on Educational and Human Services Evaluation, Second Edition, edited by Daniel L. Stufflebeam, George F. Madaus, and Thomas Kellaghan. (http://www.unssc.org/web/programmes/LS/unep-unssc-precourse-material/7_eVALUATIONl%20Models.pdf)

Rad Resources for Program Utilization

1. Utilization-Focused Evaluation, Michael Q. Patton, Fourth Edition, Sage Publications.

2. Independent Evaluation Group (IEG), The World Bank Group – improving development results through excellence in evaluation. (http://www.worldbank.org/oed/)

3. My M&E – a platform for sharing knowledge and practice amongst M&E practitioners worldwide. (www.mymande.org)

4. EvaluATE – an evaluation center operated by Western Michigan University, specializing in National Science Foundation (NSF) evaluations. (www.evalu-ate.org)

5. United Kingdom Evaluation Society (UKES), Resources/Evaluation Glossary. (http://www.evaluation.org.uk/resources/glossary.aspx)

Lessons Learned: Always take the initiative in searching out data. In the information age, a plethora of evaluation services is in operation all over the world, and data acts as a gateway to the useful and significant research practices carried out in the evaluation profession. I regard benchmarking as an outcome of consistent resource searching and use.

Hot Tip #1: How long can you stare at the Google search screen waiting for it to meet your data needs? Expand your search across a multitude of web resources.

Hot Tip #2: Use networking to get quick responses to your queries; it adds a new dimension to your learning and practice. For example, I created a separate Facebook page named “The Evaluation Library” for the books, references, and tools that I use frequently in my evaluation work.

Hot Tip #3: Easy access to data sharpens your interest in digging deeper. Stack or list all your resources on a platform that you visit frequently.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · ·

I’m Susan Kistler, AEA’s Executive Director, and I contribute each Saturday’s post. Since we started aea365 on January 1, I’ve received questions about how we monitor readership.

Rad Resource: Feedburner is a free tool from Google through which you can ‘feed’ the content from your blog and track subscribers. It takes two minutes to set up, and blog readers will be able to subscribe via email or RSS. Feedburner tracks subscribers and how many people view or take action each day, either in the aggregate or by individual post, plotting the two as separate trend lines. From Feedburner, we learned that we gain 3-4 new subscribers on average each day.

Rad Resource: Feedburner doesn’t tell you about website traffic – people interacting with the blog on the site itself. We use Google Analytics (GA) to learn about site traffic. GA takes a basic understanding of website management to install, but it is relatively quick to do for most sites. GA tells us how people engage with the blog site as a whole, or we can look at any single page. We can also tell how readers found the site (using what search terms, from what referring sites, etc.), how readers move through the site, and how many pages they tend to view per visit. The blog pageviews report from GA (one of many reports available) shows that site traffic stays relatively constant. It appears that people tend to visit the site and then sign up to receive the content via email or RSS: increases in subscribers seen in Feedburner tend to follow spikes in visits seen in GA. Curious why GA reports 682 pages on the blog when we’re only on day 130 or so? It treats each page that you can call up based on tags as an independent page.
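
If you export the pageviews report to a spreadsheet, spotting those spikes can be automated. Here is a minimal sketch in Python, assuming a hypothetical CSV with "date" and "pageviews" columns (the real GA export format varies by report):

```python
import pandas as pd

# Hypothetical CSV export of daily pageviews; adapt to your report's format.
traffic = pd.read_csv("ga_pageviews.csv", parse_dates=["date"])

# Flag days that run well above the trailing 7-day average -- the spikes
# that tend to precede jumps in Feedburner subscriber counts.
traffic["trailing_avg"] = traffic["pageviews"].rolling(7).mean()
spikes = traffic[traffic["pageviews"] > 1.5 * traffic["trailing_avg"]]
print(spikes[["date", "pageviews"]])
```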

On both Feedburner and GA, we track changes after external events such as a newsletter article that refers people to the blog or mention of the blog during a webinar or on another blog.

Hot Tip: For many links, we use a URL shortener to track clickthroughs, because clicks can be tracked this way whether the link is clicked on the site, used by subscribers via email, or even forwarded to a friend or colleague. I wrote more about clickthrough tracking in a previous post.

Finally, we monitor a number of other things about the blog including the nature and extent of comments and what prompts them, the number of backlinks to the blog from other sites, and the diversity of topics and contributors. I’ll write more about these in a future post, as well as how we’ll be reaching out to readers and members.

· · ·

My name is Susan Kistler and I am the Executive Director of the American Evaluation Association. I contribute each Saturday’s post to the aea365 blog.

Hot Tip: Use a URL shortener to monitor information dissemination. A URL shortener takes a long URL (location information for a website such as “http://eval.org/”) and makes it shorter. Just shortening URLs is useful – it makes the unwieldy manageable and decreases the chances of a URL accidentally breaking across lines and becoming unusable when sent via email or posted online. But the power for monitoring comes from the fact that many URL shorteners have built-in tracking.

Shortened URLs can be used to track click-throughs when items are posted on websites or blogs, shared via social media, or included in emails. While site statistics, such as those provided by Google Analytics, can tell you how many people clicked on a link within a given website, or clicked through to a specific page from outside the site, shortened-URL tracking lets you know how many people clicked on a URL regardless of where it was posted, even as the link is passed from user to user. This matters because it means you can track links to other people’s content, not only your own.

As an example, AEA uses URL shorteners for tracking its headline and resources list. We use Twitter as a content management system (follow aeaweb), so the notices are initially sent out daily via Twitter and most contain a shortened URL for learning more about a headline or resource. The notices also appear in the “News” section of AEA’s LinkedIn Group and on the “Headlines” page on AEA’s website, are shared via a compiled list each Sunday on AEA’s listserv – EVALTALK, and may be subscribed to so as to be received via a weekly email or RSS feed. By using the URL shortener, and tracking use, we gain a better understanding of how the content is being accessed and in what format, and make adjustments accordingly.

How does it work? There are many URL shorteners, but one of the easiest to use is bit.ly. Bit.ly is free and also has the advantage that you can choose the characters for part of the shortened URL.

Here is the quite long URL for a sample course syllabus posted by Gina Weisblat in the AEA eLibrary http://comm.eval.org/EVAL/EVAL/Resources/ViewDocument/Default.aspx?DocumentKey=bd4678d4-4497-47f0-a89c-cbf9ca9987e0. To shorten it, you copy the URL, go to http://bit.ly/, paste the URL into a box, and click “Shorten.”

Bit.ly returns http://bit.ly/dhi5IZ. To make it more user-friendly, once you have shortened a URL, you can also enter characters into the “Custom Name” box to get a more recognizable URL. For instance, I entered “weisblat” and bit.ly provided the following shortened URL: http://bit.ly/weisblat
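
Shortening can also be scripted. The sketch below uses bit.ly’s current v4 REST API, which postdates the interface described here, and assumes you have generated an access token in your bit.ly account settings; the token value shown is a placeholder.

```python
import requests

TOKEN = "YOUR_BITLY_ACCESS_TOKEN"  # placeholder; generate one in your account
HEADERS = {"Authorization": f"Bearer {TOKEN}"}
LONG_URL = ("http://comm.eval.org/EVAL/EVAL/Resources/ViewDocument/"
            "Default.aspx?DocumentKey=bd4678d4-4497-47f0-a89c-cbf9ca9987e0")

# Shorten the long URL.
resp = requests.post("https://api-ssl.bitly.com/v4/shorten",
                     headers=HEADERS, json={"long_url": LONG_URL})
resp.raise_for_status()
short_link = resp.json()["link"]  # e.g. https://bit.ly/...
bitlink_id = resp.json()["id"]    # e.g. bit.ly/...

# Retrieve the click-through count for the shortened link.
clicks = requests.get(
    f"https://api-ssl.bitly.com/v4/bitlinks/{bitlink_id}/clicks/summary",
    headers=HEADERS)
clicks.raise_for_status()
print(short_link, "clicks:", clicks.json()["total_clicks"])
```

Polled on a schedule, the same summary endpoint can feed the day, week, or month click reports described below.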


If you click on either of the shortened links above, I can see the click-through statistics increase by 1. I can check on clicks over the past day, week, or month, and will know if the URL is shared via Twitter or FriendFeed. If you send it to a colleague via email or post it on your own website, bit.ly will still maintain the tracking and click counts.

This contribution is from the aea365 Daily Tips blog, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org.

· · · ·
