AEA365 | A Tip-a-Day by and for Evaluators


My name is Lisa R. Holliday, and I am a Data Architect with The Evaluation Group. One aspect of my job is to work with evaluation teams to create data collection tools and assess the quality of data received. I use data dictionaries as a way to establish quality expectations.

Hot Tip 1: Create Data Dictionaries for All Projects

Data dictionaries are used with databases to maintain records about the data being stored, including type, format, and use. Data dictionaries can also be used for evaluations.

I include the following areas in evaluation data dictionaries:

  1. Data Name
  2. Data Description
  3. Data Type
  4. Data Format
  5. Precision
  6. Acceptable Values
  7. Data Collection Cycle
  8. Data Collector Responsible

Hot Tip 2: Define What’s Acceptable

Be as specific as possible about what is acceptable. What is the data type (numeric, text, date)? How should data be formatted? For example, should dates be entered as mm/dd/yy or dd/mm/yyyy? For numeric data, how many decimal places do you want? If collecting names, what are acceptable values (first and last name, or first initial and last name)? Also, are there any data that are required, such as identification numbers?

Once you know what you want your data to look like, determine how much non-conformance you can tolerate. For example, you might establish a threshold that 85% of data submitted must be formatted correctly, and 100% of required data must be submitted.
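One way to put a data dictionary to work is to encode it as a machine-readable spec and validate incoming records against it. Below is a minimal sketch in Python; the field names and rules are hypothetical, not from a real project:

```python
import re

# Hypothetical data dictionary: each entry mirrors the areas listed above
# (name is the key; type, format, acceptable values, and required flag follow).
DICTIONARY = {
    "student_id": {"type": "text", "pattern": r"^\d{6}$", "required": True},
    "test_score": {"type": "numeric", "min": 0, "max": 100, "required": False},
    "test_date":  {"type": "date", "pattern": r"^\d{2}/\d{2}/\d{4}$", "required": False},
}

def validate(record):
    """Return a list of rule violations for one submitted record."""
    errors = []
    for field, rule in DICTIONARY.items():
        value = record.get(field)
        if value in (None, ""):
            if rule.get("required"):
                errors.append(f"{field}: required but missing")
            continue
        if "pattern" in rule and not re.match(rule["pattern"], str(value)):
            errors.append(f"{field}: bad format {value!r}")
        if rule["type"] == "numeric":
            try:
                num = float(value)
            except ValueError:
                errors.append(f"{field}: not numeric")
                continue
            if not rule["min"] <= num <= rule["max"]:
                errors.append(f"{field}: {num} outside acceptable values")
    return errors

print(validate({"student_id": "123456", "test_score": "87.5", "test_date": "03/15/2024"}))  # []
print(validate({"test_score": "142"}))  # missing required ID, score out of range
```

A check like this can run as data arrive, so violations surface while the data collector can still correct them.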

Hot Tip 3: Enforce Data Quality Expectations

Use your data dictionary to create your data collection tools. Enforce data type, format, precision, and acceptable values where possible. Also, provide data collectors with data-entry instructions and training.

Hot Tip 4: Profile your Data Regularly

At least twice a year, profile your data: how well do the data you collect align with the rules you specified? If you find that the data you are receiving aren’t meeting your expectations, consider modifications to the data collection tools you are using or the use of another collection method. Also, provide follow-up training to data collectors.
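Profiling along these lines can be sketched in a few lines of Python. The field rules and records here are invented for illustration; the point is the shape of the check: per-field conformance rates compared against a threshold like the 85% mentioned above:

```python
import re

# Hypothetical per-field conformance rules (each returns True if a value conforms)
RULES = {
    "date": lambda v: bool(re.match(r"^\d{2}/\d{2}/\d{4}$", v)),
    "score": lambda v: v.replace(".", "", 1).isdigit(),
}
THRESHOLD = 0.85  # e.g., 85% of submitted data must be formatted correctly

def profile(records):
    """For each field, compute the share of conforming values."""
    report = {}
    for field, conforms in RULES.items():
        values = [r[field] for r in records if field in r]
        rate = sum(conforms(v) for v in values) / len(values)
        report[field] = {"conformance": rate, "meets_threshold": rate >= THRESHOLD}
    return report

records = [
    {"date": "03/15/2024", "score": "87.5"},
    {"date": "2024-03-15", "score": "91"},   # date in the wrong format
    {"date": "04/01/2024", "score": "n/a"},  # non-numeric score
    {"date": "05/20/2024", "score": "78"},
]
print(profile(records))  # both fields at 75% conformance, below the threshold
```

A report like this makes it easy to see which fields need a revised collection tool or follow-up training.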

Rad Resource: DataCleaner

This is a free data profiling tool that works with a variety of data sources, including MS Access and MS Excel, as well as numerous relational database management systems.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi, I’m Harlan Luxenberg from Professional Data Analysts, Inc., a public health evaluation firm in Minneapolis, and I’d like to share some thoughts about situations where databases may be more useful than Microsoft Excel. Excel is great for quickly crunching data and managing small datasets; however, using Excel in the wrong situations can actually make your data management tasks trickier.

Below are problems that our colleagues have encountered in Excel and reasons why we think that databases would be better solutions in these cases.

Hot Tip: Know in which situations to use a database.


Rad Resource: Anyone can learn databases!

While the thought of learning databases can sound intimidating, anyone can learn them, and there are plenty of resources to help you get going! There are numerous blogs and websites, and even free online courses on platforms such as Coursera.

Rad Resource: Start with Microsoft Access

A good database to start with is Microsoft Access (http://office.microsoft.com/en-us/access/), which is part of the Microsoft Office Professional suite and may already be installed on your computer. Microsoft Access lets users build reports, create data collection forms, and design tables visually, and it integrates seamlessly with Excel (which you can still use to create beautiful charts).



Hello, we are Rashon Lane and Alberta Mirambeau from the Centers for Disease Control and Prevention, and Steve Sullivan from Cloudburst Consulting. We work together on an evaluation assessing the uptake, use, and impact of the Institute of Medicine’s (IOM) national public health hypertension recommendations. If you’ve ever wondered how to assess whether public health programs are shifting their priorities to address evidence-based recommendations, you might consider a methodology we used called alignment scoring analysis. In short, an alignment scoring analysis is a type of content analysis in which narrative descriptions of organizational activities are analyzed to determine whether they support specific goals or strategies. We conducted a pre-post alignment scoring analysis of state health department work plans to objectively determine whether their project portfolios align with nationally recommended priorities.

Lessons Learned:

  • Conduct pre-post content analysis. During our content analysis we coded state work plan activities as aligned, misaligned, or neutral relative to the IOM recommendations. As a result, we were able to share with program stakeholders that many state health departments adjusted their prevention priorities within 18 months to reflect national priorities. If you are working on an evaluation to assess changes in priorities over time, you might consider a similar pre-post content analysis to determine the degree to which public health programs align with priorities and how those priorities change over time.
  • Use stringent criteria. Define stringent criteria for coding activities as aligned, misaligned, or neutral; the more precise the criteria, the more accurate and consistent the coding.
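To make the pre-post comparison concrete, here is a minimal sketch (in Python, with invented codes) of tallying alignment shares before and after:

```python
from collections import Counter

# Hypothetical coded work-plan activities: each activity is coded as
# "aligned", "misaligned", or "neutral" relative to the recommendations.
pre_codes  = ["aligned", "neutral", "misaligned", "neutral", "misaligned", "aligned"]
post_codes = ["aligned", "aligned", "neutral", "aligned", "aligned", "misaligned"]

def alignment_share(codes):
    """Proportion of activities coded as aligned."""
    counts = Counter(codes)
    return counts["aligned"] / len(codes)

pre, post = alignment_share(pre_codes), alignment_share(post_codes)
print(f"pre: {pre:.0%}, post: {post:.0%}, change: {post - pre:+.0%}")
```

In a real analysis you would compute these shares per state health department, which lets you report how many departments shifted toward the recommended priorities.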

Hot Tips:

  • Use a database. Use a database to facilitate the review of documents being analyzed and to speed reporting.  If you plan to use multiple reviewers, be sure to keep track of which reviewer coded a document so you can check inter-rater reliability and improve training on your coding protocol.
  • Use alignment scoring. Use alignment scoring analysis results to recommend to program stakeholders how they might shift priorities that are NOT aligned with national recommendations proven to be effective.
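If you track which reviewer coded each document, you can compute inter-rater agreement directly. Below is a small sketch of Cohen’s kappa for two reviewers; the codes are invented for illustration:

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two reviewers' codes on the same set of documents."""
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Expected chance agreement, from each reviewer's marginal code frequencies
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical double-coded documents
reviewer_1 = ["aligned", "neutral", "aligned", "misaligned", "aligned", "neutral"]
reviewer_2 = ["aligned", "neutral", "misaligned", "misaligned", "aligned", "neutral"]
print(round(cohens_kappa(reviewer_1, reviewer_2), 2))  # 0.75
```

A low kappa is a signal to refine the coding protocol or retrain reviewers before relying on the scores.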



 


I am Phil Halbrook and I would like to share with you information about TRASI – a great tool for evaluators seeking to measure social impact. The issue of measuring social impact is a hot-button one right now, and the TRASI database and website are extremely useful.

Rad Resource – TRASI: TRASI stands for “Tools and Resources for Assessing Social Impact,” and it comes from the Foundation Center. TRASI has over 150 indexed tools, each with a description of what it does and what entity developed it. Full disclosure – not all of the resources there are tools per se; they include best-practice guides, methodological guides, and what we might more traditionally consider to be tools. TRASI also includes a blog and discussion groups where you can discuss the resources available there or broader issues related to assessing social impact.

Rad Resources – Examples from the TRASI Database: So, what’s there?

Campaign Champions Data Collection Tool from the Annie E. Casey Foundation: “This is a tool that measures strengthened base of public support. This form tracks and measures the number of “champions” (people who take actions to advance the public will) engaged and the actions these champions undertake as a part of the Born Learning campaign.”

Criteria for Philanthropy at Its Best: “This best practice provides guidelines on values…, effectiveness …, ethics …, and commitment … criteria for evaluating philanthropy.”

Interrupted Time Series Designs [Guide]: “This is an observational method that measures impact of education programs in cases where data before the implementation of the program is available. This method compares the data from before the implementation to the same data afterwards to tease out a trend in achievement.”

And 100+ others.

Hot Tip – TRASI on Twitter: You can follow TRASI on Twitter at @FCAtlanta. This past week, on November 14th, they held their first-ever tweet chat on social impact assessment using the hashtag #socimp. If you are on Twitter, the conversation is still going – just search for the hashtag. Not on Twitter? No problem. This link http://hashtags.org/socimp will give you access to a record of the discussion to date with the #socimp hashtag – check it out for insights and resources.



My name is Stuart Henderson. I am the Associate Director of Evaluation for the Clinical and Translational Science Center at the University of California, Davis. We recently used a combination of screen recording software and think aloud methodology to conduct an evaluation of an innovative software program. I’d like to share how screen recording software and think aloud methodology might be useful in meeting other evaluation goals.

Screen recording software, as the name implies, is a program that records everything on a user’s computer screen. An example of this software is TechSmith’s Camtasia Studio, http://www.techsmith.com/camtasia/, but this is just one of many screen recorders on the market.

Think aloud methodology, also referred to as verbal protocol analysis, is a technique where you have someone perform an activity or solve a problem and simultaneously verbally express their thoughts, feelings, and reactions as they are occurring. The theory is that by having subjects think aloud as they are doing something, you can better understand their cognitive processes and logic as they unfold. It is a common research technique in technology usability research as well as in some educational research.

Hot Tip: Many evaluators are turning to web-based surveys for their data collection needs, yet how respondents are interpreting the questions or the organization of our web surveys may be unclear. Conducting think alouds with screen recording software is a way to do cognitive interviewing with survey takers, helping us understand how people interpret the questions and choose their answers. These techniques also provide the opportunity to identify non-cognitive responses, such as when your survey takers are frustrated or prideful, reactions that would be very difficult to capture through traditional methods.

Hot Tip: For evaluators who are creating databases or other programs for stakeholders and clients, think alouds and screen recordings might be a useful way to fine-tune these programs. We think we know how people are using the program, but until we watch someone use it and describe their reaction to it, we are getting only part of the picture. Watching people use programs also allows us to identify active learning, for example, how people improve at using a program and begin to develop “workarounds” to get the program to do what they want.

Hot Tip: Screen recordings can also be used to share evaluation findings with stakeholders who are not local. With screen recording technology, it is easy to record your voice over PowerPoint slides or video and share the presentation with others to listen at their convenience.

Rad Resource: Slides from our recent AEA talk on this topic can be found in the AEA eLibrary. http://comm.eval.org/EVAL/model/Resources/ViewDocument/Default.aspx?DocumentKey=cd4acecd-ed34-4e49-9ec0-e185777e4e93



Hi, my name is Debbie Cohen. I am the Director of Evaluation at Community Mental Health Center, Inc., a community mental health center that serves five rural counties in Southeastern Indiana. One of my roles in the agency is to pull data from our Electronic Health Record (EHR) and to use it to evaluate programming and to foster continuous quality improvement. Here are my tips for internal evaluators and other agency staff members out there who are planning to implement an EHR and use the data to evaluate programs.

Hot Tip #1: Learn SQL or Find Someone Who Knows SQL

If you plan to pull data from your EHR, you will need a way to tell the database which data to pull. For many EHRs, that requires the use of SQL (Structured Query Language). Some EHRs come with “prepackaged” reports, but if you want the ability to access any of the data in the system, SQL is a necessity. SQL is a database computer language designed for managing data in relational database management systems. You will not learn SQL overnight, but there are various classes and books available; I recommend SQL For Dummies (http://www.amazon.com/SQL-Dummies-Allen-G-Taylor/dp/0470557419). It has taken me a year to feel confident in my SQL skills, but now our agency is becoming much more data-informed.
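To give a sense of what such a query looks like, here is a small sketch using Python’s built-in sqlite3 module. The table and column names are invented stand-ins; a real EHR schema will differ:

```python
import sqlite3

# Illustrative stand-in for an EHR services table -- real schemas will differ.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE services (
    client_id INTEGER, service_type TEXT, service_date TEXT)""")
con.executemany(
    "INSERT INTO services VALUES (?, ?, ?)",
    [(1, "therapy", "2024-01-10"), (1, "therapy", "2024-02-12"),
     (2, "case_mgmt", "2024-01-15"), (2, "therapy", "2024-03-01")],
)

# The kind of query this tip has in mind: count services per client by type
rows = con.execute("""
    SELECT client_id, service_type, COUNT(*) AS n_services
    FROM services
    GROUP BY client_id, service_type
    ORDER BY client_id, service_type
""").fetchall()
for row in rows:
    print(row)
```

The SELECT / GROUP BY pattern shown here is the workhorse for turning raw service records into the counts and rates a program evaluation needs.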

Hot Tip #2: Learn about your EHR Database

In order to use the data within the EHR, you have to know what data are being entered. Look at the data entry screens and obtain a copy of the database schema from the EHR vendor. Once you understand what data are being entered and what are not, you will have a better idea of what sort of information you can use to inform practice. Not everything kept in an EHR will “live” in the database: for example, you may be able to retrieve service data for a client, but you will not be able to retrieve the narrative from the progress notes for each service.

Hot Tip #3: Find Ways to Communicate Information Back to Staff

The EHR our agency uses offers a way for me to publish reports to any staff member’s “portal.” Now our front-line staff can monitor their own performance, and management staff can examine their programs from a programming, performance, or financial perspective. Our hope is to transform our agency into a data-driven agency that uses data to inform practice at all levels.


