
Implementation/Process Evaluation: Strategies to Identify Potential Areas of Focus by Tamara Young

I’m Tamara Young, an associate professor in Educational Evaluation and Policy Analysis at North Carolina State University. Today, I’m sharing a heuristic I use with students to help them generate a comprehensive list of potential areas of focus for an implementation evaluation (of course, with the understanding that stakeholders, preferably primary intended users, ultimately select the focus).

Hot Tip:

There are five approaches that can help identify potential implementation evaluation focus areas. Below, I briefly describe each approach and provide examples of focus areas that stem from it. Certainly, there is some overlap among them.

1. Implementation Syntheses or Frameworks

Factors that influence the implementation process, as described in several implementation syntheses and frameworks such as the Quality Implementation Framework (QIF), are potential foci.

Example: QIF’s Phase One: Initial Considerations Regarding the Host Setting

  • What types of assessments (e.g., needs, resources, fit, and readiness) were employed prior to implementation?
  • How was buy-in obtained from critical stakeholders?
  • What actions were undertaken to foster a supportive climate?

2. Bolman and Deal’s Four Frames

The Four Frames Model describes organizations from four perspectives: structural, human resource, political, and symbolic. Gallos suggests that each frame can be used to develop a checklist for gathering data, for example:

a. Structural – regulations and operations

Examples: rules, regulations, policies, and vertical and horizontal coordinating mechanisms

  • What policies facilitate or impede implementation?
  • Are there incentives in place to support implementation?

b. Human Resource – the fit between individual and organizational needs

Examples: needs, skills, relationships, morale, motivation, and training

  • What types of training are supporting implementation?
  • Do staff feel motivated to implement the innovation?

c. Political – power and conflict

Examples: influence, coalitions, stakeholders, interests, and resource allocation

  • Are there coalitions supporting or opposing the implementation of the innovation?
  • Does implementing the innovation intentionally or unintentionally alter the allocation of resources?

d. Symbolic – ensuring people have a sense of meaning

Examples: values, stories, culture, myths, and symbols

  • Is the innovation aligned with the organization’s values?
  • What symbols are used to facilitate implementation?

3. The Program’s Logic Model

After explicating the logic model for a program, I generate questions from the items specified in the resources/inputs, activities, and outputs components.

4. Logic Model Components

I keep a list of common questions associated with the first three components of logic models that can be applied to different programs, for example (a small sketch of how such a question bank might be reused follows the Outputs examples below):

Resources/Inputs

  • What is the leader’s role in facilitating implementation?
  • What is the nature of coordinated activity with community partners or other agencies?

Activities

  • What are the actual services being provided?
  • What activities are used to recruit and retain participants?

Outputs

  • What is the quantity of services or products delivered?
  • What is the participation rate of different subpopulations, including the neediest groups (chronic, severe, or highest risk)?
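
To make the idea of a reusable question bank concrete, here is a minimal sketch in Python. It only illustrates how such a list might be stored and applied to different programs; the data structure, the candidate_focus_areas function, and the example program name are hypothetical choices of mine, not part of any published logic modeling guidance.

    # A reusable question bank keyed to the first three logic model components.
    # The questions are the examples above; the program details are hypothetical.
    QUESTION_BANK = {
        "Resources/Inputs": [
            "What is the leader's role in facilitating implementation?",
            "What is the nature of coordinated activity with community partners or other agencies?",
        ],
        "Activities": [
            "What are the actual services being provided?",
            "What activities are used to recruit and retain participants?",
        ],
        "Outputs": [
            "What is the quantity of services or products delivered?",
            "What is the participation rate of different subpopulations?",
        ],
    }

    def candidate_focus_areas(program_name, components=QUESTION_BANK):
        """Return (component, question) pairs as a starting list to discuss with stakeholders."""
        return [
            (component, f"{program_name}: {question}")
            for component, questions in components.items()
            for question in questions
        ]

    # Example: generate a starting list for a hypothetical after-school tutoring program.
    for component, question in candidate_focus_areas("After-School Tutoring"):
        print(f"[{component}] {question}")

The point of the sketch is simply that the bank is written once and reused; stakeholders then prune and prioritize the generated list for the program at hand.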

5. Fidelity of Implementation

Saunders, Evans, and Joshi’s how-to guide explains how fidelity of implementation (FOI) can be used to evaluate implementation. Here are a few standard FOI questions:

Fidelity

  • To what extent does each program activity adhere to the intended program design?

Dosage/Exposure

  • What is the quantity and strength of services delivered or products produced?

Coverage/Reach

  • To what extent are the participants of the program the intended target population?

Service Quality/Delivery

  • To what extent does service delivery reflect the underlying philosophy of the program?
  • Is service quality undermining or supporting the achievement of outcomes?
  • Are participants satisfied with the services or products?

Participant responsiveness

  • To what extent are participants engaged with program activities?

Adaptation

  • What changes were made to the program during implementation?
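
Several of these FOI dimensions, particularly dosage/exposure, coverage/reach, and participant responsiveness, reduce to simple counts and rates once monitoring data are collected. The sketch below shows one hypothetical way to compute dosage and reach; the record layout, field names, and the eligible-population figure are illustrative assumptions of mine, not prescriptions from Saunders, Evans, and Joshi’s guide.

    # Hypothetical dosage and coverage/reach calculations from simple monitoring records.
    from statistics import mean

    # One record per participant: sessions attended out of sessions offered,
    # plus whether the participant belongs to the intended target group.
    records = [
        {"participant": "A", "attended": 10, "offered": 12, "target_group": True},
        {"participant": "B", "attended": 4,  "offered": 12, "target_group": True},
        {"participant": "C", "attended": 12, "offered": 12, "target_group": False},
    ]

    # Assumed size of the intended target population (illustrative only).
    eligible_in_target_population = 40

    # Dosage/exposure: average share of offered sessions that participants actually received.
    dosage = mean(r["attended"] / r["offered"] for r in records)

    # Coverage/reach: share of the intended target population that actually participated.
    reach = sum(r["target_group"] for r in records) / eligible_in_target_population

    print(f"Average dosage: {dosage:.0%}")
    print(f"Reach of intended target population: {reach:.0%}")

In practice the same records can also feed participant responsiveness measures (for example, engagement ratings per session), so one monitoring dataset can serve several FOI questions at once.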

Rad Resource:

Implementation Monitoring and Process Evaluation is a comprehensive guide on implementation evaluation.


2 thoughts on “Implementation/Process Evaluation: Strategies to Identify Potential Areas of Focus by Tamara Young”

  1. Tamara,

    I wanted to thank you for your post on implementation evaluation. I am an elementary school teacher currently working through a Professional Master of Education program at Queen’s University, with the focus of my current course being program evaluation and assessment. Working within the parameters of the current educational model, the need to actively engage in evaluations as they pertain to stakeholders (in my case, administration and students) and to dynamically adjust approaches so that changes are applied effectively is an area of my practice that I am constantly trying to revise.

    Having analyzed a variety of resources in my program, I was drawn to your post and specifically to the section in which you mention Bolman and Deal’s Four Frames. When conducting my own program evaluations, I have found that considering a variety of factors is an encompassing and vital part of selecting which evaluation theories to put into practice in order to properly target and obtain viable, validated feedback for the assessment objective. The Four Frames Model you present, describing organizations from the four perspectives of structural, human resource, political, and symbolic, provides an excellent framework on which to weigh and develop my assessment approach. The human resource frame, which addresses the fit between individual and organizational needs, is an imperative component in the educational sector, as it straddles the overlapping spheres of teacher, student, and administrative participation. When this is used in parallel with the fidelity of implementation approach you discussed, I feel that program approaches can be validated and opportunities for meaningful change to programs that have become outdated or stagnant can be illuminated, leading to necessary revisions.

    Given your background on this topic, I am curious what approaches you use to measure participant responsiveness and engagement in program activities. Having reviewed the Quality Implementation Framework (QIF) that you mentioned in your post, I found it to be a beneficial resource. I am also curious whether there are specific questions you have used to measure engagement, as this is an aspect of my own evaluation and assessment practice for which I have sought additional resources. I realize that the specific questions you use will shift depending on the program and topics being examined; however, I wondered whether you have some more generalized entry questions that you rely on in your own practice.

    Thank you very much for your post; it provided some insightful perspectives on implementation evaluation and illuminated resources that were beneficial to my own progression through evaluation practice. I look forward to your response and to continued discourse on this subject as I work through my Master’s program.

    With kind regards,

    John G. Chornell, BA. BEd. PME Graduate Student

  2. Mikkel Møldrup-Lakjer

    For implementation evaluation, I use Pawson’s VICTORE framework (Pawson 2013, which elaborates on earlier publications), where context (the C of VICTORE) is understood as layered principally on four levels: the individual, the interpersonal, the institutional, and the infrastructural. The implementing organisation should work to make forces on these four levels support the workings of the programme. I use the framework to identify which forces are important for the programme to work.
