AEA365 | A Tip-a-Day by and for Evaluators


My name is Bianca Montrosse-Moorhead, and I co-chair EvalYouth, along with Marie Gervais and Khalil Bitar.  I also serve as a liaison between the American Evaluation Association, EvalYouth, and EvalPartners. This week, aea365 is focusing on EvalYouth.  Today, I will be talking about what EvalYouth is and introducing you to some of our rad resources.

Hot Tips:

  • EvalYouth is rooted in the idea that investing in those new to our field, and in the inclusion of youth stakeholders, is an investment in evaluation itself.
  • EvalYouth, an EvalPartners initiative, has two overarching goals: (i) to promote Young and Emerging Evaluators (YEE) to become competent, experienced and well-networked professionals who contribute to evaluation capacity at national, regional and international levels; and (ii) to promote the inclusion of Youth and Young People (YYP) in evaluations conducted at the national, regional and international levels.
  • You do not need to be a young or emerging evaluator, nor a youth or young person to be a member of EvalYouth. You simply have to be passionate about and committed to investing in the future of evaluation.
  • EvalYouth is committed to reaching evaluators across the globe, and everything we do happens in more than one language.

Rad Resources:

  • Want to know more about EvalYouth? Check out our website.
  • Have questions about EvalYouth? Send us an email (EvalYouth@gmail.com).
  • Want to be among the first to hear about EvalYouth updates, projects, calls, etc.? Follow us on Facebook, Twitter, or LinkedIn.
  • Want to see presentations EvalYouth has given or sponsored? Check out our YouTube channel.

The American Evaluation Association is celebrating EvalYouth week. EvalYouth addresses the need to include youth and young people in evaluation. The contributions all this week to aea365 come from members of EvalYouth. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Evaluators and Love… by Sheila B Robinson

I’m Sheila B Robinson, aea365’s Lead Curator and sometimes Saturday contributor. I love evaluation work and evaluators! I wanted to write about evaluation and love, so I decided to revisit what I wrote around this time a couple of years ago. I figured an easy way to find the post would be to type “love” into the search box. Turns out, 224 aea365 posts include the word love (well, 225, now!). For you data nerds (and who among us isn’t?!) that’s nearly 9% of our 2,557 articles!

Lesson Learned: Many posts include invitations from authors seeking feedback and wanting to connect with other evaluators (e.g., “We’d love to hear from you…”), but others give us insight into what makes evaluators tick.

Through aea365, we have learned that:

  • Beverly Parsons loves “working with the CLIP process…. Communities of Learning, Inquiry, and Practice, informal, dynamic groups of organizational members who learn together about their professional practice.”
  • Laura Peck has “learned to love the counterfactual.”
  • Susan Kistler loves “finding ways to make data understandable and useful.”
  • Susan Eliot claims “Everyone loves a good story.”
  • Carl Brun loves “talking about teaching evaluation.”
  • Matthew von Hendy loves “helping connect people with the information that they need to solve problems or make decisions.”
  • Laura Pryor and Nichole Stewart admit “we both love data.”
  • Bethany Laursen “fell in love with social network analysis (SNA) as a graduate student because SNA gave me words and pictures to describe how I think.”
  • Rita Overton loves “helping programs to improve and having a hand in making the world, or at least my corner of it, just a little bit better.”
  • Nick Fuhrman admits, “Teaching is my passion—I love it!”
  • Corey Newhouse has “loved the ways in which (video) has enriched our process and our findings.”

Of course, data visualization is an object of love among evaluators:

  • Stephanie Evergreen is “in love with data visualization and reporting.”
  • Yuqi Wang loves “figuring out different ways to visualize data.”
  • Sarah von Schrader and Katie Steigerwalt “love data visualization as a powerful way to share information!”
  • Tony Fujs loves “to visualize the data I have in my hands, but I also like to spend time visualizing data that I don’t have: Missing data.”

AEA and the annual conference also receive some evaluator love:

  • Kathleen Tinworth loves “the exposure to and connections across different disciplines.”
  • Don Glass shares “one of the things that I love about attending the AEA annual conference is getting the opportunity to better understand how my work can relate to and be informed by recent debates and developments in the field.”

Hot Tip: Liz Zadnik, aea365 Curator and sometimes Saturday contributor says “don’t be afraid to let your love of spreadsheets, interview protocols, theories of change, or anything else show!”

Finally, Susan Kistler, AEA Executive Director Emeritus, shares perhaps the most important message about love that we’ve had here on the blog: “Success is made manifest in health and happiness, confidence that you are loved and the capacity to love with others.”

Happy Valentine’s Day!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


We are Rebekah Sobel, Manager, Planning & Evaluation, and Dana Burns, Data Analyst, in the United States Holocaust Memorial Museum’s Office of Planning. The Museum is a dynamic institution with a bold and ambitious strategic plan. We are a small unit responsible for leading and strengthening the Museum’s annual and long-term planning, evaluation, and monitoring practices. One of our top priorities — and greatest challenges — has been standardizing processes, systems, and language across our wide-ranging programs. The Museum is still in the early years of taking a more comprehensive and consistent approach to planning and evaluation. To help our colleagues better prioritize projects, set goals and measure success, templates quickly became our “go-to” tool.

Lessons Learned: Conversation and customization are key!

While every Museum program area has established priority outcomes, there is great variation in how they integrate evaluation and measurement in their work. Over many conversations with our colleagues, we have come to understand their challenges and how they operate, which allows us to mix and match our templates to address their needs.  In meeting our colleagues where they are, our office is building important relationships. These partnerships are key to ensuring our ability to achieve our strategic plan goals and set up our work to measure success. Sometimes, our program colleagues approach us with their own internal evaluation and planning documents that the Museum could use more widely. We will adjust their language and format, and then share the revised version with other potential users for feedback. Occasionally, our collaborative work results in a new tool that can be consistently used across the Museum.

Hot Tips:

  • It is important to remember that templates are a means to an end; they are tools to help us maximize our impact.  In order to be effective, they must address a need or solve a problem, and be both user-friendly and efficient.
  • When responsibility for next steps in evaluation work is vague or when enthusiasm for trying out new templates stalls, we keep the ball rolling and offer hands-on support. Sometimes that means filling out the template the first time or providing more one-on-one guidance.
  • We do the project management for evaluation planning, which includes managing the schedule, setting meetings, sending the meeting notes and any follow-up. We ask our partners for their input as we develop templates, and assign small tasks for them to undertake with their teams so they gain experience using new tools. If they don’t own the template, they won’t use it.
  • We bring candy or other treats to our planning meetings. We also have been told playing music in meetings helps to set a lighter mood; we are trying that next!

Rad Resources:

Here are a few of the templates we have tried and adjusted, while making new friends and co-conspirators to organizational change along the way. Try them yourself, change them up, and let us know what works for you:

 The American Evaluation Association is celebrating Internal Evaluation (IE) Topical Interest Group Week. The contributions all this week to aea365 come from our IE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.



Greetings from Toronto! My name is Roxana Salehi and I am the principal consultant at Vitus Consulting and a Senior Research Associate at the Centre for Global Child Health, Hospital for Sick Children. Today I will share with you four ways you can help your organizations use evaluation data to make strategic decisions.

Lesson Learned 1: Create your own business processes if none exist. Having clearly defined and agreed-upon processes in place is essential for getting useful results that can lead to action. Evaluators can proactively help to create these processes. For example:

  • Create a Terms of Reference document outlining the roles of working group members, the decision-making structure, etc.; this will teach you a lot about how business is done in the place where you are working. In some countries, you may have to pay people to attend meetings – good to know when you are setting your budget! See an example from the Partnership for Maternal, Newborn & Child Health.
  • Create simple protocols to clarify who is doing what, and when, for key tasks, such as data collection. Protocol development makes you consider issues that you may not typically consider. For example, it may not occur to you that in some places certain documents need to be hand delivered.
  • Ask for time on the agenda of meetings, so that evaluation remains an organizational priority.

Lesson Learned 2: Build your utilization-focused philosophy into your plan. In your evaluation plan, list “Utility” as an explicit evaluation standard you want to adhere to. Utility means that evaluation should be planned and conducted in ways that meet the needs of stakeholders and increase the likelihood of results leading to action. The evaluation plan can act as a strong reference document for bringing people back to the question: “So what?” For more on this, see Michael Quinn Patton’s Utilization-Focused Evaluation.

Lesson Learned 3: Let your stakeholders help you create meaning out of data. Instead of holding a “presentation of evaluation results,” consider convening a “data sense-making session.” My experience? Stakeholders love it! It intrigues them; they come in curious and willing to help pull out the most important findings and actions that can be taken. Just make sure you have some worthy questions for your stakeholders, or else it will be just a presentation of results.

Lesson Learned 4: Stay on top of your game. And I don’t mean just in terms of evaluation knowledge; that is a given! Also learn from the data visualization and communication fields so that you can tell a compelling story that enables action. For quantitative data, I found Stephanie Evergreen’s Effective Data Visualization useful, and I am still searching for good resources for qualitative data display.

There you have it. I’d love to hear about other ways you help put data into action!

The American Evaluation Association is celebrating Internal Evaluation (IE) Topical Interest Group Week. The contributions all this week to aea365 come from our IE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


We are Ana Flores and Joshua Paul at Volunteers of America – Los Angeles. At the AEA Evaluation 2016 conference, we presented a panel entitled “A Picture is Worth a Thousand Words…But Will They Use It?”. Today, we want to provide additional information regarding how to make data more user-friendly.

The Evaluation Department at Volunteers of America – Los Angeles (VOALA) is tasked with providing evaluation services as needed to more than 70 social service programs. Staff in these programs are dedicated to helping people, and many find data unappealing. Addressing communication barriers has given us the opportunity to learn a number of lessons.

Lessons Learned #1: Needs Change, Open a Dialog with Succinct Visualizations  

Understanding stakeholder needs and how they fit into a program model is a major part of any evaluator’s task. Unfortunately, we have found that stakeholder needs and program models can change rapidly, and stakeholders do not always volunteer information about these changes.

We were once mystified as to why one of our programs — whose initial purpose was to connect with and refer homeless veterans to local services — had such poor monitoring results. Traditional reporting methods failed to open a dialog that could bring the core problem to light. After months of discussion, we tried a new visualization-based design (see image) that demonstrated the discrepancy between the goal and present performance and prompted program leadership to identify the issue. The staff had been focused on the transportation of clients to appointments, a secondary program activity, which had not originally been designated as important to track.

[Images: the visualization-based report design showing the discrepancy between the program goal and present performance]
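Since the original images can’t be reproduced here, a minimal matplotlib sketch of the idea may help: a layered bar chart that makes the gap between a goal and actual performance hard to miss. The activity names and numbers below are hypothetical, not the program’s real figures.

```python
# A hedged sketch of a goal-vs-actual visual; all values are invented.
import matplotlib.pyplot as plt

activities = ["Referrals to local services (primary)",
              "Transportation to appointments (secondary)"]
goal = [100, 20]    # hypothetical monthly targets
actual = [45, 85]   # hypothetical actual counts

fig, ax = plt.subplots(figsize=(7, 3))
y = range(len(activities))
ax.barh(y, goal, color="lightgray", label="Goal")                  # wide, light bars
ax.barh(y, actual, height=0.4, color="steelblue", label="Actual")  # narrow overlay
ax.set_yticks(y)
ax.set_yticklabels(activities)
ax.set_xlabel("Clients served this month")
ax.set_title("Goal vs. actual, by program activity")
ax.legend()
fig.tight_layout()
plt.show()
```

A chart like this surfaces the mismatch at a glance: the secondary activity runs far above its target while the primary activity lags.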

Lessons Learned #2: Only Show What You Need to Show


Past reports for many of our programs provided detailed data, presenting every single outcome for individual clients. However, this level of information was not necessary for program performance discussions and distracted from the overall outcomes included in the report. Using Tableau, we removed the detailed information and kept only overall outcome percentages and targets on the graph. With outcomes presented this way, VOALA upper management was able to get the information they needed to make program recommendations and help program directors implement better practices.
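The data step behind a summary view like this is a simple aggregation. Here is a pandas sketch of collapsing client-level detail into the overall percentages the report keeps; the column names, programs, and the 80% target are illustrative, not VOALA’s actual schema.

```python
# Collapse client-level outcome records into program-level percentages.
# All column names and values here are hypothetical.
import pandas as pd

clients = pd.DataFrame({
    "program": ["A", "A", "A", "B", "B", "B"],
    "client_id": [1, 2, 3, 4, 5, 6],
    "outcome_met": [True, True, False, True, False, False],
})

summary = (
    clients.groupby("program")["outcome_met"]
    .mean()                       # share of clients meeting the outcome
    .mul(100)
    .round(1)
    .rename("pct_outcome_met")
    .reset_index()
)
summary["target_pct"] = 80        # hypothetical program target
print(summary)                    # only the roll-up reaches the report
```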


Lessons Learned #3: Use Interactivity

Giving your audience an opportunity to control the data makes it easier for them to make inferences about the information. Visual analysis programs, like Tableau, allow us to provide interactive reports so that upper management and program directors can filter results by key demographics or periods of time, depending on what is useful to them.
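Outside of Tableau, the essence of that interactivity can be shown in a few lines: the viewer picks a slice, and the summary is recomputed for that slice. The columns below are hypothetical.

```python
# A tiny illustration of viewer-driven filtering; column names are invented.
import pandas as pd

outcomes = pd.DataFrame({
    "quarter": ["Q1", "Q1", "Q2", "Q2"],
    "gender": ["F", "M", "F", "M"],
    "outcome_met": [True, False, True, True],
})

def summarize(df, **filters):
    """Percent of clients meeting the outcome for a chosen slice."""
    for col, val in filters.items():
        df = df[df[col] == val]
    return round(df["outcome_met"].mean() * 100, 1)

print(summarize(outcomes))                  # all clients
print(summarize(outcomes, quarter="Q2"))    # one time period
print(summarize(outcomes, gender="F"))      # one demographic
```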


Having these types of “quick snapshot” visualizations has helped upper management at VOALA communicate recommendations with programs.

Experimenting with these different data visualization techniques has improved our discussions with key staff, helping us ask hard questions while reducing staff resistance to data. Otherwise, the response to “Why is this benchmark never reached?” might just be silence.

The American Evaluation Association is celebrating Internal Evaluation (IE) Topical Interest Group Week. The contributions all this week to aea365 come from our IE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi! We are Laura Beals, Director, and Barbara Perry, Evaluation Manager, of the Department of Evaluation and Learning at Jewish Family and Children’s Service Boston, a multi-service nonprofit in Massachusetts. At Eval 2015, we learned about “Data Placemats” from Veena Pankaj of the Innovation Network. Recently, we held several placemat-focused “Learning Conversations” with one of our multi-program divisions. We created seven placemats for these meetings:

  1. An overview of the Learning Conversation and placemat process.
  2. Client census—new and active—over the past four years for each program.
  3. Client demographics by program.
  4. Client geographic distribution heat map. This placemat was interactive, using Tableau. We wanted not only to show the geographic distribution of clients in Massachusetts, but also to provide an opportunity to explore the data further, through the use of filters for program and key demographics.
  5. A network analysis showing referral sources.
  6. A network analysis showing how clients were served by multiple programs at the agency.

[Image: sample placemats, including the network analyses]

  7. A learning and dissemination plan. This placemat encouraged meeting participants to use the data and allowed our team to create specific follow-up documents and undertake follow-up analyses.

Lessons Learned:

  • During the planning stages, check in with stakeholders from around the organization. We asked the program director, division director, grant writers, and development associates what they wanted to learn about the division. Their responses allowed us to tailor the placemats to be as useful to as many people as possible.
  • Don’t forget to include the staff! To share the placemats and get feedback from direct-service staff, we held a shorter placemat discussion at an all-staff meeting, focusing on two placemats and providing the others for later review. We also hung the placemats near the staff offices and provided sticky notes for feedback and observations.
  • Be ready to “go on the road” with your placemats. We found that word spread about our placemats, and there was interest from various stakeholders who had not been able to be part of the original few meetings. By continuing the conversations, we were able to increase learning and generate new ideas.
  • Bring data chocolates! We had been waiting for an opportunity to create data chocolates, after being inspired by Susan Kistler. We wrapped shrunken versions of several of the graphs around chocolates. They put everyone in a good mood to talk data—the lightheartedness of our gesture helped break down barriers and was a great conversation starter.

Rad Resources:

The American Evaluation Association is celebrating Internal Evaluation (IE) Topical Interest Group Week. The contributions all this week to aea365 come from our IE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Stanley Capela and I am the Vice President for Quality Management and Corporate Compliance Officer at HeartShare Human Services of New York.

I have been an internal evaluator for 38 years in the non-profit sector and have reviewed more than 115 organizations in 35 states and four countries as a Peer Reviewer for the Council on Accreditation. Through those experiences, I have come to realize the importance of communication skills in conducting any evaluation. I offer the following tips when conducting reviews.

First, be contextual. Take the time to understand the world of the program and, more specifically, the staff’s work. This context gives the internal evaluator a better understanding of how to approach the evaluation and, more importantly, how to present the evaluation’s value so that it has a positive outcome.

Second, communicate in a language that the stakeholder understands. Very often when conducting internal evaluation and presenting the results, evaluators get caught up in their own world and forget that the primary role of evaluation is to assist key stakeholders — specifically, the leadership — in developing a better understanding as to whether or not the program is achieving its goals. If not, what are the issues and how can the evaluator help resolve them?

Finally, carefully choose your words. The language that you use for your evaluation will impact how the stakeholder interprets your findings. Further, it also has an effect on whether or not you will face resistance to the evaluation. For example, when I conduct reviews, I like to use the word “challenges” rather than “deficits.” Also, stakeholders often view evaluation as a tool to identify program weaknesses, so I will make an effort to identify program strengths. These are basic strategies to ensure that stakeholders are more receptive to the evaluation.

Rad Resources: To learn more, I refer you to Michael Quinn Patton’s work on Utilization-Focused Evaluation and David Fetterman’s work on Empowerment Evaluation for further understanding of the importance of communication skills in evaluation. In addition, I refer you to a presentation that I made at last year’s AEA Conference titled “Turning Communication into Collaboration: The Development of an Outcomes-Based Management Training Program.” Finally, check out the Council on Accreditation website at www.coanet.org and look at their Performance and Quality Improvement standards and tools.

The American Evaluation Association is celebrating Internal Evaluation (IE) Topical Interest Group Week. The contributions all this week to aea365 come from our IE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Welcome to Internal Evaluation week, hosted by the Internal Evaluation TIG! Thanks to all of the internal evaluators who contributed to this week’s blogs, and to Annie Gleason (Good Shepherd Services in New York City) for coordinating the work! This week is dedicated to sharing insights and tools from evaluators across the country (and Canada!) about how they help organizations put data and evaluation findings into action. Blogs this week offer a broad range of suggestions for helping stakeholders use data, drawing on technology, data visualization, communication and framing, and participatory methods.

Hello! I am Jay Szkola, senior program analyst in the Strategy, Evaluation and Learning Division at Good Shepherd Services. As an internal evaluator, I have the opportunity not only to evaluate the programs in my portfolio, but also to assist staff in using data in their day-to-day work with participants. This doesn’t have to involve a great deal of complexity; it can be as simple as developing tools that translate data into an accessible format.

All of the community-based youth justice programs at our agency use the Positive Youth Development Inventory as a tool to help evaluate our impact on key youth development constructs, such as friendship and future orientation. We use the PYDI to assess our participants at baseline, and gauge pre-to-post changes upon program completion. These insights are useful for program planning and advocacy.

I also wanted to make sure staff could use the results to inform their work with participants directly, but initially experienced limited success with this goal. One barrier was that while staff liked the survey, they were unsure how to translate individual survey responses into insights and action.

Cool Trick:

In response, I constructed a simple Excel sheet to help with this translation. Program staff enter participant survey responses and get back easy-to-read graphs showing how the participant scored on each construct. As staff enter the data, a table aggregates the responses by construct, which then links to the graphs.
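For readers who want the gist without the workbook, here is a rough Python analogue of what the Excel sheet does; the item-to-construct mapping and the responses below are invented for illustration and are not the actual PYDI scoring key.

```python
# Aggregate one participant's item responses by construct, then chart them.
# The construct-item mapping and responses are hypothetical.
import matplotlib.pyplot as plt

construct_items = {
    "Friendship": ["q1", "q2"],
    "Future orientation": ["q3", "q4"],
}
responses = {"q1": 4, "q2": 5, "q3": 2, "q4": 3}   # 1-5 scale

scores = {construct: sum(responses[item] for item in items) / len(items)
          for construct, items in construct_items.items()}

plt.bar(list(scores.keys()), list(scores.values()), color="seagreen")
plt.ylim(0, 5)
plt.ylabel("Mean response (1-5)")
plt.title("Participant construct scores (illustrative)")
plt.tight_layout()
plt.show()
```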

Program staff and leadership liked the tool so much that when the NYC Department of Probation adopted the PYDI as a measure, the Excel tool was shared and is now used by other youth justice programs in the city.

[Images: the PYDI template’s input sheet and output graphs]

Lessons Learned:

Offer a guide. For those who do not have a high comfort level with data, colorful graphs alone will not bring about understanding. The Excel tool therefore includes a brief, jargon-free definition, a sample question, and a list of questions for each construct on the survey.

Think about multiple uses to get maximum impact. Whatever the data source (surveys, administrative data, etc.), talk to staff about how they could use the data. Once staff in our youth justice programs began seeing the PYDI as something they could use in day-to-day practice, it became less of a task to be completed “for the evaluators” and instead, a shared project.

The American Evaluation Association is celebrating Internal Evaluation (IE) Topical Interest Group Week. The contributions all this week to aea365 come from our IE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Greetings, AEA Members. We are Jennifer Greene (Chair; University of Illinois, Urbana-Champaign), Jim Rugh (Independent Consultant), Maurice Samuels (MacArthur Foundation), Nicole Vicinanza (JBS International), and Anne Vo (University of Southern California) — your 2017 Nominations Working Group. We have the pleasure of announcing the Call for AEA Board Nominations and encouraging you to consider getting involved in our beloved organization by serving on the AEA Board.

Opportunities to Serve

The 2017 slate calls for nominations for President-elect and three Board Members-at-Large.

Time Commitment

The President serves a three-year term in the roles of President-elect (2018), President (2019), and Past President/Secretary (2020). The President presides over all Board meetings.

Board Members-at-Large will serve three-year terms, starting January 1, 2018, and are expected to attend three Board meetings per year.

In addition to meeting attendance, each Board member and the President actively participate in the work of the Board, engage in ongoing email communications, and liaise with Board Task Forces, Working Groups, or other Board-focused volunteer groups.

Qualifications

Particularly strong candidates for these leadership positions are individuals who have:

  • Been active members of AEA for a minimum of three years,
  • Served in leadership roles within AEA units such as TIGs, Working Groups, Task Forces, Local Affiliates, and others,
  • Contributed to and served the evaluation profession in other ways.

Note that candidates who would like to be considered for the post of President-elect must be AEA members based in the United States. Prior service on the Board is preferred, though not required. All candidates’ availability and accessibility are also considered.

Diversity

AEA and the Leadership Team are committed to nourishing and establishing diversity in the governance of the association. Particular attention is paid to adequate representation and balance according to the following criteria:

  • Gender (including nondiscrimination based on sexual orientation or gender identity)
  • Racial/ethnic representation
  • Disciplinary heterogeneity
  • Practitioner/academic balance
  • Geographic heterogeneity (within the United States)
  • International representation and perspectives
  • Heterogeneity of areas of application

Learning More

AEA relies on members to provide leadership, and values a large and diverse pool of member leaders! To learn more…

  • Tune into the Nominations Working Group’s Coffee Break Webinar on Thursday, February 9th, at 11am PST (2pm EST) to learn more about:
    • What is involved in running for and serving on the AEA Board of Directors
    • What the nominating committee looks for in potential nominees
    • How you can become involved in AEA leadership
    • Paths to serving on the AEA Board
  • Reach out to current Board Members to learn about their experiences.

Navigate to AEA’s Resource Page to access materials that describe how to prepare a nomination packet for yourself or others. And contact us! We are glad to help answer questions about the nomination process. If we do not have the answers, we can certainly offer suggestions for whom to ask.


My name is John LaVelle, from Louisiana State University. The following is an interactive script I developed to help my students and stakeholders inductively discover the value of triangulation. The example focuses on inquiry methods and can easily be adapted to other forms of triangulation.

Script: Let’s start from the premise that the inquiry process is supposed to help you (the stakeholder) learn in-depth information about something important. A process, a product, an idea: the possibilities are endless. It could be an educational program, the process of new employee onboarding, clients’ awareness of new services, or the aesthetic quality of my shoes. I selected this image to represent that thing:

[Image: a large square representing the construct]

This image represents the complete truth of a phenomenon, and the borders represent the boundaries of the construct (which can’t really be known).  In this example, we will assume that this construct is orthogonal, so there isn’t any overlap between it and other concepts.  Assume we are not concerned with funding or politics (yet).

Now we study it with systematic inquiry (blue circle).   Let’s call the blue circle a survey, and let’s assume it gets right at the center of the thing we’re studying.  Everything within the circle represents increased understanding.

[Image: the square with one blue circle (a survey) near its center]

Not too bad, huh? We learned a lot about the phenomenon from just one study, and it doesn’t look like we picked up much information from other, neighboring concepts. Every data collection tool will have some error (or a similar concept), and that can lead to challenges later when you try to make sense of things. Now, this survey did a good job, but it does not represent the entirety of the phenomenon. Let’s try another study to understand it some more, using the same method (a survey).

[Image: the square with two overlapping blue circles]

Well, this looks different.  Not bad, just different.  We learned some new information that helps paint a fuller picture of the phenomenon, and that reinforces some ideas learned from the original study.

Note to facilitator: This can be a place to plant the idea of measurement error by drawing attention to where the circle goes beyond the boundaries of the square.

[Image: the square with three blue circles grouped toward its right side, one extending beyond the square’s boundary]

Third time is the charm! It seems like survey methods are painting a robust representation of the construct. Every time we use a survey, we increase validation and trustworthiness. An observation: our inquiry methods seem to be grouping on the right side of the construct, and a rather large region of the phenomenon remains unexplored. Could another approach provide different information and explanation?

[Image: the square with the three blue circles plus an orange circle (interviews) covering part of the unexplored region]

It looks like a different method, such as interviews (orange circle), provided information that helped explain an unexplored region of the construct and reinforced the first and second studies. That is interesting. We should use at least one more approach to really pull things together, especially if we’re trying to learn about something that has direct implications for someone’s health, functioning, economic status, etc.

[Image: the square now largely covered by five overlapping circles drawn from three methods]

Now things are really starting to pull together.  The central aspects of the construct were reinforced through five examinations using three methods, so it looks like the data are trustworthy and seem to be telling a reasonably consistent story.  Excellent work!
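For quantitatively minded students, the metaphor can even be made concrete. A small Monte Carlo sketch can estimate how much of the “construct” (the unit square) the accumulated studies (the circles) have covered; the circle placements below are invented purely for illustration.

```python
# Estimate construct coverage by sampling random points in the unit square.
# Circle centers and radii are invented for illustration.
import random

circles = [  # (x, y, radius): three surveys, interviews, one more method
    (0.65, 0.55, 0.30), (0.75, 0.35, 0.28), (0.70, 0.75, 0.25),
    (0.30, 0.45, 0.30), (0.45, 0.70, 0.25),
]

def covered(x, y):
    return any((x - cx) ** 2 + (y - cy) ** 2 <= r ** 2 for cx, cy, r in circles)

random.seed(1)
n = 100_000
hits = sum(covered(random.random(), random.random()) for _ in range(n))
print(f"Estimated coverage of the construct: {hits / n:.1%}")
```

Adding or removing a circle and re-running makes the marginal payoff of each additional study, and each additional method, visible.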

This was a lot of information, and there are some important implications here.  What could happen if:

  • You are only familiar with one approach to answering questions?
  • Your stakeholder(s) value one kind of information (quantitative or qualitative) over another?
  • Money for the study or evaluation is a concern? How will you prioritize?
  • One kind of data is more expensive to collect and analyze than another?
  • The inquiry methods aren’t well-refined?

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
