We are Valerie Hutcherson and Rebekah Hudgins, Research and Evaluation Consultants with the Georgia Family Connection Partnership (GaFCP) (gafcp.org). Begun with 15 communities in 1991, Family Connection is the only statewide network of its kind in the nation, with collaboratives in all 159 counties dedicated to the health and well-being of families and communities. Through local collaboratives, partners come together to identify critical issues facing the community and to develop and implement strategies to improve outcomes for children and families. GaFCP strongly believes that collaboration and collective effort yield collective impact. Evaluation has always been a significant part of Family Connection, though evaluation capacity differs greatly from one local collaborative to another.
In 2013, GaFCP invited six counties to participate in a cohort focused on early childhood health and education (EC-HEED) using the Developmental Evaluation (DE) framework developed by Michael Quinn Patton (Patton, 2011. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use). GaFCP identified each county based on need and interest in developing an EC-HEED strategy, and each county had the autonomy to identify collaborative partners, programs, and activities to create a strategy tailored to its own needs and resources. As evaluators, we recognized that the collaboratives and their strategy formation existed in a complex system with multiple partners and no single model to follow. The DE approach was the best fit for capturing data on the complexity of the collaborative process of developing and implementing these strategies. DE allows for and encourages innovation, which is a cornerstone of the Family Connection collaborative model. Further, this cohort work gave us, as evaluation consultants, the unique opportunity to implement an evaluation system that treated understanding this complexity and innovation as being as important as collecting child and family outcome data. With DE, the evaluator’s primary functions are to elucidate the innovation and adaptation processes, track their implications and results, and facilitate ongoing, real-time, data-based decision-making. Using this approach, we were able to engage in and document the decision-making process, the complexity of the relationships among partners, and how those interactions affect the work.
Lessons Learned: A few of the lessons we’ve learned are:
- Participants using a DE approach may not recognize real-time feedback and evaluation support as “evaluation.” Efforts must be made throughout the project to clarify the role of evaluation as an integral part of the work.
- Successful DE evaluation in a collaborative setting requires attention to the needs of individual partners and organizations.
- The DE evaluator is part anthropologist and thus must be comfortable in the emic-etic (insider-outsider) role: a member of the team as well as the one elucidating the team’s practice and work.
We’re looking forward to October and the Evaluation 2016 annual conference all this week with our colleagues in the Local Arrangements Working Group (LAWG). Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.
Hi Valerie & Rebekah,
My name is Jessica, and I am currently taking a Professional Master of Education course on Program Inquiry and Evaluation. I was drawn to your article because I have a degree in Child Studies and currently work as the Head of Athletics at my school, and thus have an interest in health education. In addition, in my current course we have been learning about and discussing process use.
I recently commented online on a post a classmate had made about the shift in popular frameworks from findings use to process use in evaluation. His research on the subject indicated that stakeholders reported learning more about programs and were much more invested in the evaluation process when they were engaged from beginning to end, and that they gained a much more positive attitude toward evaluation. After I contributed my thoughts on the topic, our instructor mentioned an evaluation approach that utilizes process use, called Developmental Evaluation. I hadn’t heard of this framework before and, as such, was excited to read your article when I saw the approach mentioned in its title.
Thus far, I have gained an understanding that process use is rooted in collaboration between stakeholders and users, and that this fosters a much more meaningful evaluation process since it is continuous. This is, in turn, more effective than findings use, as this framework provides more immediate feedback throughout the program. You stated in your article that successful “DE evaluation in a collaborative setting requires attention to the needs of individual partners and organizations.” I was wondering if you could elaborate, or provide an example of some of the needs stakeholders might want addressed. I am currently conducting my own program evaluation and am quite interested in this concept of facilitating continuous, real-time, data-driven decision-making, but I also want to ensure that the administration feels its needs are being met and, more importantly, valued.
Thank you in advance for your insight and reply,
Jessica
Hello Valerie and Rebekah,
As a current Master of Education student in Canada, I was required to review articles found on AEA365. My task for this particular course (Program Inquiry and Evaluation) was to select an article that interested me and write to the author(s).
I selected your article because I have a keen interest in Patton’s work. Having reviewed some of his material for my course, I identify most readily with the Developmental Evaluation approach. Working as an administrator in an elementary school, I was intrigued by your article because it ties together Early Childhood Education and Developmental Evaluation. My personal philosophy is similar in nature, and I believe that for evaluation to be effective in education it needs to be, as Patton describes, “… rapid, real time interactions that generate learning, evolution, and development …” (Patton, 2011. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use).
I further identify with the need to conduct DE in a collaborative, inclusive setting, and I value your reflection that “Successful DE evaluation in a collaborative setting requires attention to the needs of individual partners and organizations.” (Hutcherson and Hudgins, 2013. Building an Early Childhood Health and Education Strategy Using a Developmental Evaluation Approach).
Overall, an interesting read. Thank you.
Kind Regards,
Sara MacDonald