We are the 2016 AEA Minority Serving Institutions (MSI) Fellows: Cirecie West-Olatunji (counselor education-Xavier University of Louisiana), Jieru Bai (social work-University of Nebraska at Omaha), Kate Cartwright (health administration-University of New Mexico), Smita Shukla Mehta (special education-University of North Texas), and Chandra Story (public health-Oklahoma State University).
It seems like only a few weeks ago that we shared our biographical and personal goal statements and listened to our evaluation mentor, Art Hernandez, share a (long!) list of reading resources. Hailing from diverse academic disciplines, we wondered how we would integrate seemingly disparate ideas and philosophies to jointly construct a presentation for the annual conference. After 12 months of biweekly telephone conference calls, the week-long AEA Summer Institute, a joint AEA conference presentation, and life changes (e.g., Smita was promoted to the rank of Full Professor, and Jieru had a beautiful 7 lb, 8 oz. baby boy), we now share key lessons learned from our multidisciplinary thinking.
#1: Set Aside Time to Read
We are often too busy to set aside time for reading, reflection, and dialogue with others. Being involved in this fellowship, I found it critical to schedule time to acquire knowledge that I could integrate into my existing skill set.
#2: Evaluation Can Be Creative
Prior to this fellowship, I thought that data collection methods for culturally responsive evaluation were limited. My learning experiences through the AEA conference and the summer institute changed my paradigm! There are many creative approaches to evaluation, including ripple effects mapping. These approaches provide proper context for evaluation while honoring communities.
#3: Transcend Disciplinary Boundaries
As a relatively new evaluator, I learned to always remember that evaluation theory and practice transcend disciplinary boundaries. When planning an evaluation, I now look beyond practices in any one discipline. A good starting place is the AEA website!
#4: Distinguish Research Methods from Program Evaluation
While I acknowledged a difference between research methods and program evaluation, the distinction became clearer after the summer institute and AEA conference. Evaluation design demands strong technical skills in mixed-methods data collection and analysis. Conducting an evaluation also requires social skills (e.g., trust, compassion, connection, communication, facilitation) to connect with stakeholders.
We are grateful to the AEA community for creating the MSI Fellowship program. Thanks to you, we can continue crystallizing our evaluation identity and competence.
The American Evaluation Association is celebrating AEA Minority Serving Institution (MSI) Fellowship Experience week. The contributions all this week to aea365 come from AEA's MSI Fellows. For more information on the MSI fellowship, see this webpage: http://www.eval.org/p/cm/ld/fid=230. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to firstname.lastname@example.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.