AEA365 | A Tip-a-Day by and for Evaluators

Graduate Education Diversity Internship Program

Hi, we are Ashaki Jackson, Stewart Donaldson, and John LaVelle, the leaders and coordinators of the American Evaluation Association's Graduate Education Diversity Internship (GEDI) program. November 24-29 is the GEDI-sponsored week, and we thought it best to share some important resources about the GEDI program itself (www.eval.org/GEDI).

Hot Tip: Program Overview. The GEDI program engages and supports MA and PhD students from groups traditionally underrepresented in the field of evaluation, draws from a range of disciplines, and is centered on the principles of culturally responsive evaluation. Interns take part in a range of training and professional development opportunities, such as the Claremont Evaluation Center's August Workshop Series (2013 schedule available at www.cgu.edu/workshops), AEA's annual conference (www.eval.org), a winter training conference, and the AEA/CDC conference in Atlanta. Interns also participate in webinars, network with established evaluators, and complete an evaluation project for their placement site.

Hot Tip: Encourage your students and degree-seeking colleagues to apply to the GEDI program! Applicants can come from any field that relates to evaluation, and it is exciting to see the range of professional and personal interests among people who want to learn more about culturally responsive evaluation. The call for intern applications will be distributed in spring 2014. Every year we enjoy seeing the strong pool of applications, and we only wish we had enough placement sites to accept everyone!

Hot Tip: Get your organization involved!  We are always interested in forging relationships with new internship sites across the United States.  Some recent sites have focused on educational evaluation, others on policy evaluation, and still others on evaluating outreach efforts or human services, to name just a few. If you are interested in working with the GEDI program and helping shape the next generation of evaluators, we encourage you to connect with us at gedi@eval.org, or watch for the Call for Applications, which will go out in the spring for the 2014-15 cohort.

Hot Tip: 2013 marked a very important year for the GEDI program: our 10th anniversary. Special thanks to all the program leaders, interns, alumni, and evaluation community members who have helped support this important program. With their (and your) support, we look forward to celebrating 20 years in 2023!

The American Evaluation Association is celebrating Graduate Education Diversity Internship (GEDI) Program week. The contributions all this week to aea365 come from AEA's GEDI Program and its interns. For more information on GEDI, see their webpage here: http://www.eval.org/GEDI. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Hello, I am Dawn Henderson, a doctoral student in the Psychology in the Public Interest program at North Carolina State University and a 2010-2011 fellow of AEA's Graduate Education Diversity Internship (GEDI) program. I am currently working as an evaluator with a community-based organization and wanted to share some tips on managing boundaries. What happens when "role slippage" occurs, and the organization views us not only as evaluators but also as technical advisor, grant writer, and public relations consultant? In this post, I offer three tips that I hope will assist you in your evaluation work.

Hot Tip 1: Know your boundaries. Clearly outline to the organization the role you will play and the skills you offer in your capacity as an evaluator. Depending on your background, an organization may look to you as an expert across a variety of issues, and you have to make clear that you are NOT there to serve in that capacity. During this process, you and the organization outline the specifics of what you will do, how you will do it, and how findings will be disseminated to the organization and the larger community. Most often this is captured in a contract, memorandum of understanding, or similar agreement.

Hot Tip 2: Resist the gravitational pull. Depending on your background, you may share an interest in the services the organization provides and be pulled into wanting to contribute to an altruistic goal. I believe Stake (2004) called it a "confluence of spirit." Although that is an important goal, you cannot become so immersed in what the organization is doing that you forget you are there to evaluate the program, including its positive and negative processes and outcomes. As an evaluator, you should work to communicate all outcomes effectively and accurately.

Hot Tip 3: You cannot "do" everything. Depending on your background, you may have a variety of skills that interest the organization you are working with. They may ask you to help write, review, and submit a grant. You may be excited about the opportunity and tempted, but this extends your work beyond the contractual agreement and can distract from achieving the original goals of the evaluation. Go back to your stated expectations and communicate to the organization that this kind of work must be renegotiated in a new agreement and fit within the parameters of the evaluation timeline.

Rad Resources: I found Volume 108 of New Directions for Evaluation on Evaluating Nonformal Education Programs and Settings, and the wisdom of Stake’s 2004 article in the American Journal of Evaluation to be highly useful in generating these tips.



Hello AEA! This is Tamara Williams and Ciara Zachary, and we are members of the seventh cohort of the Graduate Education Diversity Internship (GEDI) Program. Tamara is a doctoral student in the Department of Sociology at the University of Colorado-Boulder. Ciara is a doctoral candidate in the Department of Health, Behavior and Society at the Johns Hopkins Bloomberg School of Public Health. For her internship, Tamara is working on campus at Community Health, focusing on the sexual health information and education needs of women of color. Ciara is assisting the Annie E. Casey Foundation and Carson Research Consultants with the evaluation of the Elev8 program in Baltimore City schools. While our internships serve different populations, we are both using focus groups as a tool to evaluate intermediate outcomes for our respective projects. Moreover, the current GEDI interns engaged in an AEA-sponsored focus group project during last year's conference in San Antonio, so we wanted to share some lessons and tips that we, as emerging evaluators, have learned about focus groups.

Hot Tip: Focus groups can be very practical for small projects or solo evaluators. They are cost-effective, and they may not always require transcription.

With the Elev8 program, focus groups will be conducted with both students and parents to understand their perspectives on the program. Ciara is designing focus group guides to help gather data concerning student and parent awareness, use, and opinions of Elev8.

Hot Tip: Conducting focus groups in the early stages of the program is useful because group dynamics can yield information that students and parents may not want to share in a face-to-face interview or even while completing a survey. Most of all, responses will guide future program activities to ensure that Elev8 meets its long-term goals.

Hot Tip: When conducting focus groups with different stakeholder groups such as parents, children, and vulnerable populations, it's important to remain cognizant of group dynamics, ethical considerations, the ordering of questions, and even whether or not the focus group feels like a discussion. Ensuring that group members are comfortable with one another and with the moderator, and that they do not feel exploited or examined, can help identify group norms and priorities and even empower community members (Kitzinger, 1995). Additionally, the data collected can improve both the program and the evaluation process.

Finally, is a focus group a good data collection method for your project? Here are some questions to ask yourself:

1. What types of information do I need to collect to answer my evaluation question(s)?

2. Who will be asked to provide the information, and is it feasible to gather them together for a conversation?

Sources/Resources

1. Kitzinger, J. (1995). Qualitative research: Introducing focus groups. British Medical Journal, 311(7000), 299.

2. O'Sullivan, R. G. (2004). Practicing Evaluation: A Collaborative Approach. Thousand Oaks, CA: Sage Publications.



Our names are Dawn Henderson and Ebun Odeneye. We are members of the 2010-2011 cohort of AEA's Graduate Education Diversity Internship (GEDI) Program. New internal evaluators often struggle to balance general program duties with evaluation-specific responsibilities, so we will share some effective tips for evaluators facing this challenge.

Hot Tip: As funding for new and existing programs continues to dwindle, more evaluators find themselves negotiating two roles: part program developer, part program evaluator. This is particularly true for evaluators who function within the context of the organization and its programs, i.e., internal evaluators. In this case, offering expertise in designing a program is easily confounded with evaluating the program's quality and effectiveness. Balancing these two roles can be beneficial from the evaluation perspective for a few reasons:

1) The evaluator has an established rapport among key stakeholders;

2) The evaluator has insight into the inner workings of the program and/or organization;

3) The evaluator can integrate evaluation throughout various stages of the program; thus, evaluation is embedded within the entire program rather than seen as an external process; and

4) It is cost-effective and time-efficient.

Hot Tip: Furthermore, when evaluators work within an organization, they can facilitate a collaborative and inclusive approach to the whole process, from program development to evaluation. While there are numerous advantages here, the situation can be quite challenging. Therefore, to make the best use of their time and effort for the program, evaluators should employ the following strategies while adhering to the guiding principles of evaluation:

1) Organize teams for each phase of the program and evaluation (if applicable), i.e. formative/planning, development, implementation, and joint/overall program team;

2) Delineate clear roles and expectations of team members under each phase by developing clear in-house job descriptions with tasks and responsibilities for the team and its members;

3) Specify the percentage of each team member's work hours designated for program-related versus evaluation-related work, e.g., 20% in Year 1, 30% in Years 2-3, 40% in Years 4-5;

4) Train other program staff in order to build evaluation expertise and capacity, facilitate organizational growth, and further overall understanding of the "evaluator role"; and

5) Outline specific benchmarks of each phase of the program (from planning through implementation) and evaluation.

References/Resources:

American Evaluation Association (2004). Guiding Principles for Evaluators.

O'Sullivan, R. G., & O'Sullivan, J. M. (1998). Evaluation voices: Promoting evaluation from within programs through collaboration. Evaluation and Program Planning, 21(1), 21-29.

Patton, M. Q. (1994). Developmental evaluation. Evaluation Practice, 15(3), 311-319.



We are Alison Mendoza and Yusuf Ransome, interns in AEA's Graduate Education Diversity Internship (GEDI) program. GEDI interns are graduate students from various social science disciplines who, through a nine-month internship and participation in workshops throughout the year, gain hands-on experience in evaluation. GEDI interns also receive support from experienced mentors both at their internship site and at their school. Alison is a second-year master's student at the University of North Carolina's Gillings School of Global Public Health in the Department of Social and Behavioral Sciences. Yusuf is a second-year doctoral student at Columbia University's Mailman School of Public Health in the Department of Sociomedical Sciences.

For us, attending the annual conference was gratifying and helped us realize our potential as future evaluators. Evaluation felt "demystified"; in essence, we came to understand what it was all about. In particular, moderating his very first focus group at the conference left Yusuf feeling that there was a place for him in evaluation.

Based on what we learned at the conference, we want to offer a few hot tips and cool tricks for grad students and new evaluators.

Hot Tip: Make connections! Evaluation, like many fields, is largely based on who you know. Although it can be intimidating to walk into a room of seasoned professionals, we've found that most people will eagerly engage in conversation, sharing advice and information about opportunities for new graduates entering the field of evaluation. Immerse yourself in as many activities and events as you can. Also, invest in business cards!

Hot Tip: Always have your resume/CV updated and ready to distribute. A new acquaintance may ask you to send it to them, and may even volunteer to hand-deliver it to HR.

Hot Tip: Coordinate and share resources. The conference can be very overwhelming; there are simply too many events for one person to attend them all. If you are attending with friends, split up and coordinate which sessions each of you attends. That way you can share resources with each other afterward.

Cool Trick: Use what you know! Alison is finding that her past public health experience in participatory workshop facilitation, community engagement, and program planning has helped her develop the skills necessary for collaborative evaluation.

Thank you for reading about our experience; we look forward to seeing you at the conference next year!



I am Frances Carter, a Public Policy PhD candidate at the University of Maryland, Baltimore County. As a member of AEA's 2009-2010 cohort of GEDIs (pronounced "Jedi"; the acronym stands for Graduate Education Diversity Internship), I developed an evaluation plan for the Annie E. Casey Foundation's Race Matters Toolkit. The Race Matters Toolkit (RMT) was designed to help address embedded racial inequities and produce racially equitable results and opportunities for all children, families, and communities. The toolkit is based on the assumptions that race matters in creating opportunities for all and that embedded racial inequities present the greatest barrier to equitable opportunities and results. Through the RMT, the Annie E. Casey Foundation aims to help organizations make the case, shape the message, and do their work from the perspective that race matters. After the toolkit had been in use for several years, there was a need to evaluate the RMT and its work.

Top 5 Lessons Learned in Developing the RMT Evaluation Plan:

  1. Evaluations in large non-profit and government organizations are often conducted by external evaluators and managed by evaluation teams within the organization.
  2. Research methodologies learned in academic environments need to take a more practical approach when applied to evaluations in results-driven environments.
  3. Various evaluation approaches (e.g., culturally relevant and responsive evaluation and participatory evaluation) were used to develop a comprehensive RMT evaluation proposal.
  4. Standard evaluation tools such as logic models, theories of change, and evaluation proposals are critical for new evaluators, both to understand the program being evaluated and to develop appropriate evaluation plans.
  5. Work in academic, non-profit, government, and private organizations is often, and should continue to be, integrated to address social problems.

Hot Tip: When I meet current or future graduate students interested in evaluation, I point them to AEA's programs for developing new and diverse evaluators. These include the GEDI program, AEA's Type I and II Travel Awards, and the various Topical Interest Groups within AEA, specifically the Multiethnic Issues in Evaluation TIG. After participating in the Pipeline Program in 2007 (which serves students local to the conference locale and is being revamped for 2011), presenting at the Evaluation 2008 and 2009 conferences, and serving as a GEDI, I gained valuable evaluation experience. I feel fully aware of and engaged in the AEA community, as well as national and local evaluation communities, with an understanding of how evaluation helps address important social problems in the United States and internationally.

Rad Resource: The Annie E. Casey Foundation has made the Race Matters Toolkit available for free download online. To assist with the RMT evaluation efforts, please notify the Foundation at racematters@aecf.org if and when you use the toolkit.

Rad Resource: RMT component and evaluation information are available in AEA’s eLibrary.

