AEA365 | A Tip-a-Day by and for Evaluators


Hi, we’re Audrey McIntyre and Michael Prideaux. We recently completed internships at The Improve Group, an evaluation consulting firm based in Minnesota.

As new evaluators, we bring fresh perspectives from serving on evaluation teams during our internships. We worked with The Improve Group colleagues and clients on projects for the Angel Foundation, the Highland Friendship Club, and the Minnesota Department of Agriculture. In our work, we were truly members of the evaluation teams: we designed surveys, conducted interviews, analyzed data, and helped clients understand the findings.

Lesson Learned: Communication is key

One of the main avenues to success for an evaluation project team is strong communication, and that extends to sharing core values. If you’re working from the same premise toward the same goal, you only have to figure out the steps in between, rather than also spending time establishing a shared starting point.

Beyond moving a project forward, we found that strong communication allows team members to learn from each other. As newcomers to evaluation, we especially valued hearing others’ ideas on projects and learned a lot just from listening to what our team members suggested. We met regularly with organization leaders about our projects to co-develop ideas on how to engage clients – it was through these meetings that we, too, were able to contribute our ideas and perspectives to The Improve Group’s work.

Hot Tip: Take full advantage of bright, engaged interns on evaluation teams

By being integrated into The Improve Group’s project work, we were able to contribute fully to the organization. Organizations often delegate less interesting tasks, like data entry, to interns. While data entry is a useful skill to build – and we did some of it – we also did a lot of brainstorming, problem-solving, and development work that made a difference to the team, which is what we loved most.

Working on projects as interns also allowed us to contribute to a larger goal while we were learning. Take Audrey’s contributions to The Improve Group’s project providing technical assistance and program evaluation to Minnesota Alcohol, Tobacco, and Other Drugs grantees as an example. She had some experience with data analysis at the time, but not enough to think of half the ideas the team suggested for analyzing, visualizing, and reporting the information we had gathered. Had she not worked on a team, she would not have been able to do the good work she did on that project.

Rad Resource: AEA’s Graduate Education Diversity Internship program provides paid internship and training opportunities during the academic year. Additional internship opportunities are posted on local AEA affiliate sites in April each year.

The American Evaluation Association is celebrating Evaluation Teams Week. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

What’s up! We are Gwendolyn Baxley and Larry D. Brown Jr., doctoral students at the University of Wisconsin-Madison and evaluators with the Wisconsin Evaluation Collaborative Clinic. The Clinic responds to the small-scale evaluation needs of schools and education-focused community organizations in Dane County by matching them with trained graduate students at the University of Wisconsin-Madison.

As graduate students, we see many benefits to engaging as professional evaluators, including gaining applied, practical “research” experience in the field. The Clinic, and evaluation experience in general, gives us an opportunity to connect our methodological and content expertise as trained academic scholars to the evaluation needs of local schools and community organizations. Rather than solely publishing on or about organizations, we partner with them to provide real-time and annual feedback and technical assistance to better understand, improve, or transform their programs.

While conducting evaluations in the Clinic, we have learned two major lessons:

Lesson Learned: Teamwork and collaboration are key.

You cannot do this work alone. It is not only important to leverage the perspective and expertise of colleagues, but also imperative to work in partnership with stakeholders in the programs we evaluate – youth, parents, program staff, and community members. Their local knowledge offers distinct expertise that enhances evaluation design, implementation, and use.

Lesson Learned: Critical reflection is integral to evaluation.

It is important to constantly reflect on one’s identity (culture, race, class, gender, educational level, sexual orientation, social status, etc.) and the sociopolitical contexts in which we do our work. Aspects of society and our own backgrounds may shape the evaluation design and process in both intended and unintended ways. Critical reflexivity, particularly regarding issues of race, racism, and marginality, helps evaluators understand how sociopolitical contexts and their own identities shape the way they interact with evaluation “participants,” view and interpret data, and frame and report evaluation findings.

As scholar-evaluators, graduate students gain valuable skills and experience that are rarely offered in a traditional academic program. The Clinic provides comprehensive training that prepares students for immediate hands-on opportunities to apply their academic knowledge in the field. Exposure to evaluation also gives students insight into potential non-tenure-track career options. Moreover, graduate student evaluators build networks, connect with and learn from the community in meaningful ways, and can engage in a culturally responsive manner.

The American Evaluation Association is celebrating The Wisconsin Idea in Action Week, coordinated by the LEAD Center. The LEAD (Learning through Evaluation, Adaptation, and Dissemination) Center is housed within the Wisconsin Center for Education Research (WCER) at the School of Education, University of Wisconsin-Madison, and advances the quality of teaching and learning by evaluating the effectiveness and impact of educational innovations, policies, and practices within higher education. This week’s aea365 contributions come from student and adult evaluators living in and practicing evaluation in Wisconsin.

Warm wishes from wonderful Wisconsin! I’m Kimberly Kile, Project Manager for the LEAD Center, housed within WCER at the University of Wisconsin-Madison. The LEAD Center comprises professional staff who conduct program evaluation within and about higher education, both locally and nationally. I had the opportunity to take a leading role in developing our center as a host site for an intern through the American Evaluation Association’s Graduate Education Diversity Internship (AEA GEDI) program. This post shares some hot tips and lessons learned in becoming a host site.

The host site information on the AEA GEDI website identifies the site’s responsibilities, as well as the roles of the intern’s mentor. Once we reviewed these materials and knew that we were able to meet these expectations, we moved forward with the application.

Concurrently, we identified a potential project for the intern to work on. It was important to us to have a project that could be started and completed within the internship timeline (September through June). We also wanted the intern to see the entire evaluation process, from the planning stages through an end product.

Hot Tip:

Consider finding a partner or project to share the cost of hosting an intern. In our case, our center paid the GEDI’s salary and benefits, while the project paid for the GEDI’s professional development expenses. Be sure to work closely with your financial folks to sort out all the payment details.

Hot Tip:

Because of the tight timeline, we included a note in our application that funding for the position was still pending. There is no financial obligation unless you select an intern.

Lesson Learned:

AEA reviews the applications and then forwards potential GEDI applicants to each host site. Because travel can be a significant financial burden for graduate students, we offered interviews both in person and via Skype.

Lesson Learned:

The interview window is set by the GEDI program, so host sites have little flexibility in the interview schedule. We blocked a couple of half-days within the window so all interviewers could participate; the window falls in summer, when vacations can conflict with interviews. If you partner with someone to share the cost (as we did), be sure to invite the partner to participate in the interviews. We also blocked an hour or so after the interviews so that all interviewers could discuss the applicants and make a decision together.

The LEAD Center had a delightful experience as an AEA GEDI host site. The GEDI at our site brought fresh ideas to our staff. We would highly recommend others consider hosting an AEA GEDI!


I am Dawn X. Henderson, a past fellow of AEA’s Graduate Education Diversity Internship (GEDI) program and a member of the Annie E. Casey Foundation’s Expanding the Bench initiative. I recently developed an undergraduate seminar course in Community Psychology at a Minority Serving Institution. Program evaluation is a competency in Community Psychology, and modeling evaluation was critical in passing my evaluation “wisdom” on to a group of “underrepresented” students through a partnership with a nonprofit. I aim to share some hot tips and lessons learned with those interested in teaching and working in evaluation.

Hot Tips:

  • Practice logic models. In preparation for the evaluation report, the class met with the Executive Director to obtain information about the nonprofit, focusing on its programming and key activities. Building logic models allowed students to become familiar with the services the nonprofit provides and to develop visual connections between inputs, activities, and the other components of the model.
  • Recognize the individual strengths and knowledge of your students/team. Students worked in pairs to perform the quantitative and qualitative analyses; each pair included one student familiar with the methodology and one less experienced. The less experienced students gained new knowledge of data analysis, and the pairs collaboratively compiled findings into text and graphs.
  • Divide the report into sections and assign main duties and responsibilities. Each section of the evaluation report had a student leader responsible for collecting information, doing the majority of the writing, and maintaining communication with students and faculty. Each student also reviewed and summarized an article related to the nonprofit’s programs and services; the summaries were integrated into the discussion and recommendation sections of the report.

Lessons Learned:

  • Maintain lines of communication with the nonprofit on progress. Staying in contact about status, challenges, and needs is useful for building feedback and recommendations that improve the report’s content. This process also helps undergraduate students understand why integrating the nonprofit throughout matters: it ensures the evaluation report accurately represents the program.
  • Develop timelines for important milestones/benchmarks. The majority of the evaluation report came together at the end of the academic semester, which made the process stressful for the students and me. Building in benchmarks for each section would have provided more opportunities for feedback and editing; as it was, I had to go through the entire report the night before the draft was due to the nonprofit.

The students approached the evaluation report with limited knowledge of evaluation but some familiarity with traditional research in psychology. In the end, they discovered ways to translate research processes into evaluation, and the nonprofit received useful information to support its programming and funding efforts.


Hello everyone! We are Indira Phukan and Rachel Tripathy, graduate students at the Stanford Graduate School of Education. This past summer, we worked with a local environmental education organization to pilot several tools for evaluating learning outcomes related to stewardship and environmental behavior change. When it comes to understanding students’ perceptions of and relationships with nature, traditional assessment strategies often fall short, so we looked to creative, alternative evaluation techniques to understand student learning. We also conducted interviews and observations to understand how these tools might inform our research. One of the tools we explored this summer was an art-based embedded assessment.

In our pilot, five- to seven-year-old students participating in a weeklong camp at a coastal California campus were asked to make drawings before and after a scheduled activity where learning took place. The drawing prompts provided by their educators were broad enough to allow for the students to make choices about what they drew, but were also designed to direct student thinking toward the target activity. We collected student art and analyzed it with a rubric that considered thematic, analytic, informational, and contextual details. It was incredible to see the kinds of observations being made by six-year-olds! Their drawings definitely captured learning details that a written assessment would not, and the children had fun in the process. Moving forward, we are excited to see how this tool works with other age groups, and how it might be adopted as an embedded assessment strategy by other organizations.

Hot Tip: Site observations and interviews with educators can help researchers and practitioners design embedded assessments that fit seamlessly into existing curriculum and programming. The educators will thank you, and your data will reflect a more representative student experience.

Lesson Learned 1: When analyzing subjective student work, like art, the rubric you use is exceedingly important. It should be well thought out and designed to tease out information that answers your research questions. Expect an effective rubric to go through several iterations during the pilot phase before a final version is decided upon.
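
Cool Trick: Once drawings are scored, a few lines of code make it easy to compare results before and after an activity. Below is a minimal sketch in Python – only the four dimension names come from our rubric; the 0–3 scale and the scores themselves are hypothetical stand-ins for illustration.

from statistics import mean

DIMENSIONS = ["thematic", "analytic", "informational", "contextual"]

# Hypothetical rater scores (0-3 per dimension) for two drawings,
# collected before and after the target activity -- illustrative only.
pre = [
    {"thematic": 1, "analytic": 0, "informational": 1, "contextual": 1},
    {"thematic": 2, "analytic": 1, "informational": 0, "contextual": 1},
]
post = [
    {"thematic": 2, "analytic": 2, "informational": 2, "contextual": 1},
    {"thematic": 3, "analytic": 1, "informational": 2, "contextual": 2},
]

def dimension_means(drawings):
    # Average score for each rubric dimension across all drawings.
    return {d: mean(s[d] for s in drawings) for d in DIMENSIONS}

print("pre: ", dimension_means(pre))
print("post:", dimension_means(post))

Comparing the pre and post means dimension by dimension makes growth (or its absence) visible at a glance, and the same structure scales to more drawings, raters, or dimensions.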

Lesson Learned 2: Informative student art takes time. Initially, we wanted to give students about 5-10 minutes to produce a drawing, but we quickly learned that they needed at least 15 minutes to create something that we could properly analyze for insights about their learning.

Rad Resource 1: Rubric Library is a great online resource for browsing rubrics that others have used, and for finding inspiration for creating your own.

Rad Resource 2: Ami Flowers and her colleagues wrote a great article on using art elicitation for assessment. We drew a lot of inspiration from their findings.

AEA is celebrating GSNE Week with our colleagues in the Graduate Student and New Evaluators AEA Topical Interest Group. The contributions all this week to aea365 come from our GSNE TIG members.

Hello fellow evaluation lovers! My name is Elizabeth Grim and I work as an evaluation consultant with The Consultation Center at Yale, where I primarily consult with community-based agencies to build evaluation capacity. Prior to joining The Consultation Center, I worked as a policy analyst for Connecticut’s statewide campaign to end homelessness.

Evaluators increasingly discuss – on social media and elsewhere – data visualization and how to communicate evaluation findings to stakeholders. Yet we don’t always talk about how to communicate effectively within our own teams, which is just as important to the success of a project. Effective communication involves fostering workplaces and teams in which people are heard, understood, and acknowledged for their unique contributions.

Lesson 1: Encourage curiosity: Communication is easier when questions and comments come from a place of curiosity rather than judgment. Ask questions when discussing a project or deliverable rather than jumping immediately to feedback and conclusions.

Lesson 2: Know your colleagues: The first step in fostering better communication is developing a relationship with members of the team. How do your colleagues prefer to communicate? What are their unique skills and professional goals? What are they passionate about inside and outside of the office?

Lesson 3: Table technology: Technology gives us more flexibility in the workplace and lets us communicate with partners across the globe. However, it also allows people to talk around issues, strips away the context of tone of voice and facial expressions, and encourages multitasking – all of which can lead to breakdowns in communication. Ask team members to check their non-essential technology at the beginning of a meeting, and consider providing an incentive, like a monthly gift-card drawing, for those who go low-tech.

Rad Resource 1: 4 Pillars of Integrity Video Series – Make impeccable agreements. Making impeccable agreements means you agree only to what you are able and willing to complete, and you follow through on your agreements. On the flip side, it means saying no to requests you are unable or unwilling to complete. Teams are more effective, and members trust each other more, when each member takes 100% responsibility for themselves and their actions.

Rad Resource 2: The Great Genius Swap – Work environments and teams are more effective when people enjoy what they’re doing. Conduct a genius swap with your team. Gather the team together and ask each person to write down the one task they most love to do at their job and the one task they would like to stop doing. Find opportunities for team members to continue doing what they love and explore whether you can swap responsibilities around to minimize those they don’t enjoy.


My name is Megan Olshavsky, and I’ve been an evaluator of PreK-12 educational programs for about a year and a half. Before starting my work in a public school district, I was researching learning and memory processes in rats while earning my Ph.D. in Psychology – Behavioral Neuroscience. My experiments were very controlled: the rats exhibited the behavior or they did not, neurons were active or they were not, results were statistically significant or they were not.

Moving from that environment to the “real world” of a school district that employs and serves humans in all their messiness caused some growing pains. How was I supposed to decide whether an educational intervention led to academic improvement without proper control and experimental conditions?! One of the first projects I worked on was a developmental evaluation of a technology initiative. Developmental evaluation made me feel even more flaky – “Hey everyone! Let’s monitor things as they unfold. What are we looking for? Not sure, but we’ll know it when we see it.”

As I’ve transitioned from researcher to evaluator, three things have helped me feel more legit.

Lesson Learned 1: Trust yourself. You may not be an expert in the area you are evaluating, but you do have expertise looking at data with a critical eye, asking probing questions, and synthesizing information from a variety of sources.

Lesson Learned 2: Collaborate with a team that has diverse expertise. Our developmental evaluation team engaged teachers, instructional technology specialists, information systems staff, and evaluators. When everyone on that team comes to the same conclusion, I feel confident we’re making the right decision.

Lesson Learned 3: Embrace capacity building as part of your work. No one would recommend training up stakeholders to do their own inferential statistics. You can, however, influence the people around you to be critical about their work. Framing is critical: “evaluation” is a scary word, but “proving the project/program/intervention is effective” is a win for everyone. Building relationships and modeling the expertise we talked about in Lesson 1 leads to a gradual institutional shift toward evaluative thinking.

Rad Resource: Notorious RBG: The Life and Times of Ruth Bader Ginsburg. Let RBG be your guide as you gather and synthesize the relevant information, discuss with your diverse team, and advocate for slow institutional change.


Hello! I’m Alana Kinarsky, a PhD student in the Evaluation program at the University of California, Los Angeles (UCLA). As a graduate student, I regularly read evaluation theory and conduct my own research on evaluation. However, I often wonder how research on evaluation does – or should – help practitioners in their daily work.

Cool Trick: To connect research and practice, I organized a pop-up journal club during my summer internship at Informing Change. The journal club gave staff an opportunity to read and discuss evaluation research. I circulated a few theory papers via email, and the group elected to read “Evaluation and Organizational Learning: Past, Present, and Future” by Rosalie Torres and Hallie Preskill. The following week, about 10 of us got together over pizza for a facilitated yet casual conversation. Discussing theory can help evaluation practitioners meaningfully reflect on their practice.

Lesson Learned 1: Theory offers practitioners a framework and context for their evaluation work. As our conversation about the paper unfolded, we zeroed in on a question that weighs heavily in both theory and practice: what is the role of the evaluator? As people around the table talked through different roles, I noticed their ideas beginning to align with the Evaluation Theory Tree developed by Marv Alkin and Tina Christie. I sketched it on the board and walked through the different “branches” of evaluation theory. The Theory Tree focused our conversation and grounded these theoretical elements – like the role of an evaluator – in a visual analogous to the roles the Informing Change practitioners recognized in their own work.

Lesson Learned 2: These conversations are an opportunity for team building. A conversation about theory lets people from different backgrounds and leadership levels participate in a shared dialogue. During our discussion, we shared personal stories and current challenges, and ideas for future team conversations rose to the surface. Furthermore, people who rarely work together had the opportunity to collaborate and brainstorm with peers.

Lesson Learned 3: Reflection on practice is important. Our work as evaluators is often fast-paced, so it is easy to get caught up in execution. However, it is important to make time to reflect on the big picture and think creatively. This not only improves an individual’s practice but also supports organizational learning.

At the end of the hour, I asked the group to quickly share one takeaway from our conversation. The room was buzzing with energy as people shared what they learned and expressed enthusiasm for continuing this practice. The group agreed that stepping away from their desks to talk about theory offered them an opportunity to reflect, build relationships, and generate new ideas.


Hello! We are Laura Sundstrom and Megan Elyse Williams, Evaluation Associates at the Curtis Center Program Evaluation Group (CC-PEG) at the University of Michigan School of Social Work.

At CC-PEG, we train Master of Social Work students in program evaluation by providing high-quality evaluation services to community-based organizations. Our students enter our unit with a variety of experiences and skills. When we were first growing our center, students would get assigned, out of project need, to tasks beyond their skill level. In response, we developed the Tiers of Skill Development to guide students logically and intentionally through their skill development and professional preparation.

Hot Tip: Make it applicable to your context. We developed these Tiers based on the skills needed to be successful within our Center. Your organization may value different skills or use a different order of development.

Cool Trick: There are many uses for the Tiers – be creative!

  • Orientation to evaluation. Helping students understand all of the different components and skills that go into evaluation practice.
  • Supervision and mentoring. Working with students to assess their self-efficacy in these skills and where they have practiced these skills in project work.
  • Project management. Helping lead evaluators assign tasks that challenge students but are not out of their reach.
  • Identifying trainings. Skills that many students have not had a chance to develop may be appropriate for a larger training.
  • Personal development. Assisting students in their professional development, in advocating for their own learning, and in their job search.

Lessons Learned: After using the Tiers for over a year, we have learned a lot!

  • Project work cuts across the tiers. Students don’t have to complete one full tier before moving on to the next; they can develop skills in certain strands of work, such as qualitative data collection and analysis.
  • Response set is important for understanding “mastery” of a skill. The highest level on the Tiers is “can teach someone else to do it,” which helps contextualize for students what mastery of a skill means in the professional world.
  • Identify peer support. Identify students who are ready to work toward mastery of a skill and pair them for peer support with another student who needs training.
  • Focus on skill development rather than self-efficacy. The Tiers emphasize demonstrating skill development rather than reporting self-efficacy: students can point to specific tasks where they practiced a skill instead of simply saying they are confident in their skill level.


My name is Sophia Guevara and I am a recent graduate of Wayne State University. I am writing about my experience identifying and collaborating with other evaluation leaders and professionals to motivate them to take action around an important issue.

For the AEA conference in Chicago, I worked with the leadership of several Topical Interest Groups (TIGs) to organize a toiletry collection for a homeless women’s shelter. As this was the second year our TIG partnership asked AEA for permission to collect toiletries for people experiencing homelessness, the project yielded some important lessons.

Lesson Learned 1: Build awareness to motivate others to take action. With the leadership of the Social Network Analysis; Nonprofit and Foundation; Social Work; and Alcohol, Drug Abuse and Mental Health TIGs informing their members about the collection, many AEA members knew we were collecting the free travel-sized toiletries placed in attendees’ hotel rooms. With mentions in newsletters and posts in LinkedIn groups, the collection netted a box of donations from generous American Evaluation Association conference attendees.

Lesson Learned 2: Sometimes your best partners come through recommendations from those you connect with. In the beginning, I emailed the leadership of larger Topical Interest Groups to gain the support I thought I needed to make the proposal successful with the American Evaluation Association. These contacts recommended I also contact the leadership of smaller TIGs whose focus aligned closely with the issue of homelessness.

Lesson Learned 3: Seek expertise from those who may know an area better than you. The idea of collecting toiletries this year for Deborah’s Place came from a professional contact who works at a Chicago-based foundation that has supported several activities related to homelessness. Since I am not from Chicago, I reached out to this person, who used her expertise to recommend the organization.

Rad Resource 1: The American Evaluation Association Community lets you see how many members are enrolled in each Topical Interest Group community. When researching which TIGs to contact for support, I focused on the larger TIGs first and then identified others whose topics of interest related to the issue of homelessness.

Rad Resource 2: The AEA Topical Interest Group List provides visitors with contact information for Topical Interest Group leadership and a direct link to each group’s website.

The Graduate Student & New Evaluator TIG is sponsoring this week’s AEA365, so be sure to check out the blogs this week! It’ll be worthwhile for new and seasoned evaluators alike!
